Gauss elimination vs LU factorization

I wanted to know which of the two methods gives more accurate results when solving systems of linear equations: Gaussian elimination or LU factorization? And if we use partial pivoting with Gaussian elimination, is that more accurate than LU factorization?










linear-algebra numerical-methods






asked Dec 25 '18 at 15:56









Gandalf_the_White
  • Gaussian elimination without partial pivoting gives exactly an LU decomposition. With partial pivoting it gives a PLU decomposition, where P is a permutation matrix. Pivoting will be more stable in general. – tch, Dec 25 '18 at 16:26
















1 Answer

I think the book Numerical Linear Algebra by Trefethen and Bau has a pretty good explanation of this topic. I've summarized some of the content from chapters 20 and 21.



Gaussian Elimination is Unstable



Consider the matrix
$$
A = \begin{bmatrix} 10^{-20} & 1 \\ 1 & 1 \end{bmatrix}
$$



This has the exact LU decomposition
$$
L = \begin{bmatrix} 1 & 0 \\ 10^{20} & 1 \end{bmatrix}, \qquad
U = \begin{bmatrix} 10^{-20} & 1 \\ 0 & 1-10^{20} \end{bmatrix}
$$



However, suppose we make small (relative) rounding errors and end up representing $1-10^{20}$ as $-10^{20}$. Then,
$$
\begin{bmatrix} 1 & 0 \\ 10^{20} & 1 \end{bmatrix}
\begin{bmatrix} 10^{-20} & 1 \\ 0 & -10^{20} \end{bmatrix}
=
\begin{bmatrix} 10^{-20} & 1 \\ 1 & 0 \end{bmatrix}
$$

which is far from $A$.
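A small NumPy sketch (mine, not from the book) makes this concrete: eliminating without pivoting loses the solution entirely, while a pivoted solve recovers it.

```python
import numpy as np

A = np.array([[1e-20, 1.0],
              [1.0,   1.0]])
b = np.array([1.0, 2.0])      # exact solution is essentially x = [1, 1]

# Eliminate without pivoting: multiplier m = a21/a11 = 1e20.
m = A[1, 0] / A[0, 0]
U = A.copy()
U[1, 1] -= m * U[0, 1]        # 1 - 1e20 rounds to -1e20
U[1, 0] = 0.0                 # zero by construction

# Forward substitution Ly = b, then back substitution Ux = y.
y1 = b[0]
y2 = b[1] - m * y1            # 2 - 1e20 also rounds to -1e20
x2 = y2 / U[1, 1]             # = 1.0
x1 = (y1 - U[0, 1] * x2) / U[0, 0]
print(x1, x2)                 # 0.0 1.0 -- the first component is lost

# A pivoted solve for comparison (np.linalg.solve uses LAPACK
# with partial pivoting).
print(np.linalg.solve(A, b))  # close to [1. 1.]
```

The rounding of $1-10^{20}$ to $-10^{20}$ propagates through the back substitution, so the computed $x_1$ is $0$ rather than $\approx 1$.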



LU is Gaussian Elimination without Pivoting



To expand on my comment: when you do Gaussian elimination without pivoting, each step can be written as multiplication on the left by a lower triangular matrix. The end result is an upper triangular matrix, so in matrix form Gaussian elimination looks like
$$
L_{m-1}\cdots L_2 L_1 A = U
$$



The product of lower triangular matrices is still lower triangular, as is the inverse, so setting $L^{-1} = L_{m-1}\cdots L_2 L_1$ gives the LU decomposition.



As a concrete example, let's consider the matrix
$$
A = \begin{bmatrix}
2 & 1 & 1 \\
4 & 3 & 2 \\
8 & 7 & 9
\end{bmatrix}
$$



The first step of Gaussian elimination would be to subtract twice the first row from the second, and 4 times the first from the third. We can write this as,
$$
L_1 =
\begin{bmatrix}
1 \\
-2 & 1 \\
-4 & & 1
\end{bmatrix}
$$

You can verify that,
$$
L_1A =
\begin{bmatrix}
2 & 1 & 1 \\
0 & 1 & 0 \\
0 & 3 & 5
\end{bmatrix}
$$



The next step of Gaussian elimination is to subtract 3 times the second row from the third row. Again, we can write this as,
$$
L_2 =
\begin{bmatrix}
1 \\
& 1 \\
& -3 & 1
\end{bmatrix}
$$

and verify that,
$$
L_2(L_1A) =
\begin{bmatrix}
2 & 1 & 1 \\
0 & 1 & 0 \\
0 & 0 & 5
\end{bmatrix}
$$



It turns out that the inverse of $L_i$ is quite simple: you just negate the entries below the main diagonal. Similarly, the product of such matrices is also simple: you just add the entries below the diagonal. You can again verify that
$$
L = (L_2L_1)^{-1} = L_1^{-1}L_2^{-1}
=
\begin{bmatrix}
1 \\
2 & 1 \\
4 & & 1
\end{bmatrix}
\begin{bmatrix}
1 \\
& 1 \\
& 3 & 1
\end{bmatrix}
=
\begin{bmatrix}
1 \\
2 & 1 \\
4 & 3 & 1
\end{bmatrix}
$$
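To check the arithmetic, here is the worked example reproduced in NumPy (a quick sketch of my own; `np.linalg.inv` is used only for illustration, since real codes never form these inverses explicitly):

```python
import numpy as np

A = np.array([[2., 1., 1.],
              [4., 3., 2.],
              [8., 7., 9.]])

# The elimination steps from the text, as left-multiplications.
L1 = np.array([[ 1., 0., 0.],
               [-2., 1., 0.],
               [-4., 0., 1.]])
L2 = np.array([[1.,  0., 0.],
               [0.,  1., 0.],
               [0., -3., 1.]])

U = L2 @ L1 @ A                             # upper triangular
L = np.linalg.inv(L1) @ np.linalg.inv(L2)   # subdiagonal entries just flip sign

print(L)                       # unit lower triangular: 2, 4, 3 below the diagonal
print(np.allclose(L @ U, A))   # True
```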



PLU is Gaussian Elimination with Pivoting



Now, if you pivot at each step, you can view this as swapping rows of the working matrix, which is the same as multiplication on the left by a permutation matrix. So Gaussian elimination with pivoting gives a sequence of operations like
$$
L_{m-1}P_{m-1}\cdots L_2P_2L_1P_1 A = U
$$



If we define,
$$
L_k' = P_{m-1}\cdots P_{k+1} L_k P_{k+1}^{-1} \cdots P_{m-1}^{-1}
$$

you will see that we can write the above series of operations as
$$
(L_{m-1}'\cdots L_2'L_1')(P_{m-1}\cdots P_2 P_1) A = U
$$



This can be rearranged to give a PLU decomposition.
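As a sketch of the whole procedure (a minimal implementation of my own, not production code; the function name `plu` and the convention that the returned permutation satisfies $PA = LU$ are my choices):

```python
import numpy as np

def plu(A):
    """Gaussian elimination with partial pivoting, returning P, L, U
    with P @ A = L @ U."""
    A = A.astype(float)
    n = A.shape[0]
    P = np.eye(n)
    L = np.eye(n)
    for k in range(n - 1):
        # Pivot: bring the row with the largest entry in column k up to row k.
        p = k + np.argmax(np.abs(A[k:, k]))
        A[[k, p]] = A[[p, k]]
        P[[k, p]] = P[[p, k]]
        L[[k, p], :k] = L[[p, k], :k]   # keep earlier multipliers consistent
        # Eliminate below the pivot, storing the multipliers in L.
        for i in range(k + 1, n):
            L[i, k] = A[i, k] / A[k, k]
            A[i, k:] -= L[i, k] * A[k, k:]
    return P, L, np.triu(A)

A = np.array([[2., 1., 1.],
              [4., 3., 2.],
              [8., 7., 9.]])
P, L, U = plu(A)
print(np.allclose(P @ A, L @ U))   # True
```

Because of the pivoting, all multipliers stored in `L` have magnitude at most 1, which is exactly what makes this variant stable in practice.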






        edited Dec 25 '18 at 16:58

























        answered Dec 25 '18 at 16:38









tch





























