Matrix representation of mixed derivatives























Suppose we have the boundary value problem



$$
\begin{cases}
-\frac{d^2u}{dx^2} = f(x), & x \in [0,L] \\
u(0) = 0 \\
u(L) = 0
\end{cases}
$$



We know that we can approximate the second derivative using this formula: $$ \frac{d^2u(x)}{dx^2} \approx \frac{u(x+h)-2u(x)+u(x-h)}{h^2} $$



If we define $u_{k} := u(x_{k})$, where $x_{k} = kh$ for $k = 0,1,2,\dots,N$ and $h$ is known as the mesh size or step size, we get



$$ \frac{d^2u_{k}}{dx^2} \approx \frac{u_{k+1}-2u_{k}+u_{k-1}}{h^2} = \frac{u_{k-1}-2u_{k}+u_{k+1}}{h^2} $$ for $k=1,\dots,N-1$.



Since $u(0) = u_{0} = 0$ and $u(L) = u(x_{N}) = u_{N} = 0$, we get the following matrix representation of the second-derivative operator:



\begin{equation}
\frac{d^2}{dx^2} \approx L_{2} = \frac{1}{h^2}\left(\begin{matrix}
-2 & 1 & & 0\\
1 & \ddots & \ddots & \\
& \ddots & \ddots & 1 \\
0 & & 1 & -2 \end{matrix} \right)
\end{equation}



Then, to approximate the solution of the differential equation, we solve the system



$$-L_{2}\hat{u} = \hat{f},$$
where $\hat{f} = [\, f(x_{1}) \;\; f(x_{2}) \;\; \dots \;\; f(x_{N-1}) \,]^T$ and $\hat{u} = [\, u_{1} \;\; \dots \;\; u_{N-1} \,]^T$.
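For reference, here is a minimal Octave/MATLAB sketch of this 1-D solve; the size, the right-hand side and the variable names are illustrative choices of mine, not part of the problem statement:

    % Sketch: assemble L2 with spdiags and solve -L2*u = f for the interior values.
    N  = 100;  L = 1;  h = L/N;              % N subintervals -> N-1 interior nodes
    x  = (1:N-1)'*h;                         % interior grid points x_k = k*h
    f  = sin(pi*x);                          % illustrative right-hand side
    e  = ones(N-1,1);
    L2 = spdiags([e -2*e e], -1:1, N-1, N-1) / h^2;
    u  = (-L2) \ f;                          % approximate solution at x_1 .. x_{N-1}
    % for this f the exact solution is sin(pi*x)/pi^2, so the O(h^2) error can be checked:
    err = norm(u - sin(pi*x)/pi^2, Inf);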



The matrix representation of the Laplacian operator using the Kronecker product is: $$ \Delta = \nabla^2 = \frac{\partial^2}{\partial x^2} + \frac{\partial^2}{\partial y^2} \approx L_{2}\otimes I_{n_{x}} + I_{n_{y}} \otimes L_{2} $$



$\textbf{Note that I have used $L_{2}$ and the Kronecker product to get a matrix representation}$ of the Laplacian operator.
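As a sanity check, here is a small Octave/MATLAB sketch of that construction; the grid sizes and the assumption of a common spacing $h$ in both directions are mine:

    % Sketch: 2-D Laplacian with zero Dirichlet values, following
    % Delta ~ L2 (x) I_nx + I_ny (x) L2 from the formula above.
    nx = 40;  ny = 30;  h = 1/(nx+1);        % interior nodes per direction (illustrative)
    e   = @(n) ones(n,1);
    L2x = spdiags([e(nx) -2*e(nx) e(nx)], -1:1, nx, nx) / h^2;
    L2y = spdiags([e(ny) -2*e(ny) e(ny)], -1:1, ny, ny) / h^2;
    A   = kron(L2y, speye(nx)) + kron(speye(ny), L2x);
    % With U(i,j) ~ u(x_i, y_j) and column-wise vectorization U(:) (x index fastest),
    % A*U(:) approximates (u_xx + u_yy) at the interior nodes.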



With this in mind, I want to approximate the first derivative using the central difference:



$$ \frac{du_{k}}{dx} \approx \frac{u_{k+1}-u_{k-1}}{2h} = \frac{-u_{k-1}+u_{k+1}}{2h} $$



Since $u(0) = u_{0} = 0$ and $u(L) = u(x_{N}) = u_{N} = 0$, we get the following matrix representation of the first derivative:



\begin{equation}
\frac{d}{dx} \approx L_{1} = \frac{1}{2h}\left(\begin{matrix}
0 & 1 & & 0\\
-1 & \ddots & \ddots & \\
& \ddots & \ddots & 1 \\
0 & & -1 & 0 \end{matrix} \right)
\end{equation}
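In Octave/MATLAB this matrix can be assembled in one line with spdiags (a sketch; $n$ and $h$ are illustrative):

    % Sketch: centred first-derivative matrix on the interior nodes,
    % assuming the zero boundary values stated above.
    n  = 9;  h = 1/(n+1);
    e  = ones(n,1);
    L1 = spdiags([-e zeros(n,1) e], -1:1, n, n) / (2*h);
    % full(2*h*L1) shows the 0/+-1 tridiagonal pattern written above.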



I want to use $L_{1}$ and the Kronecker product to get the matrix representations of



$$ \frac{\partial^2}{\partial y\,\partial x} $$
and
$$ \frac{\partial^2}{\partial x\,\partial y} $$



$\textbf{My questions are:}$




  1. Is this possible?


  2. If question 1 is affirmative, what is the matrix representation of the mixed derivatives using $L_{1}$ and the Kronecker product?


  3. If question 1 is negative, how can we get the matrix representation of the mixed derivatives using a simple matrix and the Kronecker product?


  4. Do you know a book or document (article or other) that explains this, or something similar, in detail?



$\textbf{I want to use the Kronecker product because it is fast and easy to implement in}$ MATLAB or Octave.



By the way, I tried to use this formula:



$$ \frac{\partial^2 u_{k,j}}{\partial x\,\partial y} \approx \frac{u_{k+1,j+1}+u_{k-1,j-1}-u_{k+1,j-1}-u_{k-1,j+1}}{4h^2} $$



But it was hard to see a pattern; a direct implementation of this stencil is sketched below as a reference. Thank you!
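Here is that loop-based sketch of the stencil in Octave/MATLAB, just as something to compare any Kronecker construction against; the grid, the test function and all names are illustrative, and $u$ is assumed to vanish on the boundary, as above:

    % Sketch: apply the 4-point mixed-derivative stencil pointwise.
    % Uext(k+1,j+1) ~ u(x_k, y_j); the first/last rows and columns hold the boundary values.
    nx = 20;  ny = 20;  h = 1/(nx+1);
    [X, Y] = ndgrid((0:nx+1)*h, (0:ny+1)*h);
    Uext = X.*(1-X).*Y.*(1-Y);               % example u, zero on the boundary of [0,1]^2
    Dxy  = zeros(nx, ny);
    for k = 1:nx
      for j = 1:ny
        Dxy(k,j) = ( Uext(k+2,j+2) + Uext(k,j) ...
                   - Uext(k+2,j)   - Uext(k,j+2) ) / (4*h^2);
      end
    end
    % for this u the exact mixed derivative is (1-2x)(1-2y), and the stencil is exact here,
    % so err is roughly machine precision:
    Exact = (1-2*X(2:end-1,2:end-1)).*(1-2*Y(2:end-1,2:end-1));
    err   = max(abs(Dxy(:) - Exact(:)));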










Tags: numerical-methods, partial-derivative, kronecker-product
asked Nov 14 at 11:02 by tnt235711, edited Nov 14 at 11:28
          2 Answers






























Once you have the matrix $\bf L_1$ that represents the divided difference of $f(x)$ given as a column vector, then if you transpose the whole thing you get a row vector for the difference.



So if $f(x,y)$ is represented as a matrix $\bf F_{\,x,\,y}$, the matrix representing ${\partial^2 \over \partial x\,\partial y}f(x,y)$ will be $\bf L_1 \, F_{\,x,\,y} \, L_1^T$.
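A small Octave/MATLAB sketch of this matrix form (the grid, the spacing and the test function are illustrative choices of mine):

    % Sketch: F(k,j) ~ f(x_k, y_j) on an n-by-n interior grid with spacing h,
    % with f assumed zero on the boundary so that dropping the boundary neighbours
    % in the first/last rows of L1 is consistent.
    n  = 30;  h = 1/(n+1);
    e  = ones(n,1);
    L1 = spdiags([-e zeros(n,1) e], -1:1, n, n) / (2*h);
    [X, Y] = ndgrid((1:n)*h, (1:n)*h);
    F   = X.*(1-X).*Y.*(1-Y);                % example f, zero on the boundary of [0,1]^2
    Fxy = L1 * F * L1.';                     % approximation of d^2 f / (dx dy) on the grid
    % the exact mixed derivative of this f is (1-2x)(1-2y); the difference here is
    % roughly machine precision because the centred difference is exact for quadratics:
    err = max(max(abs(Fxy - (1-2*X).*(1-2*Y))));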






answered Nov 14 at 11:34 by G Cab, edited Nov 14 at 12:18























• Honestly I can not imagine the matrix... – tnt235711, Nov 14 at 11:45

• @tnt235711: added – G Cab, Nov 14 at 12:19

• @tnt235711: is it ok now for you? – G Cab, Nov 15 at 16:55




























Now, using $\mathbf{L_{1}}$ and the $\textbf{Kronecker product}$:



$$\frac{\partial^2}{\partial x\,\partial y} = \Big((L_{1})_{n_{x}} \otimes I_{n_{y}}\Big)\Big(I_{n_{x}} \otimes (L_{1})_{n_{y}}\Big) = (L_{1})_{n_{x}} \otimes (L_{1})_{n_{y}} $$



Here I have used the mixed-product property of the Kronecker product:
$$ (\mathbf{A} \otimes \mathbf{B})(\mathbf{C} \otimes \mathbf{D}) = (\mathbf{AC}) \otimes (\mathbf{BD}). $$



          In a similar way we get:



$$\frac{\partial^2}{\partial y\,\partial x} = (L_{1})_{n_{y}} \otimes (L_{1})_{n_{x}} $$
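A quick Octave/MATLAB check of both the mixed-product step and the resulting operator; the sizes and the names $L1x$, $L1y$ are mine, and the second check assumes grid values stored so that the $y$ index varies fastest in the vectorization, which is the ordering under which $(L_{1})_{n_{x}} \otimes (L_{1})_{n_{y}}$ acts as $\partial^2/\partial x\,\partial y$:

    % Sketch: verify (A(x)B)(C(x)D) = (AC)(x)(BD) for the factors used above,
    % and relate the Kronecker form to the matrix form L1y*U*L1x'.
    nx = 7;  ny = 5;  h = 0.1;               % illustrative sizes and spacing
    e  = @(n) ones(n,1);
    D1 = @(n) spdiags([-e(n) zeros(n,1) e(n)], -1:1, n, n) / (2*h);
    L1x = D1(nx);  L1y = D1(ny);
    A = kron(L1x, speye(ny)) * kron(speye(nx), L1y);
    B = kron(L1x, L1y);
    mixed_product_ok = norm(full(A - B), Inf) < 1e-12                        % true
    % with U(j,k) ~ u(x_k, y_j) (rows = y, columns = x), vec(L1y*U*L1x') = B*U(:):
    U = rand(ny, nx);
    ordering_ok = norm(B*U(:) - reshape(L1y*U*L1x.', [], 1), Inf) < 1e-12    % true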






answered Nov 16 at 14:48 by tnt235711 (accepted)




















