Obtaining the gradient of a vector-valued function























I have read that obtaining the gradient of a vector-valued function $f:\mathbb{R}^n \to \mathbb{R}^m$ is the same as obtaining the Jacobian of this function.



Nevertheless, this function has only one argument (the vector $\mathbf{x} \in \mathbb{R}^n$).



How can I take the gradient of a function $F(\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_n)$ with respect to some $\mathbf{x}_i$?
































  • The gradient of something with an $n$-component input and an $m$-component output will have $n\times m$ entries. In this sense it is a tensor outer product: a $1$-tensor in (for example a $3\times 1$ vector) and a $1$-tensor out (another $3\times 1$ vector) give $1+1 = 2$ indices, so your output becomes a $2$-tensor (= "matrix"). This is common for 3D vector field functions.
    – mathreadler
    12 hours ago












  • I don't know if it is correct to call it a "gradient". It is instead called the Jacobian matrix of $f$ at a point. Yes, to obtain the Jacobian matrix with respect to one of the variables, just treat all the others as constants.
    – Masacroso
    11 hours ago
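The recipe in the comment above (differentiate with respect to one vector argument while holding the others constant) can be sketched numerically. This is a minimal illustration, not part of the original discussion; the helper `partial_jacobian` and the linear example `F` are hypothetical names chosen here, and central differences are just one way to approximate the derivatives.

```python
import numpy as np

def partial_jacobian(F, args, i, h=1e-6):
    """Approximate the Jacobian of F(x_1, ..., x_k) with respect to the
    i-th vector argument, holding the other arguments constant, using
    central differences: one column per component of x_i."""
    args = [np.asarray(a, dtype=float) for a in args]
    xi = args[i]
    Fx = np.asarray(F(*args))
    J = np.zeros((Fx.size, xi.size))
    for j in range(xi.size):
        e = np.zeros_like(xi)
        e[j] = h
        plus, minus = list(args), list(args)
        plus[i] = xi + e
        minus[i] = xi - e
        J[:, j] = (np.asarray(F(*plus)) - np.asarray(F(*minus))) / (2 * h)
    return J

# Hypothetical example: F(x1, x2) = A @ x1 + B @ x2, so the Jacobian
# of F with respect to x1 (x2 held constant) is exactly A.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])
F = lambda x1, x2: A @ x1 + B @ x2
print(partial_jacobian(F, [np.ones(2), np.ones(2)], 0))
```

For a linear map the central difference is exact up to rounding, so the printed matrix matches $A$.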
























derivatives vector-spaces gradient-descent
















asked 12 hours ago









The Bosco

























2 Answers






























The 'gradient' of something usually means taking all partial derivatives. Therefore



"taking the gradient of a function $F(x_1, x_2, \ldots, x_n)$ with respect to some $x_i$"



is not really a thing on its own. However, you are right that all partial derivatives of a vector-valued function, arranged in an $m\times n$ matrix, are usually referred to as the Jacobian.
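To make "taking all partial derivatives" concrete, here is a sketch (added for illustration, not part of the original answer) that approximates the full $m\times n$ Jacobian of a vector-valued function by central differences. The function name `jacobian` and the example `f` are chosen here for the demonstration.

```python
import numpy as np

def jacobian(f, x, h=1e-6):
    """Numerically approximate the Jacobian of f: R^n -> R^m at x.
    Column j holds the partial derivatives of f with respect to x[j]."""
    x = np.asarray(x, dtype=float)
    fx = np.asarray(f(x))
    J = np.zeros((fx.size, x.size))
    for j in range(x.size):
        e = np.zeros_like(x)
        e[j] = h
        # Central difference in the j-th coordinate direction.
        J[:, j] = (np.asarray(f(x + e)) - np.asarray(f(x - e))) / (2 * h)
    return J

# Example: f(x, y) = (x*y, x + y); the exact Jacobian is [[y, x], [1, 1]].
f = lambda v: np.array([v[0] * v[1], v[0] + v[1]])
print(jacobian(f, [2.0, 3.0]))
```

At the point $(2, 3)$ this returns (to numerical precision) the matrix $\begin{pmatrix} 3 & 2 \\ 1 & 1 \end{pmatrix}$, i.e. all partial derivatives of $f$ arranged in an $m\times n$ array.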














answered 12 hours ago by maxmilgram (new contributor)


















  • This is a homework question. For some specific function $F$ I have to do that.
    – The Bosco
    12 hours ago










  • The question is to calculate the gradient? Then just calculate all partial derivatives. If that doesn't help, please provide the full problem description.
    – maxmilgram
    12 hours ago































A gradient of something with an $n$-component input and an $m$-component output will have $n\times m$ entries. In this sense it is a tensor outer product. For example, a $1$-tensor in (such as a $3\times 1$ vector) and a $1$-tensor out (another $3\times 1$ vector) contribute one index each.



Since $1+1 = 2$, your output becomes a $2$-tensor (= "matrix"); this is common for 3D vector field functions.






answered 12 hours ago by mathreadler




















