Obtaining the gradient of a vector-valued function
I have read that obtaining the gradient of a vector-valued function $f:\mathbb{R}^n \to \mathbb{R}^m$ is the same as obtaining the Jacobian of this function.
Nevertheless, this function has only one argument (the vector $\mathbf{x} \in \mathbb{R}^n$).
How can I take the gradient of a function $F(\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_n)$ with respect to some $\mathbf{x}_i$?
derivatives vector-spaces gradient-descent
asked 12 hours ago
The Bosco
A gradient of something with $n$ input indices and $m$ output indices will have $n \times m$ entries. In this sense it is a tensor outer product: a $1$-tensor in (e.g. a $3 \times 1$ vector) and a $1$-tensor out (a $3 \times 1$ vector) give $1 + 1 = 2$ indices, so the output is a $2$-tensor (a matrix). This is common for 3D vector field functions.
– mathreadler
12 hours ago
I don't know if it is correct to call it a "gradient"; it is usually called the Jacobian matrix of $f$ at a point. Yes, to obtain the Jacobian matrix with respect to one of the variables, just treat all the others as constants.
– Masacroso
11 hours ago
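In symbols, what the comments describe: for $f:\mathbb{R}^n \to \mathbb{R}^m$ the full derivative is the $m \times n$ Jacobian matrix
$$J_f(\mathbf{x}) = \begin{pmatrix} \dfrac{\partial f_1}{\partial x_1} & \cdots & \dfrac{\partial f_1}{\partial x_n} \\ \vdots & \ddots & \vdots \\ \dfrac{\partial f_m}{\partial x_1} & \cdots & \dfrac{\partial f_m}{\partial x_n} \end{pmatrix},$$
and for a function $F(\mathbf{x}_1, \ldots, \mathbf{x}_k)$ of several vector arguments, with $\mathbf{x}_i \in \mathbb{R}^{n_i}$ and values in $\mathbb{R}^m$ (writing $k$ for the number of arguments, which the question calls $n$), the partial Jacobian with respect to $\mathbf{x}_i$ is the $m \times n_i$ block obtained by holding the other arguments fixed:
$$\frac{\partial F}{\partial \mathbf{x}_i} = \left( \frac{\partial F_a}{\partial (\mathbf{x}_i)_b} \right)_{\substack{a = 1, \ldots, m \\ b = 1, \ldots, n_i}}.$$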
2 Answers
The 'gradient' of something usually means taking all partial derivatives. Therefore,
"taking the gradient of a function $F(x_1, x_2, \ldots, x_n)$ with respect to some $x_i$"
is not really a thing. However, you are right that all partial derivatives of a vector-valued function, arranged in an $m \times n$ matrix, are usually referred to as the Jacobian.
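As a concrete check, here is a minimal numerical sketch (my own illustration, not from the answer) of the recipe in the comments: a forward-difference partial Jacobian of a vector-valued `F` with respect to one vector argument, holding the others constant. The helper `partial_jacobian` and the example `F` are made-up names for illustration.

```python
import numpy as np

def partial_jacobian(F, args, i, h=1e-6):
    """Forward-difference m x n_i Jacobian of F w.r.t. its i-th vector argument."""
    args = [np.asarray(a, dtype=float) for a in args]
    base = np.asarray(F(*args), dtype=float)   # F at the base point
    J = np.zeros((base.size, args[i].size))
    for j in range(args[i].size):
        bumped = [a.copy() for a in args]      # every other argument held constant
        bumped[i][j] += h                      # perturb one coordinate of x_i
        J[:, j] = (np.asarray(F(*bumped), dtype=float) - base) / h
    return J

# Example: F(x1, x2) = (x1 . x2) x1, so F : R^3 x R^3 -> R^3; the exact
# Jacobians are dF/dx1 = x1 x2^T + (x1 . x2) I and dF/dx2 = x1 x1^T.
F = lambda x1, x2: np.dot(x1, x2) * x1
x1 = np.array([1.0, 2.0, 3.0])
x2 = np.array([0.5, -1.0, 2.0])
print(partial_jacobian(F, [x1, x2], i=0))  # approx. x1 x2^T + (x1 . x2) I
print(partial_jacobian(F, [x1, x2], i=1))  # approx. x1 x1^T
```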
answered 12 hours ago
maxmilgram
This is a homework question. For some specific function $F$ I have to do that.
– The Bosco
12 hours ago
The question is to calculate the gradient? Then just calculate all partial derivatives. If that doesn't help, please provide the full problem description.
– maxmilgram
12 hours ago
A gradient of something with $n$ input indices and $m$ output indices will have $n \times m$ entries. In this sense it is a tensor outer product. For example, with a $1$-tensor in (say a $3 \times 1$ vector) and a $1$-tensor out (a $3 \times 1$ vector), since $1 + 1 = 2$, your output becomes a $2$-tensor (a matrix); this is common for 3D vector field functions.
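A quick numeric sanity check of this index counting (again a sketch of my own, with made-up example functions): a map $\mathbb{R}^3 \to \mathbb{R}^3$ has a derivative with $1 + 1 = 2$ indices (a $3 \times 3$ matrix), while a scalar field $\mathbb{R}^3 \to \mathbb{R}$ gives a single row, the gradient.

```python
import numpy as np

def num_jacobian(f, x, h=1e-6):
    """Forward-difference derivative; output shape (m, n) for f : R^n -> R^m."""
    x = np.asarray(x, dtype=float)
    fx = np.atleast_1d(np.asarray(f(x), dtype=float))
    J = np.zeros((fx.size, x.size))
    for j in range(x.size):
        step = np.zeros_like(x)
        step[j] = h
        J[:, j] = (np.atleast_1d(f(x + step)) - fx) / h
    return J

field  = lambda x: np.array([x[0] * x[1], np.sin(x[2]), x[0] ** 2])  # R^3 -> R^3
scalar = lambda x: x @ x                                             # R^3 -> R
x = np.array([1.0, 2.0, 0.5])
print(num_jacobian(field, x).shape)   # (3, 3): 1-tensor in, 1-tensor out -> 2-tensor
print(num_jacobian(scalar, x).shape)  # (1, 3): a single row, the gradient
```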
answered 12 hours ago
mathreadler