Matrix calculus for statistics
I want to calculate the variance of a sum of random variables $(X_1+X_2)$.
Doing statistics, I have to learn maths starting from the end, which is quite difficult yet very interesting (please bear in mind that I only have very basic maths skills).
For now I do this calculation by hand with the formula $\mathrm{var}(X_1+X_2) = \mathrm{var}(X_1) + \mathrm{var}(X_2) + 2\,\mathrm{cov}(X_1,X_2)$.
But I am now facing much larger sums (some with minus signs), and being able to do this with matrix computations would save me a lot of time (and would be very satisfying too).
I have searched through resources on matrix calculus but couldn't find anything usable at my level.
How can I do this calculation from the variance-covariance matrix
$$
\begin{pmatrix}
\mathrm{var}(X_1) & \mathrm{cov}(X_1,X_2) \\
\mathrm{cov}(X_1,X_2) & \mathrm{var}(X_2)
\end{pmatrix}
$$
[preferably extended to subtractions and $n$ terms, like $(X_1+X_2-X_3)$]?
NB: this is not a statistics question and doesn't belong on stats.stackexchange. I want to understand the thought process of turning a scalar calculation into a matrix one.
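For concreteness, here is a minimal numerical check of the scalar formula above. It is only a sketch assuming numpy; the data, seed, and variable names are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.normal(size=1000)               # illustrative sample for X1
x2 = 0.5 * x1 + rng.normal(size=1000)    # correlated sample for X2

# Direct variance of the sum (population variance, ddof=0)
lhs = np.var(x1 + x2)

# Scalar formula: var(X1) + var(X2) + 2 cov(X1, X2)
cov = np.cov(x1, x2, ddof=0)             # 2x2 variance-covariance matrix
rhs = cov[0, 0] + cov[1, 1] + 2 * cov[0, 1]

print(np.isclose(lhs, rhs))              # True
```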
Tags: matrices, descriptive-statistics
asked Nov 22 at 12:31 by Dan Chaltiel · edited Nov 22 at 15:10 by Jean-Claude Arbaut
2 Answers
Accepted answer (score 3) — Jean-Claude Arbaut, answered Nov 22 at 13:00, edited Nov 22 at 13:25
The variance-covariance matrix of $X$ is $\frac1n(X-\bar X)^T(X-\bar X)$, where $X$ is the $n\times p$ data matrix with one column per variable and $\bar X$ is the matrix whose every row is the vector of column means (so $X-\bar X$ is the centered data).
Now, you want to compute the variance of the vector $u=X\beta$ for some weight vector $\beta$. This variance is
$$\operatorname{Var}(u)=\frac1n(u-\bar u)^T(u-\bar u)=\frac1n(X\beta-\bar X\beta)^T(X\beta-\bar X\beta)\\
=\frac1n\beta^T(X-\bar X)^T(X-\bar X)\beta=\beta^T\operatorname{Var}(X)\beta$$
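A minimal numpy sketch of this identity (the data matrix, weights, and seed are illustrative, not from the answer): it builds the variance-covariance matrix as $\frac1n(X-\bar X)^T(X-\bar X)$ and checks that $\beta^T\operatorname{Var}(X)\beta$ matches the variance of $X\beta$ computed directly.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
X = rng.normal(size=(n, 3))             # n x p data matrix, one column per variable
beta = np.array([1.0, 1.0, -1.0])       # weights, e.g. X1 + X2 - X3

# Variance-covariance matrix: (1/n) (X - Xbar)^T (X - Xbar)
Xc = X - X.mean(axis=0)
S = Xc.T @ Xc / n

# Quadratic form beta^T Var(X) beta ...
quad = beta @ S @ beta
# ... equals the variance of the linear combination u = X beta
direct = np.var(X @ beta)               # ddof=0 matches the 1/n convention

print(np.isclose(quad, direct))         # True
```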
+1 Elegant solution
– caverac
Nov 22 at 13:12
Great answer, thanks! I learned a lot from this!
– Dan Chaltiel
Nov 22 at 13:28
Answer (score 2) — caverac, answered Nov 22 at 12:56
The key point here is that
$$
\mathbb{V}{\rm ar}[X] = \mathbb{C}{\rm ov}[X, X]
$$
so that you can express your first expression as
\begin{eqnarray}
\mathbb{V}{\rm ar}[a_1 X_1 + a_2 X_2] &=& a_1^2\,\mathbb{V}{\rm ar}[X_1] + a_2^2\,\mathbb{V}{\rm ar}[X_2] + 2 a_1 a_2\, \mathbb{C}{\rm ov}[X_1, X_2] \\
&=& a_1^2\, \mathbb{C}{\rm ov}[X_1, X_1] + a_2^2\, \mathbb{C}{\rm ov}[X_2, X_2] + 2 a_1 a_2\, \mathbb{C}{\rm ov}[X_1, X_2] \\
&=& a_1^2\, \mathbb{C}{\rm ov}[X_1, X_1] + a_2^2\, \mathbb{C}{\rm ov}[X_2, X_2] + a_1 a_2\, \mathbb{C}{\rm ov}[X_1, X_2] + a_2 a_1\, \mathbb{C}{\rm ov}[X_2, X_1] \\
&=& \sum_{i=1}^2 \sum_{j=1}^2 a_i a_j\, \mathbb{C}{\rm ov}[X_i, X_j]
\end{eqnarray}
In general,
\begin{eqnarray}
\mathbb{V}{\rm ar}[a_1 X_1 + \cdots + a_n X_n] &=& \sum_{i=1}^n \sum_{j=1}^n a_i a_j\, \mathbb{C}{\rm ov}[X_i, X_j]
\end{eqnarray}
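A small numpy sketch of this double-sum formula, assuming made-up data and the coefficients $a=(1,1,-1)$ for $X_1+X_2-X_3$ (all names and the seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
data = rng.normal(size=(3, 1000))        # rows are samples of X1, X2, X3
a = np.array([1.0, 1.0, -1.0])           # coefficients for X1 + X2 - X3

C = np.cov(data, ddof=0)                 # 3x3 matrix of Cov[X_i, X_j]

# Double sum: sum_i sum_j a_i a_j Cov[X_i, X_j]
double_sum = sum(a[i] * a[j] * C[i, j] for i in range(3) for j in range(3))

# Same quantity as the direct variance of the combination
direct = np.var(a @ data)

print(np.isclose(double_sum, direct))    # True
```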
+1, but I think the OP wanted to see how this is done with matrices, that is, how to write it as $a^T\operatorname{Var}(X)a$.
– Jean-Claude Arbaut
Nov 22 at 12:57