Doing regression only with correlation matrix, means, and SDs [duplicate]
This question already has an answer here:
Is there a way to use the covariance matrix to find coefficients for multiple regression? (1 answer)
I was wondering: how is it mathematically possible to run a full regression analysis of a dependent variable (y) on three predictors (x1, x2, x3) knowing only the means, Ns, SDs, and the correlations among these four variables (without the original data)?
I would highly appreciate an R demonstration.
ns    <- c(273, 273, 273, 273)
means <- c(15.4, 7.1, 3.5, 6.2)
sds   <- c(3.4, 0.9, 1.5, 1.4)
r <- matrix(c(
   1.0,  .57, -.40,  .48,
   .57,  1.0, -.61,  .66,
  -.40, -.61,  1.0, -.68,
   .48,  .66, -.68,  1.0), 4)
rownames(r) <- colnames(r) <- paste0('x', 1:4)
r

Tags: r, regression, multiple-regression, regression-coefficients
asked yesterday by rnorouzian

marked as duplicate by whuber♦ yesterday
This question has been asked before and already has an answer. If those answers do not fully address your question, please ask a new question.
You cannot run a truly "full" regression analysis with just these statistics, because you will not be able to construct residuals and perform regression diagnostics that depend on them. You are limited to making and testing parameter estimates.
– whuber♦
yesterday
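To make the comment concrete, here is a hedged sketch (my addition, not part of the thread): assuming the first variable in the summary statistics above is y and applying the standard OLS formulas, the coefficient estimates, their standard errors, and t-tests are all recoverable; only quantities that depend on the individual residuals are out of reach.

n     <- ns[1]; p <- 3                                # sample size, no. of predictors
Rxx   <- r[2:4, 2:4]                                  # predictor correlations
rxy   <- r[2:4, 1]                                    # predictor-y correlations
b_std <- solve(Rxx, rxy)                              # standardized slopes
b     <- b_std * sds[1] / sds[2:4]                    # raw slopes
R2    <- sum(b_std * rxy)                             # multiple R^2
s2    <- (1 - R2) * sds[1]^2 * (n - 1) / (n - p - 1)  # residual variance
Sxx   <- diag(sds[2:4]) %*% Rxx %*% diag(sds[2:4])    # predictor covariance matrix
se    <- sqrt(s2 * diag(solve(Sxx)) / (n - 1))        # slope standard errors
2 * pt(-abs(b / se), df = n - p - 1)                  # two-sided p-values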
1 Answer

answered yesterday by user158565 (4 votes):
Based on that correlation matrix, you can estimate the standardized regression coefficients (except the intercept) as follows (supposing the first column is for y):

$$\left(\begin{matrix} 1.00 & -0.61 & 0.66 \\ -0.61 & 1.00 & -0.68 \\ 0.66 & -0.68 & 1.00 \end{matrix}\right)^{-1}\left(\begin{matrix} 0.57 \\ -0.40 \\ 0.48 \end{matrix}\right)$$
Combining these with the standard deviations, you can convert the standardized regression coefficients into raw (unstandardized) regression coefficients.
Using the information about the means, you can then obtain an estimate of the intercept; see the sketch below.
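A minimal R sketch of these three steps (not part of the original answer), assuming, as in the equation above, that the first variable in the question's summary statistics plays the role of y:

Rxx   <- r[2:4, 2:4]                    # correlations among the predictors
rxy   <- r[2:4, 1]                      # correlations of the predictors with y
b_std <- solve(Rxx, rxy)                # standardized slopes: Rxx^{-1} rxy
b     <- b_std * sds[1] / sds[2:4]      # rescale by the SDs -> raw slopes
b0    <- means[1] - sum(b * means[2:4]) # the means give the intercept
c(intercept = b0, b)

These are exactly the coefficients lm() would report for any data set having these means, SDs, and correlations, since OLS coefficients depend on the data only through those summaries.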
I do not know how to use R.
– user158565
yesterday
How does collinearity influence this?
– PascalVKooten
yesterday
Full collinearity ==> the inverse of that matrix in the answer does not exist; the generalized inverse must be used instead, and there are infinitely many estimates (see the sketch below). Partial collinearity ==> the inverse of that matrix is unstable, i.e., a small change in X can produce a tremendous change in the estimates.
– user158565
yesterday
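A hedged illustration of the fully collinear case (my addition, not part of the comment), using ginv() from the MASS package:

library(MASS)                    # ginv(): Moore-Penrose generalized inverse
Rxx_sing <- matrix(c(1, 1,
                     1, 1), 2)   # two perfectly correlated predictors
rxy_sing <- c(.5, .5)
# solve(Rxx_sing, rxy_sing)      # would fail: the matrix is exactly singular
ginv(Rxx_sing) %*% rxy_sing      # minimum-norm solution: 0.25, 0.25

Any pair of slopes summing to 0.5 fits equally well here; ginv() simply returns the one with the smallest norm.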