$P(x>0 \mid x+y>0)$ of two independently distributed Gaussian distributions?
I am given two independently distributed Gaussian distributions, $X$ and $Y$, with unknown means and variances. You randomly choose an $x$ and $y$ from these two distributions respectively. Given that $x+y>0$, what is the probability that $x>0$ in terms of the unknown parameters?
Suppose $X\sim N(\mu_1, \sigma_1^2)$ and $Y\sim N(\mu_2, \sigma_2^2)$.
I know that $X+Y\sim N(\mu_1+\mu_2, \sigma_1^2+\sigma_2^2)$. I considered using Bayes’ theorem:
$$P(x>0\mid x+y>0)=\frac{P(x>0)\,P(x+y>0\mid x>0)}{P(x+y>0)}.$$
Since I know the distributions of $X$ and $X+Y$, I know that the forms of $P(x>0)$ and $P(x+y>0)$ are that of integrating the Gaussian. However, I am unsure of the form of $P(x+y>0|x>0)$. I feel that this is much easier to calculate when observing the distribution of $X+Y$ on a 2D-plane, but I am not exactly sure how to write it in terms of the unknowns. Could someone please shed some insight on this? Thanks!
Edit: I believe this was meant to be a conceptual question, and the answer should be left as a combination of integrals.
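As a quick numerical sanity check (not required by the question, which keeps the parameters symbolic), the conditional probability can be estimated by Monte Carlo for some assumed example values of the $\mu_i$ and $\sigma_i$:

```python
import numpy as np

# Hypothetical example parameters; the question leaves mu_i, sigma_i unknown.
mu1, sigma1 = 0.5, 1.0
mu2, sigma2 = -0.3, 2.0

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.normal(mu1, sigma1, n)
y = rng.normal(mu2, sigma2, n)

cond = x + y > 0             # restrict to the event {x + y > 0}
p = np.mean(x[cond] > 0)     # sample estimate of P(x>0 | x+y>0)
print(p)
```

With these example values the estimate lands above the unconditional $P(x>0)=\Phi(\mu_1/\sigma_1)\approx 0.69$, as expected: the events $\{x>0\}$ and $\{x+y>0\}$ are positively associated.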
statistics probability-distributions normal-distribution conditional-probability bayes-theorem
One insight is that if these are supposed to be i.i.d., you have four parameters where you should have only two, since $\mu_1=\mu_2$ and $\sigma_1=\sigma_2$.
– C Monsour, Dec 17 '18 at 0:10
@CMonsour Oh gosh, sorry, it should just be independently distributed, not i.i.d.!
– user107224, Dec 17 '18 at 0:12
Even covariances are not enough in general; you are rather after formulas similar to $$P(X+Y>0,\,X>0)=\int_0^\infty\!\int_{-x}^\infty f_Y(y)\,dy\,f_X(x)\,dx.$$ But are you sure you need to solve the question for arbitrary values of the $\mu_i$ and $\sigma_i^2$? Because the answer will be a nightmare to write down, even if rather simple conceptually.
– Did, Dec 17 '18 at 0:27
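To spell out the inner integral in the comment above: since $Y\sim N(\mu_2,\sigma_2^2)$, it is a Gaussian tail probability, so the double integral collapses to a single integral against the standard normal CDF $\Phi$:
$$P(X+Y>0,\,X>0)=\int_0^\infty\left[1-\Phi\!\left(\frac{-x-\mu_2}{\sigma_2}\right)\right]f_X(x)\,dx=\int_0^\infty\Phi\!\left(\frac{x+\mu_2}{\sigma_2}\right)f_X(x)\,dx.$$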
Ah, then you already have $P(X+Y>0,X>0)$, and $$P(X>0)=\int_0^\infty f_X(x)\,dx,$$ and since you know $f_X$ and $f_Y$, you are done.
– Did, Dec 17 '18 at 0:33
Because $\{X+Y>0,\,X>0\}=\{X>0,\,Y>-X\}$.
– Did, Dec 17 '18 at 0:37
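Combining the comments: $P(x>0\mid x+y>0)=P(X>0,\,X+Y>0)/P(X+Y>0)$, and the joint probability reduces to a one-dimensional integral. A minimal quadrature sketch, assuming some example values for the otherwise-unknown parameters (only the standard library's `math.erf` is used to build $\Phi$):

```python
import math

def Phi(z):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

mu1, sigma1 = 0.5, 1.0   # hypothetical example values;
mu2, sigma2 = -0.3, 2.0  # the question keeps them symbolic

def f_X(x):
    # density of X ~ N(mu1, sigma1^2)
    return math.exp(-0.5 * ((x - mu1) / sigma1) ** 2) / (sigma1 * math.sqrt(2 * math.pi))

# P(X+Y>0, X>0) = \int_0^oo Phi((x+mu2)/sigma2) f_X(x) dx, by the trapezoid rule
n, hi = 20000, mu1 + 10 * sigma1          # upper limit far in the Gaussian tail
dx = hi / n
vals = [Phi((i * dx + mu2) / sigma2) * f_X(i * dx) for i in range(n + 1)]
joint = dx * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

# P(X+Y>0) from X+Y ~ N(mu1+mu2, sigma1^2+sigma2^2)
denom = 1.0 - Phi(-(mu1 + mu2) / math.sqrt(sigma1 ** 2 + sigma2 ** 2))

p = joint / denom                         # P(X>0 | X+Y>0)
print(p)
```

For these values the result should agree with a Monte Carlo estimate of the same conditional probability, which is a cheap check that the integration limits encode $\{X>0,\,Y>-X\}$ correctly.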
asked Dec 17 '18 at 0:07 by user107224; edited Dec 17 '18 at 3:27 by user376343