Delta functions/Probability
Here, in an answer, they say:
Now note that it's perfectly reasonable to have a prior that's, say, 2 delta functions at p=0.23 and p=0.88. Combining this prior with a likelihood coming from an observation of an H or T results in some strange function class, which is valid as a posterior as well. As you can see from the above example, using conjugate priors has nice properties, where it might be easy to, say, build a sequential estimation algorithm that updates the belief about the probable value of p every time you get a new observation. This wouldn't be computationally very easy had you started off with a prior that was 2 delta functions.
Question: I do not follow this: why do the two probabilities at $p=0.23$ and $p=0.88$ not have to sum up to $1$?
statistics probability-distributions statistical-inference bayesian dirac-delta
asked Dec 28 '18 at 21:30 by user122424; edited Dec 29 '18 at 1:14 by Henry
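For concreteness, the "nice properties" of conjugate priors that the quoted passage mentions amount to a one-line update rule: with a Beta$(a,b)$ prior on $p$, each observed H or T simply increments one of the two parameters. The following is only a rough sketch of that idea (the Beta$(1,1)$ starting point, the observation sequence, and the function name are illustrative assumptions, not anything taken from the linked answer):

# Sequential Bayesian updating of a Bernoulli probability p under a conjugate
# Beta(a, b) prior: observing heads (1) gives Beta(a+1, b), tails (0) gives
# Beta(a, b+1).  Illustrative only; the prior and the flip sequence are made up.

def beta_update(a, b, obs):
    """Posterior Beta parameters after one coin flip (1 = heads, 0 = tails)."""
    return (a + 1, b) if obs == 1 else (a, b + 1)

a, b = 1.0, 1.0                      # uniform Beta(1, 1) prior on p
for obs in [1, 0, 1, 1, 0, 1]:       # an arbitrary sequence of flips
    a, b = beta_update(a, b, obs)
    print(f"posterior Beta({a:.0f}, {b:.0f}), posterior mean of p = {a / (a + b):.3f}")

By contrast, a two-delta prior keeps all of its mass on the two points $0.23$ and $0.88$, and each observation merely reweights those two points, as the answer below works out.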
Just reading what is written, I assume the author is saying that the prior estimate of the probability of getting $H$ is that it is $p=.23$ or $p=.88$ with equal probability. Not sure why such odd numbers were used. Why not ask the writer directly?
– lulu, Dec 28 '18 at 21:33

I've asked him some more questions, but the author doesn't seem to have read them. The oddness of the numbers would be OK, but when standard delta functions are used they should sum up to $1$. But I'm fine with your answer "with equal probability".
– user122424, Dec 28 '18 at 21:44

I haven't read all the details of the question, so I wouldn't swear to the "equal probability". But it seems clear that the author is imagining that our prior is that $p_H$ is either $.23$ or $.88$. And, yes, the probabilities of those two cases would be expected to add to $1$.
– lulu, Dec 28 '18 at 21:50

Even if the delta functions involved were slightly modified, would it still be necessary that they sum up to $1$?
– user122424, Dec 28 '18 at 21:58

Modification? Well, they are multiplied by constants: both $\frac12$ in the equal-probability case.
– lulu, Dec 28 '18 at 21:59
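To restate lulu's last comment in symbols (with $q$ as a general prior weight; $q=\frac12$ is the equal-probability case), the two-delta prior is
$$\pi(p) \;=\; q\,\delta(p-0.23) \;+\; (1-q)\,\delta(p-0.88), \qquad \int_0^1 \pi(p)\,dp \;=\; q + (1-q) \;=\; 1,$$
so it is the constants multiplying the two delta functions, not anything about the locations $0.23$ and $0.88$ themselves, that must sum to $1$.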
1 Answer
Suppose you start convinced that $p=0.23$ or $p=0.88$, and your prior probability distribution is that the first is the case with probability $q$ and the second with probability $1-q$; these have $q+(1-q)=1$. Perhaps you have two precision-made biased coins which look identical, and you do not know which one is in your hand.
You then experiment and get $h$ successes (heads) and $t$ failures (tails). Your posterior probability that in fact $p=0.23$ should now be $$\dfrac{q \, 0.23^h \, 0.77^t}{q \, 0.23^h \, 0.77^t + (1-q) \, 0.88^h \, 0.12^t}$$ and the posterior probability for $p=0.88$ is similarly $\frac{(1-q) \, 0.88^h \, 0.12^t}{q \, 0.23^h \, 0.77^t + (1-q) \, 0.88^h \, 0.12^t}$, with these two expressions adding up to $1$. I would call this computationally easy.
If you thought originally that each possibility was equally likely, so your prior had $q=\frac12$, then the $q$s and $(1-q)$s can be cancelled, slightly simplifying the expression. For example, if you had
- $q=\frac12$ and $h=11$ and $t=9$, you would get a posterior probability for $p=0.23$ of about $0.8776$ and a posterior probability for $p=0.88$ of about $0.1224$;
- $q=\frac12$ and $h=12$ and $t=8$, you would get a posterior probability for $p=0.23$ of about $0.2260$ and a posterior probability for $p=0.88$ of about $0.7740$;
showing how sensitive the posterior is to the experimental results. Both these outcomes would be rather unlikely if you were correct to start convinced that $p=0.23$ or $p=0.88$ were the only possible realities.
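As a quick check of those figures, here is a small numeric sketch of the same two-point posterior computation (the function name is made up and the log-space arithmetic is just one convenient way to avoid underflow when $h+t$ is large; it is not code from this answer):

# Posterior over the two candidate values of p, given h heads and t tails and
# prior weight q on p = 0.23.  Reproduces the figures quoted above.
from math import exp, log

def two_point_posterior(h, t, q=0.5, p1=0.23, p2=0.88):
    """Return (P[p = p1 | data], P[p = p2 | data]) for a two-point prior."""
    w1 = log(q) + h * log(p1) + t * log(1 - p1)       # log of q * p1^h * (1-p1)^t
    w2 = log(1 - q) + h * log(p2) + t * log(1 - p2)   # log of (1-q) * p2^h * (1-p2)^t
    m = max(w1, w2)                                   # stabilise before exponentiating
    z = exp(w1 - m) + exp(w2 - m)
    return exp(w1 - m) / z, exp(w2 - m) / z

print(two_point_posterior(h=11, t=9))   # approx. (0.8776, 0.1224)
print(two_point_posterior(h=12, t=8))   # approx. (0.2260, 0.7740)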
answered Dec 29 '18 at 1:12 by Henry