Limit distribution of infinite sum of Bernoulli random variables
I know that a finite sum of i.i.d. Bernoulli random variables has a binomial distribution, but what is the distribution of $$\lim_{n \to \infty}\sum_{k=1}^{n} \frac{x_k}{2^k},$$ where each $x_k$ is a Bernoulli random variable with parameter $\frac12$?
probability probability-theory probability-distributions
asked May 5 '15 at 22:43 by user165576 · edited Feb 25 '16 at 9:41 by tonytonov
Are the random variables $x_k$ supposed to be independent? – r.e.s. May 6 '15 at 0:00

It should be noted that the limit is suggestively written in the form $0.x_1x_2x_3\ldots$ as the base-$2$ representation of a random real number in the unit interval. – r.e.s. May 6 '15 at 0:05

A related question is "If $x$ is uniformly distributed on the unit interval, then what is the distribution of the binary digits of $x$?" – r.e.s. May 6 '15 at 0:14
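To illustrate the last two comments numerically, here is a minimal Python sketch of my own (assuming NumPy is available): it draws uniform numbers on $(0,1)$ and checks that each of their leading binary digits behaves like a fair coin flip.

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.random(100_000)  # uniform samples on (0, 1)

# The k-th binary digit of u is floor(2^k * u) mod 2; each digit of a
# Uniform(0,1) number should be Bernoulli(1/2), so every mean is near 0.5.
for k in range(1, 11):
    digits = np.floor(u * 2**k).astype(int) % 2
    print(f"digit {k}: mean = {digits.mean():.4f}")
```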
2 Answers
An approach using moment generating functions is possible.

Define $Y_k = X_k/2^k$, $k = 1, 2, \ldots$. Then $$M_{Y_k}(t) = \operatorname{E}[e^{tY_k}] = e^0 \Pr[Y_k = 0] + e^{t/2^k} \Pr[Y_k = 2^{-k}] = \frac{e^{t/2^k} + 1}{2}.$$ Now defining $$S_n = \sum_{k=1}^n Y_k,$$ we find, using the independence of the $Y_k$, that the MGF of $S_n$ is $$\begin{align*} M_{S_n}(t) &= \operatorname{E}\left[\prod_{k=1}^n e^{tY_k}\right] = \prod_{k=1}^n M_{Y_k}(t) \\ &= 2^{-n} \prod_{k=1}^n (e^{t/2^k} + 1) \\ &= \frac{2^{-n}}{e^{t/2^n} - 1} (e^{t/2^n} - 1)(e^{t/2^n} + 1) \prod_{k=1}^{n-1} (e^{t/2^k} + 1) \\ &= \frac{2^{-n}}{e^{t/2^n} - 1} (e^{t/2^{n-1}} - 1) \prod_{k=1}^{n-1} (e^{t/2^k} + 1) \\ &= \frac{e^t - 1}{2^n(e^{t/2^n} - 1)}, \end{align*}$$ where the last line follows by repeating the telescoping step $(e^{t/2^k} - 1)(e^{t/2^k} + 1) = e^{t/2^{k-1}} - 1$ down to $k = 1$. Since $2^n(e^{t/2^n} - 1) \to t$ as $n \to \infty$, we easily get $$M_{S_\infty}(t) = \frac{e^t-1}{t},$$ which is the MGF of a $\operatorname{Uniform}(0,1)$ distribution.

answered May 6 '15 at 0:52 by heropup
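As a quick sanity check on this derivation, here is a short simulation sketch of my own (assuming NumPy and SciPy are available): it samples the truncated sum $S_{30}$ many times, runs a Kolmogorov–Smirnov test against $\operatorname{Uniform}(0,1)$, and compares the empirical MGF at $t=1$ with the predicted value $e-1$.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, samples = 30, 100_000

# x[i, k] is the (k+1)-th Bernoulli(1/2) digit of the i-th sample.
x = rng.integers(0, 2, size=(samples, n))
weights = 0.5 ** np.arange(1, n + 1)  # 2^{-k} for k = 1, ..., n
s = x @ weights                       # realizations of S_n

print(stats.kstest(s, "uniform"))     # should not reject uniformity on (0, 1)
print(np.mean(np.exp(s)), np.e - 1)   # empirical vs. exact MGF at t = 1
```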
Great answer, and the proof can also be generalized for arbitrary base $b$. Do you know any source I can link to in my paper? – tonytonov Feb 25 '16 at 9:26
Let $$S_n = \sum_{k=1}^n 2^{-k}X_k.$$ As $$0\leqslant S_n\leqslant \sum_{k=1}^n 2^{-k} \leqslant 1$$ almost surely for all $n$, this bound also holds in the limit, so by dominated convergence,
$$
\begin{align*}
\mathbb E\left[\lim_{n\to\infty}S_n\right] &= \lim_{n\to\infty}\mathbb E[S_n]\\
&= \lim_{n\to\infty}\mathbb E\left[ \sum_{k=1}^n 2^{-k}X_k\right]\\
&= \lim_{n\to\infty}\sum_{k=1}^n 2^{-k}\,\mathbb E[X_k]\\
&= \lim_{n\to\infty}\sum_{k=1}^n 2^{-(k+1)}\\
&= \frac12\sum_{k=1}^\infty 2^{-k}\\
&= \frac12.
\end{align*}
$$
However, the limiting distribution is actually continuous. For each $n$, $S_n$ is uniformly distributed over
$$E_n=\left\{\sum_{k\in S}2^{-k} : S\subseteq\{1,2,\ldots,n\}\right\}.$$
Given an $n$ and some $\omega\in E_n$, we have, for $m\geqslant n$,
$$\mathbb P(S_m = \omega)=2^{-m},$$
hence
$$\lim_{m\to\infty} \mathbb P(S_m = \omega)=0,$$
so no single point carries positive mass in the limit. @gmath suggests in this answer that the limiting distribution is actually uniform on the interval $(0,1)$.

answered May 5 '15 at 23:24 by Math1000 · edited Apr 13 '17 at 12:19
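To make the discrete picture concrete, here is a tiny sketch of my own (Python standard library only): it enumerates the support $E_n$ for $n=4$ and confirms that $S_n$ puts mass $2^{-n}$ on each of its $2^n$ points.

```python
from itertools import product
from collections import Counter

n = 4
# Each digit sequence (x_1, ..., x_n) gives one value of S_n; count them.
values = Counter(
    sum(x / 2**k for k, x in enumerate(bits, start=1))
    for bits in product((0, 1), repeat=n)
)

# 2^n distinct values, each hit by exactly one sequence => probability 2^{-n}.
print(len(values), "distinct values, each with probability",
      values.most_common(1)[0][1] / 2**n)
```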