Proving mean of sample minimum of U[0,1] is 1/(n+1) without calculus
Let $U : \mathbb{R} \times \mathbb{R} \nrightarrow \mathrm{dist}[\mathbb{R}]$ denote the parametrized family of uniform distributions, where $U(a, b)$ is the uniform distribution with minimum $a$ and maximum $b$. $U$ is a partial function, defined whenever $a \lt b$.
Let $Y_1, Y_2, \dots, Y_n$ be an $n$-element sample drawn from $U(0,1)$, let $Y_{(1)}$ denote the sample minimum, let $f$ denote the pdf of $Y_{(1)}$, and let $F$ denote the corresponding cdf.
I want to demonstrate that its expected value is $\frac{1}{n+1}$ in a way that's as simple as possible, ideally without using calculus:
$$ \mathrm{E}[Y_{(1)}] = \frac{1}{n+1} $$
Here's one way to do it, which does use calculus:
$$ \mathrm{E}[Y_{(1)}] \tag{1}$$
Use the known formula for the expectation of a nonnegative random variable in terms of its cdf:
$$ \int_{s=0}^\infty 1-F(s) \,\mathrm{d}s \tag{2}$$
$F(s) = 1$ when $s \ge 1$, so the integrand vanishes beyond $1$:
$$ \int_{s=0}^{1} 1 - F(s) \,\mathrm{d}s \tag{3} $$
$F(s)$ is the probability that the sample minimum is at most $s$, i.e. that at least one draw is at most $s$:
$$ \int_{s=0}^{1} 1 - \mathbb{P}[Y_1 \le s \lor \cdots \lor Y_n \le s] \,\mathrm{d}s \tag{4} $$
$1-\mathbb{P}[\psi]$ is $\mathbb{P}[\lnot\psi]$ (and since the distribution is continuous, $>$ and $\ge$ agree up to events of probability zero):
$$ \int_{s=0}^{1} \mathbb{P}[Y_1 \ge s \land \cdots \land Y_n \ge s] \,\mathrm{d}s \tag{5} $$
By independence the joint probability factors into a product, and each factor is $\mathbb{P}[Y_i \ge s] = 1-s$:
$$ \int_{s=0}^{1} (1-s)^n \,\mathrm{d}s \tag{6} $$
Change variable $t = 1-s$ and swap the bounds of the integral:
$$ \int_{t=0}^{1} t^n \,\mathrm{d}t \tag{7} $$
Simplify:
$$ \frac{1}{n+1} \tag{8} $$
QED
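As a sanity check on the result (a minimal Monte Carlo sketch using only the standard library; the function name, seed, and trial count are my own choices, not from the question), we can estimate $\mathrm{E}[Y_{(1)}]$ by simulation and compare it with $\frac{1}{n+1}$:

```python
import random

def mean_sample_min(n, trials=200_000, seed=0):
    # Monte Carlo estimate of E[min(Y_1, ..., Y_n)] for n i.i.d. U(0,1) draws.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += min(rng.random() for _ in range(n))
    return total / trials

for n in (1, 2, 5, 10):
    # Estimated mean vs. the claimed closed form 1/(n+1).
    print(n, round(mean_sample_min(n), 4), round(1 / (n + 1), 4))
```

With 200,000 trials the standard error is well under $10^{-3}$, so the estimates should match $\frac{1}{n+1}$ to about three decimal places.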
Tags: alternative-proof, order-statistics
asked Dec 24 '18 at 1:49 by Gregory Nisbet (edited Dec 25 '18 at 1:15)
1 Answer
Take a circle of circumference $1$, and choose $n+1$ points independently and uniformly on it. Cut the circle at the first point and unwrap it to the interval $[0,1]$; the remaining $n$ points are distributed uniformly and independently on that interval. Those points cut $[0,1]$ into $n+1$ subintervals, in order. What's the expected length of each of them? Pull back to the circle: there was nothing special about cutting at the first point. Cutting at any of the other points cycles the subintervals around while keeping the same picture of $n$ uniform points after the cut. Thus the lengths of all $n+1$ subintervals are identically distributed; in particular, they have the same mean. As their sum is $1$ and expectation is linear, the mean of each must be $\frac{1}{n+1}$. Done.
This also gives the expected values of all the other order statistics: the $k$th smallest has expected value $\frac{k}{n+1}$.
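The claim about the other order statistics can also be checked by simulation (a standard-library sketch; the function name, seed, and trial count are illustrative choices, not part of the answer): sort each sample and average the $k$th smallest value, then compare with $\frac{k}{n+1}$.

```python
import random

def mean_order_stat(k, n, trials=200_000, seed=1):
    # Monte Carlo estimate of E[k-th smallest of n i.i.d. U(0,1) draws].
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += sorted(rng.random() for _ in range(n))[k - 1]
    return total / trials

n = 7
for k in range(1, n + 1):
    # Estimated mean of the k-th order statistic vs. the claimed k/(n+1).
    print(k, round(mean_order_stat(k, n), 4), round(k / (n + 1), 4))
```

For $n = 7$ the estimates should land within a few thousandths of $\frac{1}{8}, \frac{2}{8}, \dots, \frac{7}{8}$.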
answered Dec 24 '18 at 2:00 by jmerry