Example of a maximum likelihood estimator that is not a sufficient statistic
I am currently working on some bounds for estimation using information-theoretic tools (I won't expand on that here for now; I may make a post about it later). It turns out that, given a phenomenon $X$ and an observation $Y$, the maximum likelihood estimator $\hat{x}(Y)$ of $X$ based on $Y$ may not be a sufficient statistic, and this is something I would like to study. The answer to this post states that such an example exists when $Y$ consists of samples that are not i.i.d., but it fails to provide one, and I haven't found anything about it. Has anyone seen an example of this sort?
statistics estimation maximum-likelihood data-sufficiency
asked Dec 20 '18 at 11:00
P. Quinton
1 Answer
If $T$ is a sufficient statistic for $\theta$ and a unique MLE of $\theta$ exists, then the MLE must be a function of $T$.
So if you can find a situation where there are several maximum likelihood estimators, there remains the possibility of choosing an MLE that is not a function of a sufficient statistic alone.
A simple example to consider is the $U(\theta,\theta+1)$ distribution.
Consider i.i.d. random variables $X_1,X_2,\ldots,X_n$ having the above distribution.
Then the likelihood function given the sample $(x_1,\ldots,x_n)$ is
$$L(\theta)=\prod_{i=1}^n \mathbf{1}_{\theta<x_i<\theta+1}=\mathbf{1}_{\theta<x_{(1)},\,x_{(n)}<\theta+1}\quad,\,\theta\in\mathbb{R}$$
A sufficient statistic for $\theta$ is $$T(X_1,\ldots,X_n)=(X_{(1)},X_{(n)})$$
And an MLE of $\theta$ is any $\hat\theta$ satisfying $$\hat\theta<X_{(1)},\quad X_{(n)}<\hat\theta+1,$$
or equivalently, $$X_{(n)}-1<\hat\theta<X_{(1)}. \tag{1}$$
Choose $$\hat\theta'=(\sin^2 X_1)(X_{(n)}-1)+(\cos^2 X_1)\,X_{(1)}.$$
Then $\hat\theta'$ satisfies $(1)$ with probability one (it is a convex combination of the two endpoints, with weights $\sin^2 X_1$ and $\cos^2 X_1$ strictly between $0$ and $1$ almost surely), but it does not depend on the sample only through $T$.
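This can be checked numerically. Below is a small sketch in Python/NumPy (the helper name `mle_prime`, the seed, and the choice $\theta=2$ are my own for the demo): swapping two observations leaves $T=(X_{(1)},X_{(n)})$ unchanged but generally changes $\hat\theta'$, so $\hat\theta'$ is not a function of $T$ alone.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0  # hypothetical true parameter, chosen for the demo
n = 5

def mle_prime(x):
    """The estimator above: sin^2(x1)*(max-1) + cos^2(x1)*min."""
    w = np.sin(x[0]) ** 2
    return w * (x.max() - 1.0) + (1.0 - w) * x.min()

x = rng.uniform(theta, theta + 1.0, size=n)

# mle_prime(x) is a valid MLE: it lies strictly inside the interval of (1)
est = mle_prime(x)
assert x.max() - 1.0 < est < x.min()

# But it is not a function of T = (min, max) alone: swapping two observations
# is a permutation, so (min, max) is unchanged, yet X_1 (and the estimate) change.
y = x.copy()
y[0], y[1] = x[1], x[0]
assert (y.min(), y.max()) == (x.min(), x.max())
print(mle_prime(x), mle_prime(y))  # generally two different values
```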
I am a bit curious: do you think there is always an element in the set of MLEs that is a sufficient statistic? (In this case yes, if we take $(X_{(1)}+X_{(n)})/2$.) And is it unique? I am also interested in other measures of error: do you know of examples of a minimum-MSE estimator that is not a sufficient statistic either?
– P. Quinton
Dec 20 '18 at 16:40
@P.Quinton There is of course an MLE that is a function of $T$. Any MLE of the form $\alpha(X_{(n)}-1)+(1-\alpha)X_{(1)}$, where $\alpha\in(0,1)$ is a constant, is a function of the sufficient statistic $T$.
– StubbornAtom
Dec 20 '18 at 18:10
Yes, but that's not really what I asked. In my case I was wondering whether $\theta - \hat\theta - (X_1, \dots, X_n)$ is a Markov chain (supposing that $\theta$ is a random variable). I think that in your formula, if we know $\alpha$ then we can recover $X_{(1)}$ and $X_{(n)}$, so it's a sufficient statistic; is that correct?
– P. Quinton
Dec 20 '18 at 18:16
@P.Quinton Well, that is out of my league to answer. I only aimed to give you an MLE that is not a function of a sufficient statistic, as asked in your post. An MLE of the form $\alpha(X_{(n)}-1)+(1-\alpha)X_{(1)}$ satisfies condition $(1)$ in my answer. If we choose $\alpha\in(0,1)$ to be a constant, then the MLE depends on the sample only through $T$.
– StubbornAtom
Dec 20 '18 at 18:19
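Complementing the exchange above, here is a minimal Python check (the helper name `mle_alpha` and the sample values are my own) that a constant-$\alpha$ combination depends on the sample only through $T=(X_{(1)},X_{(n)})$: two samples sharing the same minimum and maximum yield the same estimate.

```python
import numpy as np

def mle_alpha(x, alpha=0.5):
    """Constant-alpha MLE: alpha*(max - 1) + (1 - alpha)*min, with alpha in (0, 1)."""
    return alpha * (x.max() - 1.0) + (1.0 - alpha) * x.min()

a = np.array([2.1, 2.5, 2.9])
b = np.array([2.1, 2.7, 2.9])  # same min and max, different middle value

assert mle_alpha(a) == mle_alpha(b)            # depends only on T = (min, max)
assert a.max() - 1.0 < mle_alpha(a) < a.min()  # satisfies condition (1)
```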
edited Dec 20 '18 at 14:16
answered Dec 20 '18 at 13:56
StubbornAtom
Thanks for contributing an answer to Mathematics Stack Exchange!