Is there any counterexample to show that $X,Y$ are two random variables and $E(X\mid Y)=E(X)$, but $X$ and $Y$ are not independent?
Is there a counterexample showing that two random variables $X,Y$ can satisfy $E(X\mid Y)=E(X)$ without being independent?



I already know that if $X,Y$ are independent, then $E(X\mid Y)=E(X)$.
probability probability-theory expectation conditional-expectation
asked Jun 11 '18 at 0:53 by NYRAHHH (edited Jun 11 '18 at 3:18 by Michael Hardy)
  • Uncorrelated random variables. – Key Flex, Jun 11 '18 at 0:59
3 Answers
\begin{align}
X & = \begin{cases} -1 \\ \phantom{-}0 & \text{each with probability } 1/3 \\ +1 \end{cases} \\[10pt]
Y & = X^2
\end{align}
Then $\operatorname E(X\mid Y=0) = 0$ and $\operatorname E(X\mid Y=1) = 0,$ so $\operatorname E(X\mid Y) = 0$ with probability $1.$ But $X$ and $Y$ are very far from independent.
answered Jun 11 '18 at 3:24 by Michael Hardy (edited Jun 11 '18 at 5:26 by StubbornAtom)
  • This is also a simple example for 'uncorrelated does not imply independence'. – StubbornAtom, Jun 11 '18 at 5:28
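
A quick numerical sanity check of this construction (a sketch with numpy; not part of the original answer): the conditional mean of $X$ is $0$ for each value of $Y$, yet a joint probability differs from the product of the marginals.

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.choice([-1, 0, 1], size=1_000_000)  # X uniform on {-1, 0, 1}
    y = x ** 2                                   # Y = X^2

    # Conditional means: both ~0, matching E(X | Y) = 0 with probability 1
    print(x[y == 0].mean())   # exactly 0: X = 0 whenever Y = 0
    print(x[y == 1].mean())   # ~0 by symmetry of -1 and +1

    # Dependence: P(X = 1, Y = 1) = 1/3, but P(X = 1) P(Y = 1) = (1/3)(2/3)
    print(np.mean((x == 1) & (y == 1)))          # ~0.333
    print(np.mean(x == 1) * np.mean(y == 1))     # ~0.222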
Consider a two-component Gaussian mixture model with
\begin{align}
& X \mid Y = 0 \sim N(0, 1), \\
& X \mid Y = 1 \sim N(0, 4), \\
& P(Y = 0) = P(Y = 1) = 1/2.
\end{align}



Clearly $E(X \mid Y = 0) = E(X \mid Y = 1) = E(X) = 0$, but $X$ and $Y$ are not independent. On the one hand,
$$P(-1 \leq X \leq 1, Y = 0) = P(-1 \leq X \leq 1 \mid Y = 0) P(Y = 0) = \Phi(1) - 0.5.$$

On the other hand,
\begin{align}
P(-1 \leq X \leq 1) & = P(-1 \leq X \leq 1 \mid Y = 0)P(Y = 0) + P(-1 \leq X \leq 1 \mid Y = 1)P(Y = 1) \\
&= \Phi(1) - 0.5 + \Phi(0.5) - 0.5 = \Phi(1) + \Phi(0.5) - 1.
\end{align}

Therefore
$$P(-1 \leq X \leq 1)P(Y = 0) = 0.5\Phi(1) + 0.5\Phi(0.5) - 0.5 \neq P(-1 \leq X \leq 1, Y = 0),$$
i.e., $X$ and $Y$ are not independent.
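
As a numerical check of these two quantities (a sketch using scipy's standard normal CDF for $\Phi$; not part of the original answer):

    from scipy.stats import norm

    Phi = norm.cdf
    joint = Phi(1) - 0.5                           # P(-1 <= X <= 1, Y = 0)
    product = 0.5 * Phi(1) + 0.5 * Phi(0.5) - 0.5  # P(-1 <= X <= 1) P(Y = 0)
    print(joint, product)                          # ~0.3413 vs ~0.2664, not equal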





Per Clement's request, let me add more details to show that the random variable $E(X \mid Y)$ is actually the constant $0$. In fact, as a random variable,
$Z(\omega) = E[X \mid Y](\omega)$ is constant on each of the sets $\{\omega: Y(\omega) = 1\}$ and $\{\omega: Y(\omega) = 0\}$ (this is a general result that holds for any conditional expectation of the form $E[X \mid Y]$; see, for example, Theorem 9.1.2 of A Course in Probability Theory). As the union of these two sets is the whole sample space $\Omega$, it follows that $E[X \mid Y] \equiv 0$.



To avoid the above technicalities, which need more advanced probability theory, you may construct a discrete counterexample using the same idea. How about a $2 \times 4$ table? The columns are the possible values of $X$ and the rows are the possible values of $Y$.



\begin{array}{c|cccc}
 & -1 & 1 & -2 & 2 \\
\hline
0 & 1/4 & 1/4 & 0 & 0 \\
1 & 0 & 0 & 1/4 & 1/4
\end{array}
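
The same two checks for this table, sketched in numpy with the joint distribution hard-coded from the array above (not part of the original answer):

    import numpy as np

    vals_x = np.array([-1, 1, -2, 2])
    joint = np.array([[0.25, 0.25, 0.0, 0.0],    # row Y = 0
                      [0.0, 0.0, 0.25, 0.25]])   # row Y = 1

    p_y = joint.sum(axis=1)                      # marginal of Y: [1/2, 1/2]
    cond_mean = (joint * vals_x).sum(axis=1) / p_y
    print(cond_mean)                             # [0. 0.] -> E(X | Y) = 0 = E(X)

    p_x = joint.sum(axis=0)                      # marginal of X
    print(joint[0, 0], p_y[0] * p_x[0])          # 0.25 vs 0.125: dependent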
answered Jun 11 '18 at 1:19 by Zhanxiong (edited Dec 5 '18 at 13:54)
  • $E[X\mid Y]$ is a random variable, not a value, here (i.e., $E[X\mid Y=0]$ is not the thing the OP is asking about, even in terms of "type"). – Clement C., Jun 11 '18 at 1:21

  • Of course I know that. Under this model, $E(X \mid Y) = 0$ for all $\omega$. – Zhanxiong, Jun 11 '18 at 1:22

  • It may be worth writing it, then? Otherwise, it's not immediate that the answer actually answers the question. – Clement C., Jun 11 '18 at 1:23

  • Thank you. – Clement C., Jun 11 '18 at 1:40
Let $X,Y$ have a uniform density over the diamond with vertices $(0,1),(1,0),(0,-1),(-1,0)$, or over the unit circle.

Then $X,Y$ are not independent [*], but $E(X \mid Y) = E(X) = 0$.

For another example, you could take the triangle with vertices $(-1,0),(0,1),(1,0)$. In this case $E(X \mid Y) = E(X)$ but $E(Y \mid X) \ne E(Y)$: each horizontal cross-section of the triangle is symmetric about $x = 0$, so $E(X \mid Y) = 0$, while given $X = x$ the variable $Y$ is uniform on $(0, 1 - |x|)$, so $E(Y \mid X) = (1 - |x|)/2$ depends on $X$.

Notice, by the way, that $E(X \mid Y) = E(X)$ implies $E(XY) = E(X)E(Y)$ (uncorrelated), but not the reverse.

[*] If the marginals have support over finite intervals, then the support of the joint density must be a rectangle for the variables to be independent. Put another way, if for some point $f_{X,Y}(x_0,y_0) = 0$ but $f_X(x_0) > 0$ and $f_Y(y_0) > 0$, then the variables cannot be independent.
answered Jun 11 '18 at 1:43 by leonbloy (edited Jun 11 '18 at 3:19 by triple_sec)
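
A Monte Carlo check of the diamond example (a sketch, not part of the original answer; it samples the diamond by rejection from the bounding square):

    import numpy as np

    rng = np.random.default_rng(1)
    pts = rng.uniform(-1, 1, size=(4_000_000, 2))
    x, y = pts[:, 0], pts[:, 1]
    inside = np.abs(x) + np.abs(y) < 1           # uniform on the diamond |x|+|y| < 1
    x, y = x[inside], y[inside]

    # E(X | Y in a thin band) is ~0 for every band, consistent with E(X | Y) = E(X) = 0
    print(x[np.abs(y - 0.5) < 0.05].mean())

    # But X and Y are dependent: the support is not a rectangle, e.g.
    # P(X > 0.5, Y > 0.5) = 0 while P(X > 0.5) P(Y > 0.5) > 0
    print(np.mean((x > 0.5) & (y > 0.5)), np.mean(x > 0.5) * np.mean(y > 0.5))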