Conditional Expectation of Uniform-Product












So say I have $U_1, U_2, \cdots$ iid $U[0,1]$, and let $X_n = \prod_{i=1}^n U_i$. If $N = \inf\{n : X_n < 10^{-6}\}$, I need to find two expectations:



1) $E[N]$



2) $E[X_N]$



My initial thought was to convert the uniform random variables to exponentials by taking the negative log, i.e. $-\ln(U_i) = W_i$. Thus we have $Y_n = -\ln(X_n) = \sum_{i=1}^n W_i$, which transforms $N$ into
$$N = \inf\{n : Y_n > 6\ln(10)\},$$
which kind of screams coupon collector problem. But now the sampling times are no longer uniform, so I am not sure how to actually calculate the expectation.
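
For concreteness, here is a quick numerical sanity check of this equivalence (my own sketch in Python; the seed and sample-path length are arbitrary):

    import numpy as np

    rng = np.random.default_rng(1)
    u = rng.random(200)                 # one sample path of U_1, U_2, ...
    x = np.cumprod(u)                   # X_n = U_1 * ... * U_n
    y = -np.log(x)                      # Y_n = W_1 + ... + W_n with W_i = -ln(U_i)

    t = 6 * np.log(10)
    n_prod = np.argmax(x < 1e-6) + 1    # first n with X_n < 10^{-6}
    n_sum = np.argmax(y > t) + 1        # first n with Y_n > 6 ln(10)
    assert n_prod == n_sum              # the two stopping rules pick the same n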



Any hints/help on finding the first expectation would be greatly appreciated; any hint on even starting the second would be phenomenal.










combinatorics probability-distributions conditional-expectation

asked Oct 26 '18 at 19:18 by ManikSin · edited Dec 21 '18 at 9:15 by Lee David Chung Lin

1 Answer

Actually, you were very close to finishing the first question.



Each $W_i = -\ln U_i$ follows the standard exponential distribution and can be viewed as the waiting time between consecutive occurrences of a simple Poisson process with unit rate.



In other words, the sum $Y_n = \sum_{i=1}^n W_i$ follows the Gamma distribution with shape parameter $n$ and is the waiting time for the $n$-th occurrence of the unit-rate Poisson process.



You have a cutoff time of $t = 6 \ln 10$. Consider a random variable $N_t$ that follows the discrete Poisson distribution with probability mass function



$$P(N_t = k) = e^{-\lambda t} \frac{ (\lambda t)^k }{ k! }~, \quad \text{where}~ \lambda = 1,~ t = 6 \ln 10$$



That is, $N_t$ with parameter $\lambda t$ counts the total number of occurrences, over a time interval of length $t$, of a Poisson process of rate $\lambda$.



Now, $N$ is related to the random variable $N_t$ by $N = N_t + 1$.



Formally, we have the equivalence $\{ Y_n > t \} = \{ N_t < n \}$ between the waiting-time events of the Poisson process and the count events. Therefore we have the following equivalence of events:



\begin{alignat}{2}
\{ N = 1 \} &= \{ Y_1 > t \} = \{ N_t < 1 \} &&= \{ N_t = 0 \} \\
\{ N = 2 \} &= \{ Y_1 < t ~\&~ Y_2 > t \} &&= \{ N_t = 1 \} \\
\{ N = 3 \} &= \{ Y_2 < t ~\&~ Y_3 > t \} &&= \{ N_t = 2 \} \\
&\qquad\qquad \vdots \\
\{ N = k \} &= \{ Y_{k-1} < t ~\&~ Y_k > t \} &&= \{ N_t = k-1 \} \\
&\qquad\qquad \vdots
\end{alignat}



Thus the answer is simply one plus the expectation of a discrete Poisson distribution: $E[N] = 1 + E[N_t] = 1 + \lambda t = 1 + t = 1 + 6 \ln 10 \approx 14.81551$.
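
A short Monte Carlo check of this value (an illustrative sketch, not part of the derivation; seed and trial count are arbitrary):

    import numpy as np

    rng = np.random.default_rng(0)
    threshold = 1e-6                 # stop once the running product drops below this

    def sample_N(rng, threshold):
        """Multiply iid U(0,1) draws until the product falls below the threshold."""
        product, n = 1.0, 0
        while product >= threshold:
            product *= rng.random()
            n += 1
        return n

    samples = [sample_N(rng, threshold) for _ in range(200_000)]
    print("simulated E[N]:", np.mean(samples))      # ~14.8155
    print("1 + 6 ln 10   :", 1 + 6 * np.log(10))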





For the second question, $E[X_N]$, it is tempting to view it as $E[ e^{-Y_N} ]$ and do the decomposition conditioning on $N$.



However, that leads to a sum of integrals over the region $Y_{N-1} + W_N > t$, which complicates things unnecessarily.



Since it all ends up the same, one might as well do the decomposition directly on the distribution of the product of uniforms, whose derivation stems from the same logarithm transformation you used.



The density of the unconditional $X_n$ is $f_{X_n}(x) = \frac{ (-\ln x)^{n-1} }{ (n-1)! }$ for $0 < x < 1$. Note that we will be working with $X_{k-1}$, so the power and the factorial involve $(k - 2)$. The case $k = 1$ works out automatically with the usual implicit understanding of the functional form.



Consider a general threshold $0 < \tau < 1$ (such that $t = -\ln \tau$), not just $\tau = 10^{-6}$:
\begin{align}
E\bigl[ X_N \bigr] &= \sum_{k = 1}^{\infty} E\bigl[ X_N ~\big|~ N = k \bigr] \cdot P(N = k) \\
&= \sum_{k = 1}^{\infty} E\Bigl[ X_{k-1} \cdot U_k ~\Big|~ X_{k-1} > \tau ~\&~ U_k < \tfrac{ \tau }{ X_{k-1} } \Bigr] \cdot P( N_t = k - 1) \\
&= \sum_{k = 1}^{\infty} \left\{ \int_{x = \tau}^1 \int_{u = 0}^{ \tau/x } x u \cdot \frac{1}{ (k-2)! } (-\log x)^{k-2} \cdot \mathbb{1}_{0<u<1} \, \mathrm{d}u \, \mathrm{d}x \right\} \\
&= \sum_{k = 1}^{\infty} \left\{ \int_{x = \tau}^1 x \cdot \frac{1}{ (k-2)! } (-\log x)^{k-2} \left[ \int_{u = 0}^{ \tau/x } u \, \mathrm{d}u \right] \mathrm{d}x \right\}
\end{align}

In other words, the conditional expectation is in fact the unconditional expectation against the density $f_{X_{k-1},\, U_k}$ (which is a direct product of $f_{X_{k-1}}$ and $f_{U_k}$, since they are independent) over the region $\Omega: \{ (x,u) ~|~ \tau < x < 1, \, 0 < u < \frac{\tau}{x} \}$. The probability $P(N_t = k-1)$ is exactly the mass over the region $\Omega$ and thus "cancels with the conditioning mass".



(Of course, one can arrive at the double-integral expression directly and skip the whole "conditioning"; it is, after all, just a decomposition into the disjoint events $\{N = k\}$.)



The inner integral is trivial and we have
\begin{align}
E\bigl[ X_N \bigr] &= \sum_{k = 1}^{\infty} \int_{x = \tau}^1 x \cdot \frac{1}{ (k-2)! } (-\log x)^{k-2} \cdot \frac12 \Bigl( \frac{\tau}{x} \Bigr)^2 \, \mathrm{d}x \\
&= \frac{ \tau^2 }2 \sum_{k = 1}^{\infty} \int_{x = \tau}^1 \frac1x \frac{1}{ (k-2)! } (-\log x)^{k-2} \, \mathrm{d}x \\
&= \frac{ \tau^2 }2 \sum_{k = 1}^{\infty} \frac{ (-\ln \tau)^{k-1} }{ (k-1)! } \\
&= \frac{ \tau^2 }2 e^{ -\ln \tau } \\
&= \frac{ \tau }2
\end{align}

This nice result is not a coincidence. In fact, it is easy to prove that $X_N$ has a uniform distribution over $[0, \tau]$.



Again, we decompose $X_N$ over the disjoint events $\{ N = k \}$. For $0 < s < \tau$, consider the CDF as the probability:



\begin{align}
\Pr\bigl\{ X_N < \mathbf{\color{magenta}s} \bigr\} &= \sum_{k = 1}^{\infty} \left\{ \int_{x = \tau}^1 \int_{u = 0}^{ \mathbf{\color{magenta}s}/x } \frac{1}{ (k-2)! } (-\log x)^{k-2} \cdot \mathbb{1}_{0<u<1} \, \mathrm{d}u \, \mathrm{d}x \right\} \\
&= \sum_{k = 1}^{\infty} \left\{ \int_{x = \tau}^1 \frac{1}{ (k-2)! } (-\log x)^{k-2} \left[ \int_{u = 0}^{ \mathbf{\color{magenta}s}/x } 1 \, \mathrm{d}u \right] \mathrm{d}x \right\} \\
&= \mathbf{\color{magenta}s} \sum_{k = 1}^{\infty} \int_{x = \tau}^1 \frac{1}{ (k-2)! } (-\log x)^{k-2} \cdot \frac1x \, \mathrm{d}x \\
&= \mathbf{\color{magenta}s} \sum_{k = 1}^{\infty} \frac{ (-\ln \tau)^{k-1} }{ (k-1)! } \\
&= \mathbf{\color{magenta}s} \cdot e^{ -\ln \tau } \\
&= \frac{ \mathbf{\color{magenta}s} }{ \tau } \qquad \text{with an implicit}~~ \mathbb{1}_{0 < \mathbf{\color{magenta}s} < \tau}
\end{align}

That is, the CDF $F_{X_N}(s) = s / \tau$ is linear and the density is constant over $[0, \tau]$: the uniform distribution, as we know it.
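
The uniformity can also be eyeballed numerically: if $X_N \sim U[0, \tau]$, the empirical CDF at $s$ should be close to $s/\tau$ (sketch; reuses the same sampler as above):

    import numpy as np

    rng = np.random.default_rng(3)
    tau = 1e-6

    def sample_X_N(rng, tau):
        x = 1.0
        while x >= tau:
            x *= rng.random()
        return x

    xs = np.array([sample_X_N(rng, tau) for _ in range(50_000)])

    # Empirical CDF of X_N at a few multiples of tau vs. the uniform CDF s/tau.
    for frac in (0.25, 0.50, 0.75):
        print(f"F_hat({frac:.2f} tau) = {np.mean(xs < frac * tau):.4f}"
              f"  (uniform predicts {frac:.2f})")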






answered Oct 26 '18 at 20:38 · edited Oct 28 '18 at 9:49 by Lee David Chung Lin













• Wow, that was an amazing answer, that clarified a lot of my doubts!!! – ManikSin, Oct 31 '18 at 14:56










