Understanding the definition of ergodicity through examples
I am taking a course on Communication Systems (from an engineering point of view). While I'm usually very interested in the formal mathematics, this time I would like to avoid it, since I don't have a good background on probability theory; also, many of my colleagues have little to no interest in formal mathematics at all, and I would like to understand this concept in a way that would be easy to propagate to them.
So, apparently, to understand the meaning of ergodicity one needs to know what the ensemble average and the time average of a random process are. After reading this answer on Math.SE and three related entries on Wikipedia (Ergodic Process, Ergodicity, Stationary ergodic process), this is what I understand:
A random process is like a random variable, but its outcomes are "waveforms" (a.k.a. functions) instead of numbers.
The ensemble average is the average of the outcomes of the random process, and is therefore itself another function (waveform). A given random process has one ensemble average (one function).
As opposed to the ensemble average, a random process can have many (possibly infinitely many) time averages, since every outcome of the random process (i.e., every waveform) has its own time average, which is the average value of that waveform. That is, given an outcome $x(t)$, its time average is given by
$$\lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x(t)\,dt$$
After reading those Wikipedia pages, it seems that an ergodic process is a process that satisfies "the time average is equal to the ensemble average". But which time average? To my understanding there are many time averages. Which one? All of them? Their mean? At least one of them? Or what?
Also, I would like to take a closer look at the following examples, found in the linked Wikipedia pages:
Example 1.
Suppose that we have two coins: one coin is fair and the other has two heads. We choose (at random) one of the coins, and then perform a sequence of independent tosses of our selected coin. Let X[n] denote the outcome of the nth toss, with 1 for heads and 0 for tails. Then the ensemble average is ½ (½ + 1) = ¾; yet the long-term average is ½ for the fair coin and 1 for the two-headed coin. Hence, this random process is not ergodic in mean.
Is this correct? (Of course, my doubt here is a consequence of the fact that I don't know the answer to my bold question above.)
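To make this example concrete for myself (and my colleagues), I put together a small simulation sketch. It is my own illustration, not taken from the Wikipedia article, and the array sizes are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
n_realizations, n_tosses = 10_000, 1_000

# For each realization, first pick a coin: fair (P(heads) = 1/2) or two-headed (P(heads) = 1).
p_heads = np.where(rng.random(n_realizations) < 0.5, 0.5, 1.0)
# Then toss the chosen coin repeatedly; X[n] = 1 for heads, 0 for tails.
X = rng.random((n_realizations, n_tosses)) < p_heads[:, None]

time_avgs = X.mean(axis=1)                  # one time average per realization
print(X[:, 0].mean())                       # ensemble average at a fixed instant n: close to 3/4
print(time_avgs[p_heads == 0.5].mean())     # realizations with the fair coin: close to 1/2
print(time_avgs[p_heads == 1.0].mean())     # realizations with the two-headed coin: exactly 1
```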
Example 2.
Ergodicity is where the ensemble average equals the time average. Each resistor has thermal noise associated with it and it depends on the temperature. Take N resistors (N should be very large) and plot the voltage across those resistors for a long period. For each resistor you will have a waveform. Calculate the average value of that waveform. This gives you the time average. You should also note that you have N waveforms as we have N resistors. These N plots are known as an ensemble. Now take a particular instant of time in all those plots and find the average value of the voltage. That gives you the ensemble average for each plot. If both ensemble average and time average are the same then it is ergodic.
It says "take a particular instant of time in all those plots". Does any instant works? Or rather, do I have to take "all of them" (one at a time)? Taking only one instant of time doesn't seem right... Rather, shouldn't I be taking some sort of limit to infinity? Also, it refers to "the time average" as if it was only one, but to my understanding there are N different time averages here, since there are N waveforms.
probability-theory statistics stochastic-processes average
asked Oct 28 '16 at 12:14 by Pedro A; edited Apr 13 '17 at 12:21 by Community♦
1 Answer
Your question is intuitive, so I will try to answer through one very intuitive example.
Example:
Dynamics:
You have some money, say $100, and we're playing a game with a fair coin. Each time the coin lands heads (H) your money is multiplied by 1.5, and each time it lands tails (T) your money is multiplied by 0.6.
Averages:
Let $W(t)$ denote your wealth at time $t$. Let this process run for a finite time $T$ and take the average of your wealth over that time. This is the finite time-average:
$\left\langle W(t)\right\rangle_{T} = \frac{1}{T}\sum_{t=0}^{T}W(t)$
In contrast, assume that we have $N$ of these processes, let them run until time $t$, and then take the average over the $N$ of them. This is the finite ensemble-average of the $N$ observable wealth processes:
$\left\langle W(t)\right\rangle_{N} = \frac{1}{N}\sum_{i=1}^{N}W_i(t)$
where $i$ denotes the $i$-th of the $N$ processes and the average is taken at time $t$. Letting $T\rightarrow\infty$ gives the time average, and letting $N\rightarrow\infty$ gives the ensemble average (the expectation operator).
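In code, these two finite averages look roughly like this (a minimal sketch of my own; the ensemble size, horizon, and starting wealth are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, W0 = 5_000, 100, 100.0   # ensemble size, number of tosses, starting wealth (all arbitrary)

# factors[i, t] is the multiplier applied to process i at toss t: 1.5 for heads, 0.6 for tails.
factors = np.where(rng.random((N, T)) < 0.5, 1.5, 0.6)
W = W0 * np.cumprod(factors, axis=1)        # W[i, t]: wealth of process i after t+1 tosses

finite_time_avg = W[0].mean()           # <W(t)>_T : one realization averaged over its own history
finite_ensemble_avg = W[:, -1].mean()   # <W(t)>_N : average across realizations at the final time
print(finite_time_avg, finite_ensemble_avg)
```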
Now that we have introduced the relevant dynamics (the coin game) and the averages, let's restate your definition of ergodicity, namely that the time average equals the ensemble average.
1) You are correct in asserting that each of the $N$ coin-tossing sequences has a time-average. However, in the limit, all of those $N$ time averages are equal, exactly because they are governed by the same dynamic. They will all converge to the absorbing boundary (zero) with probability one. If you find that unintuitive, try to simulate a bunch of those processes and plot their distribution. What you will get is a log-normal with diverging moments: e.g., there will be a one-in-a-million path whose wealth grows exponentially, while the bulk of the ensemble will be close to zero.
2) Is the above wealth-generating process ergodic? No. The ensemble average will predict unbounded positive growth$^*$ while the time-average will converge to zero.$^{**}$ This is commonly known as the St. Petersburg paradox. Unrelated to your question, but both interesting and important: it is possible to create an ergodic observable using the logarithm, which resolves the St. Petersburg paradox.
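A minimal numerical check of the two growth rates derived in the footnotes below (my own sketch; the ensemble size and the number of tosses are arbitrary, and a very large ensemble is needed before the ensemble average behaves):

```python
import numpy as np

rng = np.random.default_rng(7)
N, T = 1_000_000, 50        # ensemble size and number of tosses (both arbitrary)

W = np.ones(N)              # relative wealth of each process, starting from 1
for _ in range(T):
    W *= np.where(rng.random(N) < 0.5, 1.5, 0.6)

print(W.mean() ** (1 / T))       # per-toss growth of the ensemble average: close to 1.05
print(np.median(W) ** (1 / T))   # per-toss growth of a typical path: close to sqrt(0.9) ~ 0.949
```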
Hope you can use this. If you want to see the formal proofs and simulations, I can recommend the video and article linked below.
$^{*}$ Ensemble average: $\frac12\times0.6+\frac12\times1.5=1.05$, a number larger than one, reflecting positive growth of the ensemble.
$^{**}$ Time average: $x(t)=r_1^{n_1}r_2^{n_2}$, where $r_1$ and $r_2$ are the two rates and $n_1$ and $n_2$ are the frequencies with which the wealth process is subjected to those rates. In the limit $t\rightarrow\infty$, $x(t)^{1/t} \rightarrow (r_1 r_2)^{1/2} = \sqrt{0.9}\approx 0.95$, a number less than one, i.e. decay in the long-time limit.
Video
Article
answered Nov 11 '16 at 14:21 by tmo; edited Dec 20 '18 at 8:08
Thanks, but I think you're making some sort of mistake here. First. Quoting you: "for a given dynamic or process all realisations will have the same time average in the limit". I still think this is wrong. Clearly we can fabricate random processes that will converge to different values in different outcomes. I simulated your own coin toss example 1000 times, each with 20000 coin tosses, and computed those 1000 time averages, getting results as low as 0.008333105152477038 and as high as 42.99348110477372.
– Pedro A, Nov 22 '16 at 16:24
Second. Quoting you: "The ensemble average will predict infinite positive growth" - my simulation does not confirm this. Rather, the ensemble average from the 10000th coin toss onwards yields 0 (the limit of computer precision was reached), so it is as far from infinity as you could get.
– Pedro A, Nov 22 '16 at 16:26
Oh, and the St. Petersburg paradox is very interesting, thanks for showing it to me; but I don't really see how it is related, because it can be seen simply as a random event, not a process. I.e., every time you attempt the game, you have one specific result; even if you count each coin toss as a 'step', the game still ends at a specific moment (as soon as you get tails), while in a random process (such as your example) we choose when to stop looking (as I did in my simulations, choosing to make 20000 coin tosses).
– Pedro A, Nov 22 '16 at 16:38
First: I can't seem to find the sentence "for a given dynamic or process all realisations will have the same time average in the limit"? You are right in saying that we can fabricate dynamical processes where the time-average is different. To be more precise, for this specific purely multiplicative dynamic, all the N processes will converge to the absorbing boundary (zero) with probability one in the limit. This is easy to show analytically.
– tmo, Nov 23 '16 at 9:01
Second: Your simulation is misleading you a little bit; it is too small. It is easy to show analytically that the ensemble average will predict unbounded growth: simply take the expectation of the specified dynamics and you will obtain a growth rate larger than one. Importantly, this prediction is wrong for an individual trajectory. That is a result of non-ergodicity, i.e. the time-average does not equal the ensemble average. If you're keen on letting simulation guide you, which I think is a good idea, increase the ensemble (I did it with a million) and plot the distribution as it changes in time.
– tmo, Nov 23 '16 at 9:06