Markov chain - how to navigate the transition matrix?
Let $X_0,X_1,\dots$ be a Markov chain with transition matrix
$$P=\begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ p & 1-p & 0 \end{pmatrix}$$
for $0<p<1$. Let $g$ be the function defined by
$$g(x)=\begin{cases} 0 & \text{if } x=1, \\ 1 & \text{if } x=2,3. \end{cases}$$
Let $Y_n=g(X_n)$ for $n\geq 0$. Show that $Y_0,Y_1,\dots$ is not a Markov chain.
My attempt:
I want to show that
$$\mathbb{P}(Y_n=j\mid Y_0=y_0,\dots,Y_{n-1}=i)=\mathbb{P}(Y_n=j\mid Y_{n-1}=i)$$
does not hold. Substituting $g(X_i)$ for $Y_i$, I get
\begin{align}
\mathbb{P}(Y_n=j\mid Y_0=y_0,\dots,Y_{n-1}=i)&=\mathbb{P}(g(X_n)=j\mid g(X_0)=x_0,\dots,g(X_{n-1})=i)\\
&=\dots?
\end{align}
How do I know which states to substitute for my $X_i$'s? I'm pretty sure I should use $P$ for this, but I have no idea how.
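To make the setup concrete, here is a minimal Python sketch of the chain and the lumped process (illustrative only: the dictionary encoding of $P$, the helper names, and the value p = 0.5 are arbitrary choices, not part of the problem):

```python
# Minimal sketch: simulate X_n and the lumped process Y_n = g(X_n).
# The value p = 0.5 is an arbitrary illustrative choice.
import random

p = 0.5
# Row x of P as a list of (next_state, probability) pairs.
P = {
    1: [(2, 1.0)],
    2: [(3, 1.0)],
    3: [(1, p), (2, 1.0 - p)],
}

def g(x):
    return 0 if x == 1 else 1

def step(x):
    """Draw the next state from row x of P."""
    r, acc = random.random(), 0.0
    for state, prob in P[x]:
        acc += prob
        if r < acc:
            return state
    return P[x][-1][0]  # guard against floating-point rounding

def sample_path(x0, n):
    """Return (X_0..X_n, Y_0..Y_n) starting from x0."""
    xs = [x0]
    for _ in range(n):
        xs.append(step(xs[-1]))
    return xs, [g(x) for x in xs]
```

For instance, `sample_path(1, 5)` always begins $1,2,3,\dots$, so the lumped path always begins $0,1,1,\dots$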
markov-chains conditional-probability
I think you have a typo in your definition of $g(x)$, since both options contain $x=1$.
– pwerth, Dec 24 '18 at 21:38
Thanks, editing!
– Parseval, Dec 24 '18 at 21:39
You want to show it is not a Markov chain, so you just need to find some $y_0,\ldots,y_{n-2}$ and $i$ and $j$ such that the Markov condition fails to hold.
– angryavian, Dec 24 '18 at 21:45
$\mathbb{P}(Y_n=j\mid Y_0=y_0,\dots,Y_{n-1}=i)=\mathbb{P}(Y_n=j\mid Y_{n-1}=i)$ is known as the Markov property and is the defining property of Markov chains. So you want to show that this doesn't hold; in particular, you'll want to find a counterexample.
– pwerth, Dec 24 '18 at 21:45
@pwerth I forgot to add the text "does not hold" below the Markov property, sorry about that. Oh, so I can simply choose any states I like such that the property fails to hold?
– Parseval, Dec 24 '18 at 21:49
1 Answer
Consider $\mathbb{P}(Y_2=1\mid Y_1=1,Y_0=0)$. Given that $Y_0=0$, we must have $X_0=1$, from which the only possibility is $X_1=2$ and $X_2=3$. Therefore $\mathbb{P}(Y_2=1\mid Y_1=1,Y_0=0)=1$.
Now consider $\mathbb{P}(Y_2=1\mid Y_1=1)$. If $X_1=3$, then $Y_1=1$, but $\mathbb{P}(X_2=1\mid X_1=3)=p$, and if $X_2=1$ then $Y_2=0$. Therefore this probability is not $1$: there is a nonzero probability that $Y_2$ equals $0$.
Since $\mathbb{P}(Y_2=1\mid Y_1=1,Y_0=0)\neq\mathbb{P}(Y_2=1\mid Y_1=1)$, the chain is not Markov.
The intuitive reason $Y_n$ is not a Markov chain is that probabilities of its values can depend on knowledge of several prior states, whereas the Markov property requires that they depend only on the previous state.
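To see this numerically, here is a short Monte Carlo sketch (assuming p = 0.5 and a uniform initial distribution over {1, 2, 3}; both are arbitrary choices, since the argument above works for any $0<p<1$):

```python
# Monte Carlo check of the counterexample (illustrative sketch;
# p = 0.5 and the uniform start are arbitrary modelling choices).
import random

p = 0.5

def step(x):
    # Rows of P: 1 -> 2, 2 -> 3, 3 -> 1 w.p. p else 2.
    if x == 1:
        return 2
    if x == 2:
        return 3
    return 1 if random.random() < p else 2

def g(x):
    return 0 if x == 1 else 1

n_full = k_full = n_short = k_short = 0
for _ in range(100_000):
    x0 = random.choice([1, 2, 3])    # arbitrary initial distribution
    x1 = step(x0)
    x2 = step(x1)
    y0, y1, y2 = g(x0), g(x1), g(x2)
    if y1 == 1 and y0 == 0:          # condition on {Y_1 = 1, Y_0 = 0}
        n_full += 1
        k_full += (y2 == 1)
    if y1 == 1:                      # condition on {Y_1 = 1} only
        n_short += 1
        k_short += (y2 == 1)

print("P(Y2=1 | Y1=1, Y0=0) ~", k_full / n_full)    # should be exactly 1
print("P(Y2=1 | Y1=1)       ~", k_short / n_short)  # visibly below 1
```

With these choices the first estimate is exactly $1$ in every run, while the second settles near $0.8$, confirming that additionally conditioning on $Y_0=0$ changes the conditional law of $Y_2$.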
Great explanation!
– Parseval, Dec 24 '18 at 22:40
Thank you! Glad it helped.
– pwerth, Dec 24 '18 at 22:47
May I ask, why did you only go to $Y_2$ and not $Y_{16}$ or so? Did you just use the lowest possible index for convenience?
– Parseval, Dec 24 '18 at 22:47
Yes, I simply used the lowest possible index for clarity. But if you found a counterexample going up to $Y_{16}$, that would still be perfectly valid.
– pwerth, Dec 24 '18 at 22:52
From your question: "Let $Y_n = g(X_n)$, for $n\geq 0$."
– pwerth, Dec 24 '18 at 22:55