Markov Chain - Is state 2 periodic?

In summary, the thread discusses a Markov chain with a given transition probability matrix and asks whether the chain is irreducible, whether its states are periodic, and whether it is ergodic. The conclusion is that the chain is irreducible and ergodic, with no periodic states.
  • #1
mathmari
Hey! :eek:

Given the Markov chain $\{X_n, n \geq 1\}$ and the following probability transition matrix:
$\begin{pmatrix}
0 & 1/3 & 2/3\\
1/4 & 3/4 & 0\\
2/5 & 0 & 3/5
\end{pmatrix}$

All states communicate, so the chain is irreducible, isn't it?

Could you tell me if the state $2$ is periodic?
 
  • #2
mathmari said:
All states communicate, so the chain is irreducible, isn't it?

Could you tell me if state $2$ is periodic?

Yes, the chain is irreducible!... Since $P_{2,2} \ne 0$, a return to state 2 is already possible in a single step, so the greatest common divisor of the possible return times is 1 and state 2 is aperiodic. Moreover, because the chain is irreducible, periodicity is a class property, so in fact none of the states of this transition matrix is periodic... Kind regards $\chi$ $\sigma$
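
For anyone who wants to check this numerically, here is a small Python/NumPy sketch (an illustration, not part of the original exchange; index 1 below stands for state 2 because the array is 0-based). Every $n$-step return probability to state 2 is positive, so the gcd of the possible return times is 1:

```python
# A sketch: print the n-step return probabilities (P^n)_{2,2} for the first
# few n. They are all positive, so the gcd of the return times is 1 and
# state 2 is aperiodic.
import numpy as np

P = np.array([[0,   1/3, 2/3],
              [1/4, 3/4, 0  ],
              [2/5, 0,   3/5]])

for n in range(1, 7):
    Pn = np.linalg.matrix_power(P, n)
    print(n, Pn[1, 1])   # n = 1 already gives 0.75 > 0
```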
 
  • #3
chisigma said:
Yes, the chain is irreducible!... state 2 is aperiodic, and in fact none of the states of this transition matrix is periodic...

Ok! And is the chain ergodic?
 
  • #4
mathmari said:
Ok! And is the chain ergodic?

The MC is finite and irreducible, so every state is positive recurrent; since all of its states are also aperiodic, the chain is ergodic...
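
A short sketch (again an illustration, not from the thread) that computes the stationary distribution and checks that the rows of $P^n$ converge to it, which is what ergodicity gives for a finite, irreducible, aperiodic chain:

```python
# A sketch: solve pi P = pi with sum(pi) = 1 for the stationary distribution,
# then check that every row of a high power of P is close to pi.
import numpy as np

P = np.array([[0,   1/3, 2/3],
              [1/4, 3/4, 0  ],
              [2/5, 0,   3/5]])

# Stack the balance equations (P^T - I) pi = 0 with the normalisation sum(pi) = 1.
A = np.vstack([P.T - np.eye(3), np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)                              # approximately [0.25, 0.333, 0.417]

Pn = np.linalg.matrix_power(P, 100)
print(np.allclose(Pn, pi))             # True: every row of P^100 is close to pi
```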

Kind regards

$\chi$ $\sigma$
 
  • #5
chisigma said:
The MC is finite and irreducible, so every state is positive recurrent; since all of its states are also aperiodic, the chain is ergodic...

Ok! Thank you for your answer! :eek:
 

Related to Markov Chain - Is state 2 periodic?

1. Is state 2 always periodic in a Markov Chain?

No, state 2 is not always periodic in a Markov Chain. It depends on the transition probabilities and the structure of the chain.

2. How do you determine if state 2 is periodic in a Markov Chain?

You can determine whether state 2 is periodic from the transition matrix: collect the step counts $n$ for which the $n$-step return probability $(P^n)_{2,2}$ is positive and take their greatest common divisor. If that gcd is greater than 1, state 2 is periodic; if it equals 1, state 2 is aperiodic, as in the sketch below.
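
For example, a small helper along these lines (a hypothetical `period` function written for this answer, not a library routine) computes the gcd of the possible return lengths directly from the matrix:

```python
# A sketch: the period of a state is the gcd of the lengths n for which the
# n-step return probability is positive. max_steps bounds the search; for a
# finite chain, checking up to (number of states)^2 steps is enough in practice.
import numpy as np
from math import gcd

def period(P, state, max_steps=None):
    n_states = P.shape[0]
    if max_steps is None:
        max_steps = n_states * n_states
    d = 0
    Pn = np.eye(n_states)
    for n in range(1, max_steps + 1):
        Pn = Pn @ P
        if Pn[state, state] > 0:
            d = gcd(d, n)          # gcd(0, n) == n for the first return length
            if d == 1:
                break              # period 1: the state is aperiodic
    return d                       # 0 means no return was found within max_steps

P = np.array([[0,   1/3, 2/3],
              [1/4, 3/4, 0  ],
              [2/5, 0,   3/5]])
print(period(P, 1))                # state 2 (0-based index 1) -> 1, aperiodic
```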

3. What does it mean for a state to be periodic in a Markov Chain?

A periodic state in a Markov Chain is one that can return to itself only at multiples of some integer $d > 1$, called its period (the greatest common divisor of the possible return times). This makes the chain cycle through groups of states instead of settling into a steady pattern of visits.

4. Can a Markov Chain have more than one periodic state?

Yes, a Markov Chain can have several periodic states: every state whose possible return times have a greatest common divisor larger than 1 is periodic. In an irreducible chain the period is a class property, so either all states are periodic with the same period or none are.

5. What is the difference between a periodic and an aperiodic state in a Markov Chain?

A periodic state can return to itself only at multiples of its period $d > 1$, while an aperiodic state has period 1, so returns are possible at every sufficiently large number of steps. For an irreducible chain, aperiodicity is what guarantees that the $n$-step distribution settles down to the stationary distribution; periodic states instead make the chain cycle.
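
To see the difference numerically, compare powers of a deterministically flipping two-state matrix (an illustrative example, not taken from the thread above) with powers of the aperiodic matrix from the thread (a sketch):

```python
# A sketch: a two-state chain that always switches states has period 2, so its
# matrix powers oscillate between the identity and the flip matrix; the 3-state
# matrix from the thread is aperiodic, so its powers converge row by row.
import numpy as np

flip = np.array([[0, 1],
                 [1, 0]])                  # returns are possible only at even steps
print(np.linalg.matrix_power(flip, 4))     # identity matrix
print(np.linalg.matrix_power(flip, 5))     # the flip matrix again

P = np.array([[0,   1/3, 2/3],
              [1/4, 3/4, 0  ],
              [2/5, 0,   3/5]])
print(np.linalg.matrix_power(P, 100))      # all three rows are (nearly) identical
```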
