[Markov chain] reading expected value from the transition matrix

In summary, the conversation discusses a problem involving a transition matrix of a Markov chain and the expected value of reaching a specific state for the first time. The speaker mentions attending a course on stochastic processes but not remembering the solution. Another speaker suggests modifying the matrix and provides a formula for computing the expected value.
  • #1
rahl___
Hello there,

yet another trivial problem:
We have the transition matrix of some Markov chain: [tex]\left[\begin{array}{ccc}e_{11}&...&e_{1n}\\...&...&...\\e_{n1}&...&e_{nn}\end{array}\right][/tex].
At the beginning our chain is in state [tex]e_1[/tex]. Let T be the moment when the chain reaches [tex]e_n[/tex] for the first time. What is the expected value of T?

I attended a 'stochastic processes' course some time ago, but the only thing I remember is that this kind of problem is really easy to compute; I presume there is some simple pattern for it.

thanks for your help,
rahl.
 
  • #2
I don't think there's a closed form you can read straight off the matrix. You can modify the matrix so that the chain remains in state [itex] e_n [/itex] once it gets there (make it absorbing), and compute [tex] E[T] = \sum_{k=1}^\infty k \left( (i M^k)_n - (i M^{k-1})_n \right)[/tex]

where i is the initial state vector (1, 0, ... , 0), M is the modified transition matrix, and the subscript n picks out the last component. The difference in brackets is exactly P(T = k), the probability that the chain reaches [itex] e_n [/itex] for the first time on step k.
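The sum above can be sketched numerically. This is a minimal sketch assuming a concrete, hypothetical 3-state matrix (the values are made up for illustration); the last state is made absorbing, and the infinite sum is truncated once almost all probability mass has been absorbed.

```python
import numpy as np

# Hypothetical 3-state chain; the last state is made absorbing.
M = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.5, 0.3],
    [0.0, 0.0, 1.0],   # once the chain reaches e_n, it stays there
])

i = np.array([1.0, 0.0, 0.0])  # start in state e_1 with certainty

# E[T] = sum_k k * P(T = k), where P(T = k) = (i M^k)_n - (i M^{k-1})_n.
expected_T = 0.0
prev = i.copy()
for k in range(1, 10_000):
    cur = prev @ M
    expected_T += k * (cur[-1] - prev[-1])
    prev = cur
    if 1.0 - cur[-1] < 1e-12:  # almost surely absorbed; tail is negligible
        break

print(expected_T)
```

The truncation is safe here because the unabsorbed mass decays geometrically, so the discarded tail of the sum is tiny once the loop breaks.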
 
  • #3
Hello rahl,

Thank you for your question. The standard way to compute E[T] here is first-step analysis. Let [itex]h_i[/itex] be the expected number of steps to reach [itex]e_n[/itex] starting from state [itex]e_i[/itex], so [itex]h_n = 0[/itex]. Conditioning on the first step gives [tex]h_i = 1 + \sum_{j=1}^{n} e_{ij}\,h_j \quad (i \neq n).[/tex] Equivalently, if Q is the submatrix of the transition matrix with the row and column of [itex]e_n[/itex] deleted, then the vector [itex]h = (h_1, \dots, h_{n-1})[/itex] solves the linear system [itex](I - Q)h = \mathbf{1}[/itex], and the answer is [itex]E[T] = h_1[/itex]. This single small linear solve is probably the simple pattern you remember from the course.

I hope this helps. If you have any further questions, please don't hesitate to ask.

Best,
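The first-step-analysis linear system can be sketched in a few lines. This is a minimal sketch assuming a hypothetical 3-state transition matrix (the values are made up for illustration); it deletes the target state's row and column to get Q and solves (I - Q)h = 1.

```python
import numpy as np

# Hypothetical 3-state transition matrix (rows sum to 1); e_3 is the target.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.5, 0.3],
    [0.1, 0.4, 0.5],
])

n = P.shape[0]
Q = P[:n - 1, :n - 1]   # transitions among the transient states e_1 .. e_{n-1}

# Solve (I - Q) h = 1; h[i] is the expected hitting time of e_n from e_{i+1}.
h = np.linalg.solve(np.eye(n - 1) - Q, np.ones(n - 1))

print(h[0])  # E[T] starting from e_1
```

Note that the target state's own row of P never enters the computation, so it does not matter whether e_n is absorbing in the original chain.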
 

Related to [Markov chain] reading expected value from the transition matrix

What is a Markov Chain?

A Markov chain is a mathematical model that describes the probability of transitioning between a finite set of states over time. It is a type of stochastic process that is used to model random events.

What is a transition matrix?

A transition matrix is a square matrix that represents the probabilities of transitioning from one state to another in a Markov chain. Each row of the matrix represents the current state, while each column represents the next state.

How do you read expected value from a transition matrix?

Multiplying the initial state (row) vector by the transition matrix gives the probability distribution over the states after one step; multiplying by the k-th power of the matrix gives the distribution after k steps. Expected values, such as the expected number of steps to reach a given state, are then obtained either by summing k times the probability that the state is first reached at step k, or more directly by solving a linear system built from the transition matrix.
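The vector-matrix multiplication described above can be sketched as follows, assuming a hypothetical 3-state transition matrix (the values are made up for illustration).

```python
import numpy as np

# Hypothetical 3-state transition matrix; each row sums to 1.
M = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.5, 0.3],
    [0.1, 0.4, 0.5],
])

start = np.array([1.0, 0.0, 0.0])   # chain starts in state 1 with certainty

after_one = start @ M                               # distribution after one step
after_five = start @ np.linalg.matrix_power(M, 5)   # distribution after five steps

print(after_one)   # equals the first row of M, since we started in state 1
```

Because the start vector puts all mass on state 1, the one-step distribution is simply the first row of the matrix; each multiplication preserves the property that the entries sum to 1.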

What is the significance of the expected value in a Markov chain?

Expected values in a Markov chain summarize the average behavior of the process. For example, the expected hitting time of a state tells you how long, on average, the chain takes to reach it, while the stationary distribution describes the long-run fraction of time the chain spends in each state. These quantities can be used to make predictions about the future behavior of the system.

How is the expected value affected by changes in the transition matrix?

Expected values are determined directly by the transition matrix. Increasing the probability of transitioning toward a target state shortens the expected time to reach it, while decreasing that probability lengthens it. Changes in the transition matrix can also alter the long-run behavior of the chain, such as its stationary distribution or whether certain states are reached at all.
