# Hidden Markov Model Calculation

#### betsyrocamora

##### New member
Can someone help me with this question?

Let us define a hidden Markov model with two states A and B, such that

P(X0=A) = 0.6, P(X0=B) = 0.4, P(X1=A | X0=A) = 0.3, P(X1=B | X0=B) = 0.8.

What is the value of P(X3=A)?


#### MarkFL

Staff member
Hello betsyrocamora,

Welcome to MHB! Can you show what you have tried and where you are stuck so our helpers have a better idea how best to help you?

#### Klaas van Aarsen

##### MHB Seeker
Staff member
Can someone help me with this question?

Let us define a hidden Markov model with two states A and B, such that

P(X0=A) = 0.6, P(X0=B) = 0.4, P(X1=A | X0=A) = 0.3, P(X1=B | X0=B) = 0.8.

What is the value of P(X3=A)?
Welcome to MHB, betsyrocamora! This looks like a question that is intended to teach you what a hidden Markov model actually is.
Do your notes perhaps contain a worked example?
Or perhaps an example for a Markov chain?

#### betsyrocamora

##### New member
Welcome to MHB, betsyrocamora! This looks like a question that is intended to teach you what a hidden Markov model actually is.
Do your notes perhaps contain a worked example?
Or perhaps an example for a Markov chain?
No, I know what a hidden Markov model is, but with this one I am a little lost. I tried letting $j$ denote state A and used $p^{(3)}_{ij} := P(X_3=j\ |\ X_0=i)$, which satisfies the Chapman-Kolmogorov equations, and after that I think it's the law of total probability, but I got lost there haha.

#### Klaas van Aarsen

##### MHB Seeker
Staff member
No, I know what a hidden Markov model is, but with this one I am a little lost. I tried letting $j$ denote state A and used $p^{(3)}_{ij} := P(X_3=j\ |\ X_0=i)$, which satisfies the Chapman-Kolmogorov equations, and after that I think it's the law of total probability, but I got lost there haha.
It seems to me you're making it unnecessarily complex.

From your given data we can deduce that $P(X_1=B\ |\ X_0=A)=0.7$ and $P(X_1=A\ |\ X_0=B)=0.2$.
You can collect these numbers in a transition matrix
$$M = \begin{bmatrix} 0.3 & 0.2 \\ 0.7 & 0.8 \end{bmatrix},$$
where the entry in row $i$, column $j$ is the probability of moving to state $i$ given that we are in state $j$ (so each column sums to 1).
Then, assuming the transition probabilities do not depend on $n$ (a time-homogeneous chain), we get:
$$\begin{bmatrix} P(X_n=A) \\ P(X_n=B) \end{bmatrix} = M^n \begin{bmatrix} P(X_0=A) \\ P(X_0=B) \end{bmatrix}$$
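For concreteness, this matrix computation can be checked numerically. Here is a minimal sketch in Python with NumPy (the matrix entries and initial distribution are taken from the thread; the code itself is just an illustration, not part of the original posts):

```python
import numpy as np

# Transition matrix M, states ordered (A, B).
# M[i, j] = P(X_{n+1} = state i | X_n = state j), so columns sum to 1.
M = np.array([[0.3, 0.2],
              [0.7, 0.8]])

# Initial distribution (P(X0=A), P(X0=B)).
v0 = np.array([0.6, 0.4])

# Distribution after 3 steps: v3 = M^3 v0.
v3 = np.linalg.matrix_power(M, 3) @ v0

print(v3[0])  # P(X3=A), approximately 0.2226
```

Iterating by hand gives the same result: one step yields (0.26, 0.74), two steps (0.226, 0.774), and three steps (0.2226, 0.7774), so $P(X_3=A) = 0.2226$.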