# The transition matrix of a Markov chain is: [ .96 .04 ] [ .27 .73 ] If it starts in state 2, what is the probability that it will be in state 2 after 2 transitions?

The transition matrix of a Markov chain is:

[ .96 .04 ]

[ .27 .73 ]

If it starts in state 2, what is the probability that it will be in state 2 after 2 transitions?

### 1 Answer

The transition matrix is `P=([.96,.04],[.27,.73])`, and we are asked to find the probability that a chain starting in state 2 is again in state 2 after 2 transitions.

The entry `p_(i,j)` is the probability of moving from state i to state j in one transition. To find the probabilities after m transitions, we look at the entries of `P^m`.

`P^2=([.9324,.0676],[.4563,.5437])` .

The probability of starting in state 2 and ending in state 2 after 2 transitions is the entry `p_(2,2)` of `P^2`: `p_(2,2) = (.27)(.04)+(.73)(.73) = .0108+.5329 = .5437`.
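As a quick check, the matrix squaring above can be sketched in plain Python (no external libraries assumed; the helper `mat_mul` is just an illustrative name):

```python
# Sketch: verify the two-step probability by squaring the 2x2
# transition matrix with ordinary list arithmetic.

def mat_mul(a, b):
    """Multiply two 2x2 matrices given as lists of rows."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

P = [[0.96, 0.04],
     [0.27, 0.73]]

P2 = mat_mul(P, P)  # P^2: transition probabilities after 2 steps

# Entry (2,2) in 1-based notation is P2[1][1] in 0-based Python indexing.
print(round(P2[1][1], 4))  # 0.5437
```

The same idea extends to any number of transitions by repeated multiplication.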

-----------------------------------------------------------------

**The probability is approximately .5437.**

-----------------------------------------------------------------