Homework Help

cbromm | (Level 1) Honors

Posted July 30, 2013 at 3:42 PM via web

The transition matrix of a Markov chain is:

         [ .96  .04 ]

         [ .27  .73 ]

If it starts in state 2, what is the probability that it will be in state 2 after 2 transitions?

1 Answer

embizze | High School Teacher | (Level 1) Educator Emeritus

Posted July 30, 2013 at 6:38 PM (Answer #1)

The transition matrix is `P=([.96,.04],[.27,.73])`, and we are asked for the probability that a chain starting in state 2 is again in state 2 after 2 transitions.

The entry `p_(i,j)` is the probability of moving from state i to state j in one step. The probabilities after m transitions are the entries of `P^m`.

`P^2=([.9324,.0676],[.4563,.5437])`.

The probability of starting in state 2 and ending in state 2 after 2 transitions is the entry `p_(2,2)` of `P^2`, which is .5437.

-----------------------------------------------------------------

The probability is approximately .5437

---------------------------------------------------------------
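The squaring of the transition matrix can be checked numerically. A minimal sketch in plain Python (no libraries assumed; rows index the current state, columns the next state, and `p_(2,2)` is index `[1][1]` with 0-based indexing):

```python
# Two-step transition probabilities for the given Markov chain.
P = [[0.96, 0.04],
     [0.27, 0.73]]

def mat_mul(A, B):
    """Multiply two compatible matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

P2 = mat_mul(P, P)          # P^2: probabilities after 2 transitions
print(round(P2[1][1], 4))   # entry p_(2,2): start in state 2, end in state 2
```

Printing `P2` row by row also reproduces the full matrix `([.9324,.0676],[.4563,.5437])` from the answer above.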
