Homework Help



cbromm | Honors

Posted July 29, 2013 at 1:00 AM via web


A Markov chain has the transition probability matrix:

`P=([0,.42,.58],[0,1,0],[.54,0,.46])`

Find `P^2`.

1 Answer


embizze | High School Teacher | (Level 1) Educator Emeritus

Posted July 29, 2013 at 1:48 AM (Answer #1)


Given the transition matrix `P=([0,.42,.58],[0,1,0],[.54,0,.46])`, we are asked to find `P^2`. (This is the transition matrix after 2 repetitions of the experiment.)

`P^2=P*P=([.3132,.42,.2668],[0,1,0],[.2484,.2268,.5248])`

(Just use matrix multiplication or technology.)
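For instance, here is a minimal sketch of that check using NumPy (the code and library choice are mine, not part of the original answer; it only verifies the multiplication above):

```python
import numpy as np

# Transition matrix from the question
P = np.array([[0.00, 0.42, 0.58],
              [0.00, 1.00, 0.00],
              [0.54, 0.00, 0.46]])

# Two-step transition matrix: P squared
P2 = P @ P
print(P2)
# [[0.3132 0.42   0.2668]
#  [0.     1.     0.    ]
#  [0.2484 0.2268 0.5248]]

# Sanity check: each row of a transition matrix must sum to 1
assert np.allclose(P2.sum(axis=1), 1.0)
```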

The entries in this matrix give the probabilities of transitioning from one state to another after 2 repetitions of the experiment.

Thus the probability of transitioning from state 3 to state 2 in a single step was 0, but after two repetitions it is .2268.
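To see where this entry comes from, the (3,2) entry of `P^2` is the dot product of row 3 of `P` with column 2 of `P`:

`(.54)(.42)+(0)(1)+(.46)(0)=.2268`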
