A Markov chain has the transition probability matrix `P=([0,.42,.58],[0,1,0],[.54,0,.46])` . Find `P^(2)` .

Expert Answers
embizze, eNotes Educator | Certified Educator

Given the transition matrix `P=([0,.42,.58],[0,1,0],[.54,0,.46])`, we are asked to find `P^2`, the transition matrix after 2 repetitions of the experiment.
Multiplying `P` by itself (by hand or with technology) gives:

`P^2=([.3132,.42,.2668],[0,1,0],[.2484,.2268,.5248])`
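The matrix product above can be checked quickly with NumPy; a minimal sketch, assuming NumPy is available:

```python
import numpy as np

# One-step transition matrix from the problem
P = np.array([[0.00, 0.42, 0.58],
              [0.00, 1.00, 0.00],
              [0.54, 0.00, 0.46]])

# Two-step transition matrix is the ordinary matrix product P * P
P2 = P @ P

# Round for display; e.g. the (3,2) entry is .54*.42 = .2268
print(np.round(P2, 4))
```

Each entry of `P2` is a sum over intermediate states, which is exactly what the matrix product computes.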

The entry in row `i`, column `j` of this matrix gives the probability of transitioning from state `i` to state `j` after 2 repetitions of the experiment.

Thus the probability of transitioning from state 3 to state 2 in one step was 0, but after two steps it is `(.54)(.42)+(0)(1)+(.46)(0)=.2268` .