Question details

The following is the transition probability matrix of a Markov chain with states 1, 2, 3, 4:

$$P = \begin{pmatrix}
0.4 & 0.3 & 0.2 & 0.1 \\
0.2 & 0.2 & 0.2 & 0.4 \\
0.25 & 0.25 & 0.5 & 0 \\
0.2 & 0.1 & 0.4 & 0.3
\end{pmatrix}$$

If $X_0 = 1$,

(a) find the probability that state 3 is entered before state 4;

(b) find the mean number of transitions until either state 3 or state 4 is entered.
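
Both parts can be set up by standard first-step analysis on the transient states $\{1, 2\}$, since the question only concerns the chain up to its first visit to $\{3, 4\}$: conditioning on the first transition gives a small linear system $(I - Q)h = r$ for part (a) and $(I - Q)m = \mathbf{1}$ for part (b), where $Q$ holds the transitions among states 1 and 2 and $r$ the one-step probabilities of jumping directly to state 3. Below is a minimal NumPy sketch of that computation; the variable names (`Q`, `R`, `h`, `m`) are illustrative, not from the original question.

```python
import numpy as np

# Transition matrix from the question; states 1..4 map to indices 0..3.
P = np.array([
    [0.40, 0.30, 0.20, 0.10],
    [0.20, 0.20, 0.20, 0.40],
    [0.25, 0.25, 0.50, 0.00],
    [0.20, 0.10, 0.40, 0.30],
])

# Q: transitions among the transient states {1, 2}.
# R: transitions from {1, 2} into the target states {3, 4}.
Q = P[:2, :2]
R = P[:2, 2:]
I = np.eye(2)

# (a) h[i] = P(enter 3 before 4 | X0 = state i+1) solves (I - Q) h = R[:, 0],
#     where R[:, 0] holds the one-step probabilities of jumping straight to 3.
h = np.linalg.solve(I - Q, R[:, 0])

# (b) m[i] = E[number of transitions until {3, 4} is entered | X0 = state i+1]
#     solves (I - Q) m = 1 (one step is taken, plus the expected remainder).
m = np.linalg.solve(I - Q, np.ones(2))

print(f"(a) P(enter 3 before 4 | X0 = 1) = {h[0]:.4f}")
print(f"(b) E[transitions until 3 or 4 | X0 = 1] = {m[0]:.4f}")
```

With the matrix as given, this prints $h_1 \approx 0.5238$ (that is, $11/21$) and $m_1 \approx 2.6190$ (that is, $55/21$); solving the two 2x2 systems by hand yields the same fractions.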
