# Markov chain exercises: path probabilities and the running maximum

1. Let $X_0, X_1, \dots$ be a Markov chain with state space $\{0, 1, 2\}$ and transition matrix
   $$P = \begin{pmatrix} 0.1 & 0.2 & 0.7 \\ 0.9 & 0.1 & 0.0 \\ 0.1 & 0.8 & 0.1 \end{pmatrix}.$$
   Assume the initial distribution is $\mu_0 = (0.3, 0.4, 0.3)$. Find $P(X_0 = 0, X_1 = 1, X_2 = 2)$ and $P(X_0 = 0, X_1 = 1, X_2 = 1)$.

2. Let $Y_1, Y_2, \dots$ be a sequence of iid observations such that $P(Y = 0) = 0.1$, $P(Y = 1) = 0.3$, $P(Y = 2) = 0.2$, $P(Y = 3) = 0.4$. Let $X_0 = 0$ and let $X_n = \max\{Y_1, \dots, Y_n\}$. Show that $X_0, X_1, \dots$ is a Markov chain and find its transition matrix.
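Both parts can be checked numerically. For part 1, the chain rule for Markov chains gives $P(X_0 = x_0, X_1 = x_1, X_2 = x_2) = \mu_0(x_0)\,P(x_0, x_1)\,P(x_1, x_2)$. For part 2, since $X_{n+1} = \max(X_n, Y_{n+1})$ depends only on $X_n$ and the fresh observation $Y_{n+1}$, the transition probabilities are $Q(i, i) = P(Y \le i)$, $Q(i, j) = P(Y = j)$ for $j > i$, and $Q(i, j) = 0$ for $j < i$. The sketch below (function names `path_prob` and the matrix name `Q` are my own, not from the source) computes both:

```python
import numpy as np

# --- Part 1: path probabilities via the Markov chain rule ---
mu0 = np.array([0.3, 0.4, 0.3])
P = np.array([[0.1, 0.2, 0.7],
              [0.9, 0.1, 0.0],
              [0.1, 0.8, 0.1]])

def path_prob(path):
    """P(X_0 = path[0], ..., X_n = path[n]) = mu0(x0) * prod_k P[x_k, x_{k+1}]."""
    p = mu0[path[0]]
    for a, b in zip(path, path[1:]):
        p *= P[a, b]
    return p

print(path_prob([0, 1, 2]))  # 0.3 * 0.2 * 0.0 = 0
print(path_prob([0, 1, 1]))  # 0.3 * 0.2 * 0.1 = 0.006

# --- Part 2: transition matrix of the running maximum X_n = max(Y_1..Y_n) ---
pY = np.array([0.1, 0.3, 0.2, 0.4])   # P(Y = 0), ..., P(Y = 3)
Q = np.zeros((4, 4))
for i in range(4):
    Q[i, i] = pY[:i + 1].sum()        # stay at i: Y <= i
    Q[i, i + 1:] = pY[i + 1:]         # move up to j > i: Y = j
print(Q)
```

Each row of `Q` sums to 1, and state 3 is absorbing ($Q(3,3) = 1$), which matches the intuition that the running maximum never decreases.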