I have a simple Markov chain with two states, A and B. Each day, if the chain is at State A, the probability of staying at State A is 0.6 and of moving to State B is 0.4. If the chain is at State B, the probability of moving to State A is 0.6 and of staying at State B is 0.4. So in the stationary distribution, the chain spends 60% of the time at State A and 40% of the time at State B.
My question is: if the Markov chain starts at State A, then after N days (say N = 100), what is the probability that the chain makes exactly n round trips from A to B and back to A, for n = 1, 2, ..., 50?
Is there a theory for calculating this? Any ideas or references would be appreciated. Thanks a lot.
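To make the question concrete, here is a quick Monte Carlo sketch of what I mean by "n round trips" (this assumes the transition probabilities above and counts one completed A → B → A excursion as a single round trip):

```python
import random
from collections import Counter

# Transition probabilities (rows sum to 1):
#   P(A->A)=0.6, P(A->B)=0.4, P(B->A)=0.6, P(B->B)=0.4
P = {'A': {'A': 0.6, 'B': 0.4}, 'B': {'A': 0.6, 'B': 0.4}}

def count_round_trips(N, rng=random):
    """Simulate N daily steps starting from A; count completed A->B->A trips."""
    state, trips, left_a = 'A', 0, False
    for _ in range(N):
        nxt = 'A' if rng.random() < P[state]['A'] else 'B'
        if state == 'A' and nxt == 'B':
            left_a = True    # an excursion away from A has started
        elif left_a and nxt == 'A':
            trips += 1       # the excursion returned to A: one round trip done
            left_a = False
        state = nxt
    return trips

def estimate_distribution(N=100, runs=100_000, seed=0):
    """Empirical distribution of the round-trip count n over many runs."""
    rng = random.Random(seed)
    counts = Counter(count_round_trips(N, rng) for _ in range(runs))
    return {n: c / runs for n, c in sorted(counts.items())}

if __name__ == '__main__':
    for n, p in estimate_distribution().items():
        print(f"n = {n:2d}: {p:.4f}")
```

The empirical distribution is centered around roughly 24 trips for N = 100 (the chain is at A about 60% of the time and leaves with probability 0.4), but what I am after is a closed-form or recursive expression for these probabilities.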