Def A Markov chain is said to admit a *limiting distribution* if the following conditions are satisfied:
1. the limits
$$\pi_j = \lim_{n \to \infty} P^n_{ij}$$
exist for all $i, j \in S$, and
2. they form a probability distribution on $S$, i.e.
$$\sum_{j \in S} \pi_j = 1.$$
Remark If condition 1 is satisfied and the state space $S$ is finite, then condition 2 is always satisfied (in this case we can exchange the limit with the summation).
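The remark can be checked numerically: for a finite chain, each row of $P^n$ always sums to 1, so if the entries converge, the limits sum to 1 as well. A minimal sketch with a made-up 3-state transition matrix (the particular entries are chosen only for illustration):

```python
# Sketch: for a finite-state chain, every row of P^n is a probability
# distribution, so if the limits in condition 1 exist, they automatically
# form a probability distribution (condition 2).
# The 3x3 transition matrix below is a hypothetical example.

def mat_mul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P = [
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
]

Pn = P
for _ in range(100):  # compute a high power of P to approximate the limit
    Pn = mat_mul(Pn, P)

for row in Pn:
    # each row of P^n still sums to 1, so the limiting row does too
    assert abs(sum(row) - 1.0) < 1e-9
```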
Example We can compute the limiting distribution of the two-state MC with transition matrix
$$P = \begin{pmatrix} 1-a & a \\ b & 1-b \end{pmatrix}, \qquad a, b \in [0,1].$$
We need to compute $P^n$; to do so we can diagonalize it. The eigenvalues are $1$ and $1-a-b$, and we get
$$P^n = \frac{1}{a+b}\begin{pmatrix} b & a \\ b & a \end{pmatrix} + \frac{(1-a-b)^n}{a+b}\begin{pmatrix} a & -a \\ -b & b \end{pmatrix}.$$
If $a + b = 0$ then $a = b = 0$ and $P$ is the identity: we have a trivial MC. Under the hypothesis $0 < a + b < 2$ we have $|1-a-b| < 1$, so the limit exists:
$$\lim_{n \to \infty} P^n = \frac{1}{a+b}\begin{pmatrix} b & a \\ b & a \end{pmatrix},$$
which also shows that the limiting probability distribution $\pi = \left(\frac{b}{a+b}, \frac{a}{a+b}\right)$ doesn't depend on the starting state $i$.
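The closed-form limit above can be verified by raising $P$ to a high power and comparing against $\left(\frac{b}{a+b}, \frac{a}{a+b}\right)$. A minimal sketch (the values $a = 0.3$, $b = 0.5$ are chosen only for illustration):

```python
# Two-state chain with P = [[1-a, a], [b, 1-b]] and 0 < a + b < 2.
# The limiting distribution should be (b/(a+b), a/(a+b)),
# the same for both starting states.
a, b = 0.3, 0.5  # hypothetical parameter values

P = [[1 - a, a], [b, 1 - b]]

def mat_mul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

Pn = P
for _ in range(200):  # P^201; the (1-a-b)^n term is negligible by now
    Pn = mat_mul(Pn, P)

pi = (b / (a + b), a / (a + b))  # closed-form limit from the diagonalization
for i in range(2):
    for j in range(2):
        # both rows of P^n agree with pi: the limit ignores the start state
        assert abs(Pn[i][j] - pi[j]) < 1e-9
```

Since $|1-a-b| = 0.2$ here, the second term of the diagonalization decays geometrically and the convergence is very fast.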