Theorem. Let $A \ne \emptyset$ be a subset of the finite state space $S$ of an irreducible Markov chain $(X_n)_{n \ge 0}$. Define the stopping time

$$T_A = \min\{n \ge 0 : X_n \in A\}.$$

Then there exist constants $C > 0$ and $\alpha \in (0, 1)$ such that, for every $x \in S$ and every $n \ge 0$,

$$\mathbb{P}_x(T_A > n) \le C \alpha^n;$$

in particular $\mathbb{P}_x(T_A < \infty) = 1$ and $\mathbb{E}_x[T_A] < \infty$.
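As a quick numerical illustration (not part of the statement), the tail of $T_A$ can be estimated by simulation; the $3 \times 3$ matrix `P`, the set $A = \{0\}$ and the helper `hitting_time` below are arbitrary names and choices made only for this sketch. The printed frequencies approximate $\mathbb{P}_2(T_A > n)$ and should decay geometrically in $n$.

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary irreducible chain on S = {0, 1, 2} (illustrative choice).
P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.3, 0.5],
              [0.4, 0.0, 0.6]])
A = {0}  # target set

def hitting_time(x, max_steps=10_000):
    """Return T_A = min{n >= 0 : X_n in A} for a trajectory started at x."""
    n = 0
    while x not in A and n < max_steps:
        x = rng.choice(3, p=P[x])  # one step of the chain
        n += 1
    return n

samples = np.array([hitting_time(2) for _ in range(20_000)])
for n in [1, 2, 4, 8, 16]:
    print(n, np.mean(samples > n))  # empirical P_2(T_A > n)
```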

Proof. Let $x \in S$. Since the chain is irreducible, there are an integer $n(x)$ and states $x_1, \dots, x_{n(x)}$ with $x_{n(x)} \in A$ such that the path $x, x_1, \dots, x_{n(x)}$ uses just $n(x)$ steps and has positive probability:

$$\mathbb{P}_x\bigl(X_1 = x_1, \dots, X_{n(x)} = x_{n(x)}\bigr) = p(x, x_1)\, p(x_1, x_2) \cdots p(x_{n(x)-1}, x_{n(x)}) > 0.$$

For all $x \in S$ let

$$\varepsilon_x = \mathbb{P}_x\bigl(T_A \le n(x)\bigr) \ge p(x, x_1) \cdots p(x_{n(x)-1}, x_{n(x)}) > 0.$$

Let

$$n_0 = \max_{x \in S} n(x), \qquad \varepsilon = \min_{x \in S} \varepsilon_x > 0,$$

both well defined because $S$ is finite, and consider the event that the chain has not entered $A$ within the first $n_0$ steps; call this last event $B = \{T_A > n_0\}$. Then

$$\mathbb{P}_x(B) = \mathbb{P}_x(T_A > n_0) \le \mathbb{P}_x\bigl(T_A > n(x)\bigr) = 1 - \varepsilon_x \le 1 - \varepsilon \qquad \text{for every } x \in S.$$

Using the Markov property at time $n_0$,

$$\mathbb{P}_x(T_A > 2 n_0) = \mathbb{E}_x\bigl[\mathbf{1}_B \, \mathbb{P}_{X_{n_0}}(T_A > n_0)\bigr] \le (1 - \varepsilon)\, \mathbb{P}_x(B) \le (1 - \varepsilon)^2;$$

repeating this with now $3 n_0, 4 n_0, \dots$ in place of $2 n_0$, we get

$$\mathbb{P}_x(T_A > k n_0) \le (1 - \varepsilon)^k \quad (k \ge 1), \qquad \text{hence} \qquad \mathbb{P}_x(T_A > n) \le (1 - \varepsilon)^{\lfloor n / n_0 \rfloor} \le C \alpha^n,$$

where

$$\alpha = (1 - \varepsilon)^{1/n_0} \in (0, 1), \qquad C = \frac{1}{1 - \varepsilon}.$$

Summing the bound gives $\mathbb{E}_x[T_A] = \sum_{n \ge 0} \mathbb{P}_x(T_A > n) < \infty$, and in particular $\mathbb{P}_x(T_A < \infty) = 1$.
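For a concrete finite chain, the constants appearing in the proof can be computed directly. The sketch below uses the same kind of arbitrary illustrative chain as above; the helper `tail` is a hypothetical name. It finds $n(x)$ and $\varepsilon_x$ for each $x \notin A$, forms $n_0$, $\varepsilon$, $\alpha$, $C$, and checks the bound $\mathbb{P}_x(T_A > n) \le C \alpha^n$ against exact tail probabilities obtained from powers of $Q$, the transition matrix restricted to $S \setminus A$.

```python
import numpy as np

# Arbitrary 3-state irreducible chain and target set, chosen only to
# illustrate the constants from the proof.
P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.3, 0.5],
              [0.4, 0.0, 0.6]])
A = [0]
rest = [i for i in range(len(P)) if i not in A]   # S \ A
Q = P[np.ix_(rest, rest)]                         # chain killed on hitting A

def tail(x, n):
    """Exact P_x(T_A > n), computed from powers of the substochastic Q."""
    if x in A:
        return 0.0
    v = np.zeros(len(rest))
    v[rest.index(x)] = 1.0
    return float(v @ np.linalg.matrix_power(Q, n) @ np.ones(len(rest)))

# n(x): smallest horizon with P_x(T_A <= n) > 0; eps_x = P_x(T_A <= n(x)).
n_of = {x: next(n for n in range(1, len(P)) if tail(x, n) < 1) for x in rest}
eps = min(1 - tail(x, n_of[x]) for x in rest)
n0 = max(n_of.values())
alpha, C = (1 - eps) ** (1 / n0), 1 / (1 - eps)

for n in range(1, 15):
    assert max(tail(x, n) for x in rest) <= C * alpha ** n + 1e-12
print(f"n0 = {n0}, eps = {eps:.3f}, alpha = {alpha:.3f}, C = {C:.3f}")
```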

Counterexample. If the state space is not finite, there are counterexamples. Consider the state space $S = \{0, 1, 2, \dots\}$ and let $A = \{0\}$. The transition probabilities are defined as follows:

$$p(x, x+1) = 1 - q_x, \qquad p(x, 0) = q_x \qquad (x \in S),$$

where $(q_x)_{x \ge 0} \subset (0, 1)$ is any sequence with $\sum_x q_x < \infty$, for instance $q_x = 2^{-x-1}$. This chain, whatever its starting point, tends towards $+\infty$ almost surely; however, every vertex in $S$ is connected to $0$ (indeed $p(x, 0) = q_x > 0$), which means that the chain is irreducible. But the expectation of the hitting time $T_A$ is infinite, because the probability of the event $\{T_A = \infty\}$ is not zero.

Clearly, starting from any $x \ge 1$, the probability of never reaching $0$ is the same as the probability of moving away from $0$ at every step:

$$\mathbb{P}_x(T_A = \infty) = \prod_{k \ge x} (1 - q_k) > 0,$$

since

$$\prod_{k \ge 0} (1 - q_k) > 0 \quad\Longleftrightarrow\quad \sum_{k \ge 0} q_k < \infty;$$

in particular $\mathbb{P}_x(T_A < \infty) < 1$, so $\mathbb{E}_x[T_A] = \infty$.
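The escape probability can also be checked by simulation. The sketch below assumes the concrete summable choice $q_x = 2^{-x-1}$ (any sequence with $\sum_x q_x < \infty$ would do); the helper `escapes` is a hypothetical name. It compares the Monte Carlo frequency of trajectories started at $1$ that never return to $0$ with the product $\prod_{k \ge 1}(1 - q_k)$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choice of the jump-back probabilities: q_x = 2^(-x-1),
# so that sum_x q_x < infinity and the escape probability is positive.
def q(x):
    return 2.0 ** (-x - 1)

def escapes(x, horizon=200):
    """Simulate the chain started at x; True if it has not hit 0 within
    `horizon` steps (a proxy for the event {T_A = infinity})."""
    for _ in range(horizon):
        if rng.random() < q(x):
            return False   # jumped back to 0
        x += 1             # moved one step away from 0
    return True

trials = 20_000
estimate = sum(escapes(1) for _ in range(trials)) / trials
exact = float(np.prod([1 - q(k) for k in range(1, 60)]))  # ~ P_1(T_A = inf)
print(f"Monte Carlo estimate: {estimate:.4f}, product formula: {exact:.4f}")
```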