For Markov chains, the past and the future are independent given the present. This symmetry suggests looking at a Markov chain with time running backwards. On the other hand, convergence to equilibrium for ergodic chains is an irreversible process, so for the reversed process to be a chain of the same type we must start at equilibrium.
Theorem. Let $P$ be irreducible and have an invariant distribution $\pi$. Suppose that $(X_n)_{0 \le n \le N}$ is $\mathrm{Markov}(\pi, P)$ and define the reversed chain $Y_n = X_{N-n}$. Then $(Y_n)_{0 \le n \le N}$ is $\mathrm{Markov}(\pi, \hat{P})$, where the new transition matrix $\hat{P} = (\hat{p}_{ij})$ is given by
\[
\pi_j \hat{p}_{ji} = \pi_i p_{ij} \quad \text{for all } i, j,
\]
and $\hat{P}$ is also irreducible with invariant distribution $\pi$.
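Before the proof, the two claims about $\hat{P}$ can be checked numerically. The sketch below (the 3-state matrix is invented purely for illustration) builds $\hat{P}$ from $P$ and $\pi$ via $\hat{p}_{ji} = \pi_i p_{ij} / \pi_j$ and verifies that $\hat{P}$ is stochastic with invariant distribution $\pi$:

```python
import numpy as np

# Illustrative irreducible 3-state transition matrix (rows sum to 1).
P = np.array([[0.1, 0.6, 0.3],
              [0.4, 0.2, 0.4],
              [0.5, 0.3, 0.2]])

# Invariant distribution pi: normalized left eigenvector of P
# for the Perron eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi = pi / pi.sum()

# Reversed-chain matrix: P_hat[j, i] = pi[i] * P[i, j] / pi[j].
P_hat = (pi[None, :] * P.T) / pi[:, None]

print(np.allclose(P_hat.sum(axis=1), 1))  # P_hat is stochastic: True
print(np.allclose(pi @ P_hat, pi))        # pi invariant for P_hat: True
```

The row sums equal $1$ precisely because $\pi$ is invariant for $P$, and $\pi \hat{P} = \pi$ because $P$ is stochastic, mirroring the two computations in the proof below.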
Proof. First we check that $\hat{P}$ is a stochastic matrix:
\[
\sum_i \hat{p}_{ji} = \sum_i \frac{\pi_i}{\pi_j} \, p_{ij} = \frac{\pi_j}{\pi_j} = 1,
\]
since $\pi$ is invariant for $P$.
Next we check that $\pi$ is invariant for $\hat{P}$:
\[
\sum_j \pi_j \hat{p}_{ji} = \sum_j \pi_i p_{ij} = \pi_i,
\]
since $P$ is a stochastic matrix.
To prove that $(Y_n)_{0 \le n \le N}$ is $\mathrm{Markov}(\pi, \hat{P})$ we use the following characterization of Markov chains: $(X_n)_{0 \le n \le N}$ is $\mathrm{Markov}(\lambda, P)$ if and only if
\[
\mathbb{P}(X_0 = i_0, X_1 = i_1, \ldots, X_N = i_N) = \lambda_{i_0} p_{i_0 i_1} \cdots p_{i_{N-1} i_N}
\]
for all states $i_0, \ldots, i_N$.
Using the definition of $Y_n$,
\[
\mathbb{P}(Y_0 = i_0, \ldots, Y_N = i_N) = \mathbb{P}(X_0 = i_N, \ldots, X_N = i_0) = \pi_{i_N} p_{i_N i_{N-1}} \cdots p_{i_1 i_0} = \pi_{i_0} \hat{p}_{i_0 i_1} \cdots \hat{p}_{i_{N-1} i_N},
\]
where the last equality follows by applying the identity $\pi_i p_{ij} = \pi_j \hat{p}_{ji}$ to each factor in turn,
so that $(Y_n)_{0 \le n \le N}$ is $\mathrm{Markov}(\pi, \hat{P})$. Irreducibility of $\hat{P}$ follows by reversing a given path: if $p_{i_0 i_1} \cdots p_{i_{n-1} i_n} > 0$, then $\hat{p}_{i_n i_{n-1}} \cdots \hat{p}_{i_1 i_0} > 0$, so any two states that communicate under $P$ also communicate under $\hat{P}$.
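The chain of equalities above can also be verified by brute force on a small example: for every path $(i_0, \ldots, i_N)$, the probability that $X$ traces it backwards equals the probability that $Y$ traces it forwards. A minimal sketch (the 3-state matrix is invented for illustration):

```python
import numpy as np
from itertools import product

# Illustrative irreducible 3-state chain, as before.
P = np.array([[0.1, 0.6, 0.3],
              [0.4, 0.2, 0.4],
              [0.5, 0.3, 0.2]])
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi = pi / pi.sum()
P_hat = (pi[None, :] * P.T) / pi[:, None]

N = 3
for path in product(range(3), repeat=N + 1):
    # pi_{i_N} p_{i_N i_{N-1}} ... p_{i_1 i_0}: X runs the path backwards.
    fwd = pi[path[-1]] * np.prod([P[path[k + 1], path[k]] for k in range(N)])
    # pi_{i_0} phat_{i_0 i_1} ... phat_{i_{N-1} i_N}: Y runs it forwards.
    rev = pi[path[0]] * np.prod([P_hat[path[k], path[k + 1]] for k in range(N)])
    assert np.isclose(fwd, rev)
print("all path probabilities match")  # prints: all path probabilities match
```

Each assertion holds because the identity $\pi_i p_{ij} = \pi_j \hat{p}_{ji}$ converts the factors one at a time, exactly as in the proof.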