Weak

Theorem (Markov property) Let $(X_n)_{n \ge 0}$ be a Markov chain with initial distribution $\lambda$ and transition matrix $P$. Then, conditional on $X_m = i$, the process $(X_{m+n})_{n \ge 0}$ is a Markov chain with initial distribution $\delta_i$ and transition matrix $P$, and is independent of the random variables $X_0, X_1, \dots, X_m$.
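To see the statement in action, here is a small simulation sketch. It is not from these notes: the three-state transition matrix $P$ below is made up purely for illustration. Conditional on $X_2 = 1$, the empirical law of $X_3$ should match row $p_{1\cdot}$ of $P$, whatever the earlier value $X_1$ was.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-state chain, chosen only for illustration.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])
lam = np.array([0.2, 0.5, 0.3])  # initial distribution

def sample_path(length):
    """Sample one path X_0, ..., X_{length-1} of the chain."""
    x = [rng.choice(3, p=lam)]
    for _ in range(length - 1):
        x.append(rng.choice(3, p=P[x[-1]]))
    return x

paths = [sample_path(4) for _ in range(50_000)]

# Conditional on X_2 = 1, the law of X_3 should not depend on the past (here, on X_1).
for past in (0, 2):
    post = [p[3] for p in paths if p[2] == 1 and p[1] == past]
    freq = np.bincount(post, minlength=3) / len(post)
    print(f"X_1 = {past}: empirical law of X_3 given X_2 = 1 ~ {freq.round(3)}")

print("row p_1. of P:", P[1])
```

Both printed frequency vectors should be close to $(0.1, 0.6, 0.3)$, the row of $P$ corresponding to state $1$.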

Proof We need to show that if $A$ is an event determined by $X_0, X_1, \dots, X_m$, then

$$\mathbb{P}(\{X_m = i_m, X_{m+1} = i_{m+1}, \dots, X_{m+n} = i_{m+n}\} \cap A \mid X_m = i) = \delta_{i i_m}\, p_{i_m i_{m+1}} \cdots p_{i_{m+n-1} i_{m+n}}\, \mathbb{P}(A \mid X_m = i).$$

Using theorem 1.1.1, the product on the right-hand side is exactly the path probability of a chain started in $i$ with transition matrix $P$:

$$\delta_{i i_m}\, p_{i_m i_{m+1}} \cdots p_{i_{m+n-1} i_{m+n}} = \mathbb{P}_i(X_0 = i_m, X_1 = i_{m+1}, \dots, X_n = i_{m+n}),$$

where $\mathbb{P}_i$ denotes the law of a Markov chain with initial distribution $\delta_i$ and transition matrix $P$.

So that, once the identity is proved, conditional on $X_m = i$ the process $(X_{m+n})_{n \ge 0}$ has the path probabilities of a chain with initial distribution $\delta_i$ and transition matrix $P$, and, since the identity holds for every event $A$ determined by $X_0, \dots, X_m$, it is also independent of those random variables.

To prove the first identity, note that the events of a partition generate a sigma-algebra (see Sigma algebra from partition), so that we can always write an event of that sigma-algebra as a (finite or countable) disjoint union of events $A_k$ of the partition. In our case the partition is the collection of paths of length $m$: the events $\{X_0 = i_0, X_1 = i_1, \dots, X_m = i_m\}$ for all possible sequences of states $i_0, i_1, \dots, i_m$.

It suffices to prove the identity for such events. Applying the definition of conditional probability one gets

$$\mathbb{P}(\{X_m = i_m, \dots, X_{m+n} = i_{m+n}\} \cap \{X_0 = i_0, \dots, X_m = i_m\} \mid X_m = i) = \delta_{i i_m}\, \frac{\mathbb{P}(X_0 = i_0, \dots, X_{m+n} = i_{m+n})}{\mathbb{P}(X_m = i)} = \delta_{i i_m}\, p_{i_m i_{m+1}} \cdots p_{i_{m+n-1} i_{m+n}}\, \frac{\mathbb{P}(X_0 = i_0, \dots, X_m = i_m)}{\mathbb{P}(X_m = i)},$$

where the last equality uses theorem 1.1.1 to factorize both path probabilities.

we can see that the right part is precisely the definition of the conditional probability of the event $\{X_0 = i_0, \dots, X_m = i_m\}$ given the event $\{X_m = i\}$, so the identity holds for elementary events.
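For completeness, here is the step that takes us from elementary events back to a general event (it is only asserted above, not written out): a general event $A$ determined by $X_0, \dots, X_m$ is a countable disjoint union $A = \bigcup_k A_k$ of such elementary events, so summing the identity over $k$ gives

$$\mathbb{P}(\{X_m = i_m, \dots, X_{m+n} = i_{m+n}\} \cap A \mid X_m = i) = \sum_k \mathbb{P}(\{X_m = i_m, \dots, X_{m+n} = i_{m+n}\} \cap A_k \mid X_m = i) = \delta_{i i_m}\, p_{i_m i_{m+1}} \cdots p_{i_{m+n-1} i_{m+n}}\, \mathbb{P}(A \mid X_m = i).$$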

Strong

Theorem (Strong Markov property) Let $(X_n)_{n \ge 0}$ be a Markov chain with initial distribution $\lambda$ and transition matrix $P$, and let $T$ be a stopping time of $(X_n)_{n \ge 0}$. Then, conditional on $T < \infty$ and $X_T = i$, the process $(X_{T+n})_{n \ge 0}$ is a Markov chain with initial distribution $\delta_i$ and transition matrix $P$, and is independent of the random variables $X_0, X_1, \dots, X_T$.
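A standard example, not spelled out in these notes: the first passage time to a state $j$,

$$T_j = \inf\{n \ge 0 : X_n = j\},$$

is a stopping time, since $\{T_j = n\} = \{X_0 \ne j, \dots, X_{n-1} \ne j, X_n = j\}$ is determined by $X_0, \dots, X_n$. By contrast, the time of the last visit to $j$ is not a stopping time, because deciding whether it has happened requires looking at the future of the chain.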

Proof Let $B$ be any event depending on $X_0, X_1, \dots, X_T$. Then, $B \cap \{T = m\}$ is an event determined by $X_0, \dots, X_m$. Also, since we are conditioning w.r.t. $\{T < \infty\}$, the events $\{T = m\}$, $m = 0, 1, 2, \dots$, partition the event space. Let’s use, as the “history” from time $T$ to time $T + n$, the event $\{X_T = j_0, X_{T+1} = j_1, \dots, X_{T+n} = j_n\}$. We need to prove:

$$\mathbb{P}(\{X_T = j_0, X_{T+1} = j_1, \dots, X_{T+n} = j_n\} \cap B \mid T < \infty, X_T = i) = \delta_{i j_0}\, p_{j_0 j_1} \cdots p_{j_{n-1} j_n}\, \mathbb{P}(B \mid T < \infty, X_T = i);$$

then the fact that, conditional on $T < \infty$ and $X_T = i$, the process $(X_{T+n})_{n \ge 0}$ is a Markov chain with initial distribution $\delta_i$ and transition matrix $P$ follows from the characterization of a chain through its path probabilities (theorem 1.1.1). Using the definition of conditional probability, we can see that we just need to prove

$$\mathbb{P}(\{X_T = j_0, \dots, X_{T+n} = j_n\} \cap B \cap \{T < \infty, X_T = i\}) = \delta_{i j_0}\, p_{j_0 j_1} \cdots p_{j_{n-1} j_n}\, \mathbb{P}(B \cap \{T < \infty, X_T = i\}).$$

now we use our previous observations (and countable additivity) to write the probability of the first event as a sum over the possible values of $T$:

$$\mathbb{P}(\{X_T = j_0, \dots, X_{T+n} = j_n\} \cap B \cap \{T < \infty, X_T = i\}) = \sum_{m=0}^{\infty} \mathbb{P}(\{X_m = j_0, \dots, X_{m+n} = j_n\} \cap B \cap \{T = m, X_m = i\}).$$

now using the weak Markov property at the deterministic time $m$ (the event $B \cap \{T = m\} \cap \{X_m = i\}$ is determined by $X_0, \dots, X_m$), each term of the sum satisfies

$$\mathbb{P}(\{X_m = j_0, \dots, X_{m+n} = j_n\} \cap B \cap \{T = m, X_m = i\}) = \delta_{i j_0}\, p_{j_0 j_1} \cdots p_{j_{n-1} j_n}\, \mathbb{P}(B \cap \{T = m, X_m = i\}).$$
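Summing these identities over $m$ and dividing by $\mathbb{P}(T < \infty, X_T = i)$, a step left implicit above, gives back exactly the conditional identity we set out to prove:

$$\mathbb{P}(\{X_T = j_0, \dots, X_{T+n} = j_n\} \cap B \mid T < \infty, X_T = i) = \delta_{i j_0}\, p_{j_0 j_1} \cdots p_{j_{n-1} j_n}\, \mathbb{P}(B \mid T < \infty, X_T = i).$$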

Proof 2 Let’s use the notation $H$ for the “history” $\{X_0 = i_0, X_1 = i_1, \dots, X_{T-1} = i_{T-1}\}$ up to time $T$ of our DTMC, and pick two states $i, j$. Then

$$\mathbb{P}(X_{T+1} = j \mid X_T = i, H, T < \infty) = \frac{\mathbb{P}(X_{T+1} = j, X_T = i, H, T < \infty)}{\mathbb{P}(X_T = i, H, T < \infty)},$$

where we have used the definition of conditional probability. Since we are working with an a.s. finite stopping time, using the law of total probability over the values of $T$ we get

$$\mathbb{P}(X_{T+1} = j, X_T = i, H, T < \infty) = \sum_{m=0}^{\infty} \mathbb{P}(X_{T+1} = j, X_T = i, H, T = m).$$

we can substitute $X_{T+1}$ with $X_{m+1}$ (and $X_T$ with $X_m$), since we are taking the intersection with the event $\{T = m\}$, and use the regular Markov property! (The event $H \cap \{T = m\} \cap \{X_m = i\}$ is determined by $X_0, \dots, X_m$.)

$$\sum_{m=0}^{\infty} \mathbb{P}(X_{m+1} = j, X_m = i, H, T = m) = \sum_{m=0}^{\infty} p_{ij}\, \mathbb{P}(X_m = i, H, T = m) = p_{ij} \sum_{m=0}^{\infty} \mathbb{P}(X_m = i, H, T = m);$$

now use the definition of conditional probability again:

$$\mathbb{P}(X_{T+1} = j \mid X_T = i, H, T < \infty) = p_{ij} \sum_{m=0}^{\infty} \frac{\mathbb{P}(X_m = i, H, T = m)}{\mathbb{P}(X_T = i, H, T < \infty)} = p_{ij} \sum_{m=0}^{\infty} \mathbb{P}(T = m \mid X_T = i, H, T < \infty) = p_{ij},$$

where the sum is equal to one since $T$ is a stopping time, so it is completely determined by the history up to time $T$ (one term is $1$ and the others are $0$).
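As with the weak property, the statement can be checked numerically. The sketch below is not from these notes (it reuses the made-up transition matrix $P$ from the earlier sketch) and takes $T$ to be the first passage time to state $2$: conditional on $T < \infty$ and $X_T = 2$, the empirical law of $X_{T+1}$ should be row $p_{2\cdot}$ of $P$, regardless of the pre-$T$ history (here, of the starting state).

```python
import numpy as np

rng = np.random.default_rng(1)

# Same hypothetical 3-state transition matrix as in the earlier sketch.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

def sample_path(length, start):
    """Sample one path X_0, ..., X_{length-1} started at `start`."""
    x = [start]
    for _ in range(length - 1):
        x.append(rng.choice(3, p=P[x[-1]]))
    return x

# T = first passage time to state 2 (a stopping time).
post = {0: [], 1: []}  # observed values of X_{T+1}, keyed by the starting state X_0
for start in (0, 1):
    for _ in range(10_000):
        path = sample_path(15, start)
        if 2 in path:                 # condition on T < infinity (within the horizon)
            T = path.index(2)
            if T + 1 < len(path):
                post[start].append(path[T + 1])

# Conditional on T < infinity and X_T = 2, the law of X_{T+1} should be row p_2. of P,
# independently of the history before T.
for start in (0, 1):
    freq = np.bincount(post[start], minlength=3) / len(post[start])
    print(f"X_0 = {start}: empirical law of X_(T+1) ~ {freq.round(3)}")

print("row p_2. of P:", P[2])
```

Both printed frequency vectors should be close to $(0.4, 0.4, 0.2)$, the row of $P$ corresponding to state $2$.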