Discrete time Markov chains

Definition and basic properties

Let $I$ be a countable set. Each $i \in I$ is called a state and $I$ is called the state-space.

We say that $\lambda = (\lambda_i : i \in I)$ is a measure on $I$ if $0 \le \lambda_i < \infty$ for all $i \in I$. If the total mass $\sum_{i \in I} \lambda_i$ equals $1$, then we call $\lambda$ a distribution.
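As a concrete illustration, here is a minimal Python sketch that normalises a finite measure into a distribution by dividing by its total mass; the state-space and weights are made-up examples.

```python
# Hypothetical measure on the state-space I = {0, 1, 2}: lambda_i >= 0.
measure = {0: 2.0, 1: 1.0, 2: 1.0}

total_mass = sum(measure.values())  # sum_i lambda_i = 4.0 here

# Dividing by the total mass yields a distribution (total mass 1).
distribution = {i: w / total_mass for i, w in measure.items()}

print(distribution)  # {0: 0.5, 1: 0.25, 2: 0.25}
assert abs(sum(distribution.values()) - 1.0) < 1e-12
```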

We will work with a probability space $(\Omega, \mathcal{F}, \mathbb{P})$. A random variable $X$ taking values in $I$ is a measurable function $X : \Omega \to I$.

Suppose we set

$$\lambda_i = \mathbb{P}(X = i) \quad \text{for all } i \in I;$$

this defines the distribution of $X$. We think of $X$ as modelling a random state which takes value $i$ with probability $\lambda_i$.

It’s natural to define a matrix $P = (p_{ij} : i, j \in I)$ whose entries $p_{ij}$ correspond to the probability of moving from state $i$ to state $j$. It follows that each row $(p_{ij} : j \in I)$ should be a distribution, i.e. $p_{ij} \ge 0$ and $\sum_{j \in I} p_{ij} = 1$ for all $i \in I$. Such matrices are called stochastic.
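To make this concrete, here is a minimal Python sketch checking the two defining properties of a stochastic matrix (non-negative entries, rows summing to $1$); the $2 \times 2$ matrix is a made-up example.

```python
import numpy as np

# Hypothetical 2-state transition matrix: P[i, j] is the probability
# of moving from state i to state j.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

def is_stochastic(P, tol=1e-12):
    """Check that P is non-negative and each row sums to 1."""
    return bool((P >= 0).all()) and np.allclose(P.sum(axis=1), 1.0, atol=tol)

print(is_stochastic(P))  # True
```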

Def A stochastic process $(X_n)_{n \ge 0}$ is a Markov chain with initial distribution $\lambda$ and transition matrix $P$ if

  1. $X_0$ has distribution $\lambda$;
  2. for $n \ge 0$, conditional on $X_n = i$, $X_{n+1}$ has distribution $(p_{ij} : j \in I)$ and is independent of $X_0, \dots, X_{n-1}$.

Written explicitly, these conditions are

  1. $\mathbb{P}(X_0 = i_0) = \lambda_{i_0}$;
  2. $\mathbb{P}(X_{n+1} = i_{n+1} \mid X_0 = i_0, \dots, X_n = i_n) = p_{i_n i_{n+1}}$ for all $n \ge 0$ and $i_0, \dots, i_{n+1} \in I$.

We say that $(X_n)_{n \ge 0}$ is $\mathrm{Markov}(\lambda, P)$ for short.

DOUBT ???

The professor adds that the states must have positive probability???

My answer: he does not want to condition on events of probability zero!

Meaning: Markov chains are discrete-time stochastic processes with “no memory”. The r.v. $X_n$ is the state at time $n$.
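The following Python sketch simulates a path of such a chain: $X_0$ is drawn from $\lambda$, and each next state is drawn from the row of $P$ indexed by the current state, with no dependence on earlier states. The two-state $\lambda$ and $P$ are made-up examples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-state example.
lam = np.array([0.5, 0.5])            # initial distribution lambda
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])            # P[i, j] = p_ij

def simulate(lam, P, n_steps, rng):
    """Sample a path X_0, ..., X_{n_steps} from Markov(lam, P)."""
    path = [rng.choice(len(lam), p=lam)]           # X_0 ~ lambda
    for _ in range(n_steps):
        i = path[-1]                               # current state
        path.append(rng.choice(len(lam), p=P[i]))  # X_{n+1} ~ (p_ij : j)
    return path

print(simulate(lam, P, 10, rng))
```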

We can give a more comprehensive description of Markov chains, i.e. an equivalent characterization.

Theorem 1.1.1 A discrete-time random process $(X_n)_{0 \le n \le N}$ is $\mathrm{Markov}(\lambda, P)$ if and only if for all $i_0, i_1, \dots, i_N \in I$

$$\mathbb{P}(X_0 = i_0, X_1 = i_1, \dots, X_N = i_N) = \lambda_{i_0} p_{i_0 i_1} p_{i_1 i_2} \cdots p_{i_{N-1} i_N}.$$

Proof Suppose $(X_n)_{0 \le n \le N}$ is $\mathrm{Markov}(\lambda, P)$, then

$$\mathbb{P}(X_0 = i_0, \dots, X_N = i_N) = \mathbb{P}(X_N = i_N \mid X_0 = i_0, \dots, X_{N-1} = i_{N-1}) \, \mathbb{P}(X_0 = i_0, \dots, X_{N-1} = i_{N-1}),$$

which follows from the definition of conditional probability. The first factor is, by condition 2 above, equal to $p_{i_{N-1} i_N}$. By iterating $N$ times we get our thesis.

Conversely, suppose the product formula above holds for $N$. Summing over the last index and using $\sum_{j \in I} p_{ij} = 1$, it’s clear by induction that for each $n = 0, 1, \dots, N$

$$\mathbb{P}(X_0 = i_0, \dots, X_n = i_n) = \lambda_{i_0} p_{i_0 i_1} \cdots p_{i_{n-1} i_n};$$

in particular this implies

$$\mathbb{P}(X_0 = i_0) = \lambda_{i_0}.$$

By computing the conditional probability

$$\mathbb{P}(X_{n+1} = i_{n+1} \mid X_0 = i_0, \dots, X_n = i_n) = \frac{\lambda_{i_0} p_{i_0 i_1} \cdots p_{i_n i_{n+1}}}{\lambda_{i_0} p_{i_0 i_1} \cdots p_{i_{n-1} i_n}} = p_{i_n i_{n+1}},$$

we recover condition 2, i.e. the Markov property. $\square$
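As an informal sanity check of the theorem, the following self-contained Python sketch estimates the probability of a fixed path by Monte Carlo simulation and compares it with the product $\lambda_{i_0} p_{i_0 i_1} \cdots p_{i_{N-1} i_N}$; the two-state $\lambda$, $P$ and the path are made-up examples.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical two-state example.
lam = np.array([0.5, 0.5])
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

path = [0, 0, 1, 1]      # fixed path i_0, i_1, ..., i_N
N = len(path) - 1

# Right-hand side: lambda_{i_0} * p_{i_0 i_1} * ... * p_{i_{N-1} i_N}.
product = lam[path[0]] * np.prod([P[path[k], path[k + 1]] for k in range(N)])

# Left-hand side, estimated by simulating many chains and counting how
# often the first N+1 states coincide with the fixed path.
trials = 200_000
hits = 0
for _ in range(trials):
    x = rng.choice(2, p=lam)
    ok = (x == path[0])
    for k in range(N):
        x = rng.choice(2, p=P[x])
        ok = ok and (x == path[k + 1])
    hits += bool(ok)

print(product, hits / trials)  # the two values should agree up to Monte Carlo error
```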