A Gentle Introduction to Markov Chains

Swagata Ashwani
3 min read · Jun 5, 2022



What are Markov chains?

A Markov chain is a stochastic process, defined as a collection of random variables that transition from one state to another according to certain probabilistic rules.

What is the Markov Property?

This set of transitions satisfies the Markov property, which states that the probability of transitioning to any particular state depends solely on the current state and the time elapsed, not on the sequence of states that preceded it. This unique characteristic renders Markov processes memoryless.
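Written out in the same notation used in the next section, for a chain X0, X1, X2, ... the Markov property says:

Pr(Xn+1 = x | Xn = xn, Xn-1 = xn-1, ..., X0 = x0) = Pr(Xn+1 = x | Xn = xn)

Once the current state Xn is known, the earlier states carry no additional information about where the chain goes next.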

Markov Model

We can think of it as a sequence of directed graphs, where the edges of graph n are labeled with the probabilities of going from one state at time n to the other states at time n+1, Pr(Xn+1 = x | Xn = xn), i.e., the probability of being in state x at time n+1 given that the state at time n is xn. The same information is represented by the transition matrix from time n to time n+1. Every state in the state space is included once as a row and again as a column, and each cell in the matrix tells you the probability of transitioning from its row’s state to its column’s state.

If the Markov chain has N possible states, the transition matrix will be an N x N matrix, such that entry (i, j) is the probability of transitioning from state i to state j.
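As a minimal sketch of this idea (assuming NumPy and a made-up two-state chain), the transition matrix can be stored as an N x N array whose rows index the current state and whose columns index the next state:

```python
import numpy as np

# Hypothetical 2-state chain; states are indexed 0..N-1.
states = ["A", "B"]
P = np.array([
    [0.7, 0.3],   # from state 0: Pr(stay in 0), Pr(move to 1)
    [0.4, 0.6],   # from state 1: Pr(move to 0), Pr(stay in 1)
])

# Entry (i, j) is the probability of transitioning from state i to state j.
print(P[0, 1])   # Pr(A -> B) = 0.3

# Each row is a probability distribution, so it must sum to exactly 1.
assert np.allclose(P.sum(axis=1), 1.0)
```

The assert makes the "each row sums to 1" requirement explicit, which is an easy check to forget when filling in a matrix by hand.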

Let’s understand this using an example:

Let’s say we have a kid who eats only sandwiches, pizza, or noodles.

From historical data, if he had pizza on one day, then the next day there is a 60% chance he will have noodles, a 20% chance he will have a sandwich, and a 20% chance he will have pizza again.

These transitions are shown in the diagram below —

The Markov chain depicted in the state diagram has 3 possible states: pizza, sandwich, and noodles, so the transition matrix will be a 3 x 3 matrix. Notice that the probabilities on the arrows exiting a state always sum to exactly 1; similarly, the entries in each row of the transition matrix must add up to exactly 1, since each row represents a probability distribution. In the transition matrix, the cells do the same job that the arrows do in the state diagram.
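Here is a rough sketch of this example in Python (using NumPy). The pizza row comes from the numbers in the article; the sandwich and noodles rows are made-up values, chosen only so that every row sums to 1:

```python
import numpy as np

# State order: 0 = pizza, 1 = sandwich, 2 = noodles
foods = ["pizza", "sandwich", "noodles"]

P = np.array([
    [0.2, 0.2, 0.6],  # after pizza: 20% pizza, 20% sandwich, 60% noodles (from the article)
    [0.3, 0.3, 0.4],  # after sandwich: assumed values for illustration
    [0.5, 0.3, 0.2],  # after noodles: assumed values for illustration
])
assert np.allclose(P.sum(axis=1), 1.0)  # every row is a probability distribution

# If the kid eats pizza today, the distribution over tomorrow's meal is the
# pizza row; the day after is obtained by multiplying by P again.
today = np.array([1.0, 0.0, 0.0])  # 100% pizza today
tomorrow = today @ P
day_after = tomorrow @ P
print(dict(zip(foods, tomorrow.round(2))))    # {'pizza': 0.2, 'sandwich': 0.2, 'noodles': 0.6}
print(dict(zip(foods, day_after.round(2))))
```

Repeated multiplication by P is how the chain is pushed forward in time: the n-th matrix power gives the probabilities of each meal n days from now.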

Now that you have worked through the example, you should have an idea of the basic concepts behind a Markov chain.


Written by Swagata Ashwani

I love talking Data! Data Scientist with a passion for finding optimized solutions in the AI space. Follow me here — https://www.linkedin.com/in/swagata-ashwani/
