Markov Chain


What Does Markov Chain Mean?

A Markov chain is a mathematical process that transitions from one state to another within a finite set of possible states. It is a collection of states and transition probabilities for a variable, where the variable's future state depends only on its current state, not on the sequence of states that preceded it.
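To make the definition concrete, here is a minimal sketch in Python (the two-state weather example and all names in it are illustrative assumptions, not from the article) showing that the next state is sampled using only the current state:

```python
import random

# Illustrative two-state chain: tomorrow's weather depends only on today's.
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current):
    """Sample the next state using only the current state (the Markov property)."""
    outcomes = list(transitions[current].keys())
    weights = list(transitions[current].values())
    return random.choices(outcomes, weights=weights)[0]

state = "sunny"
for _ in range(5):
    state = next_state(state)
    print(state)
```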


A Markov chain is also known as a discrete-time Markov chain (DTMC) or a Markov process.

Techopedia Explains Markov Chain

Markov chains are primarily used to predict the future state of a variable or system based on its current state. They apply probabilities, rather than deterministic rules, to predict the next state. Markov chains are commonly visualized as directed graphs, in which nodes represent the possible states and edge weights give the probability of transitioning from one state to another.
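As a sketch of this predictive use, the same kind of two-state chain can be written as a transition matrix, and the distribution over states several steps ahead computed by repeated multiplication (the matrix values here are illustrative assumptions):

```python
import numpy as np

# P[i][j] = probability of moving from state i to state j
# (rows: from sunny, from rainy; values are illustrative).
P = np.array([[0.8, 0.2],
              [0.4, 0.6]])

dist = np.array([1.0, 0.0])  # starting distribution: certainly sunny

# Predict the distribution over states k steps ahead.
for k in range(1, 4):
    dist = dist @ P
    print(f"after {k} steps: sunny={dist[0]:.3f}, rainy={dist[1]:.3f}")
```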

Markov chains have several applications in computing and Internet technologies. For example, the PageRank algorithm used by Google Search models web browsing as a Markov chain to calculate the rank of a particular web page. Markov chains are also used to predict a user's next action on a website based on that user's previous interactions with it.
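The PageRank connection can be sketched with a toy power iteration over a three-page web. This is a simplified illustration (the link structure and damping factor are assumptions for the example), not Google's production algorithm:

```python
import numpy as np

links = {0: [1, 2], 1: [2], 2: [0]}  # page -> pages it links to (toy example)
n, damping = 3, 0.85

# Column-stochastic matrix: a random surfer follows an outgoing link
# uniformly at random, so M[target, page] = 1 / (number of outlinks of page).
M = np.zeros((n, n))
for page, outs in links.items():
    for target in outs:
        M[target, page] = 1.0 / len(outs)

rank = np.full(n, 1.0 / n)
for _ in range(50):  # power iteration toward the chain's stationary distribution
    rank = (1 - damping) / n + damping * (M @ rank)

print(rank)  # long-run visit probabilities = PageRank scores
```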



Margaret Rouse
Technology Expert

Margaret is an award-winning technical writer and teacher known for her ability to explain complex technical subjects to a non-technical business audience. Over the past twenty years, her IT definitions have been published by Que in an encyclopedia of technology terms and cited in articles by the New York Times, Time Magazine, USA Today, ZDNet, PC Magazine, and Discovery Magazine. She joined Techopedia in 2011. Margaret's idea of a fun day is helping IT and business professionals learn to speak each other’s highly specialized languages.