Markov Chain

What Does Markov Chain Mean?

A Markov chain is a mathematical model of a process that transitions from one state to another within a finite set of possible states. It consists of a collection of states and the probabilities of moving between them, where the probability of the next state depends only on the current state, not on the sequence of states that came before it.
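
As a minimal sketch of this idea, the Python snippet below simulates a hypothetical two-state weather chain. The transition probabilities are invented for illustration; the key point is that the next state is chosen using only the current state.

```python
import random

# One-step transition probabilities for a toy weather model
# (hypothetical numbers chosen purely for illustration).
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(state):
    """Pick the next state using only the current state (the Markov property)."""
    states, probs = zip(*TRANSITIONS[state].items())
    return random.choices(states, weights=probs)[0]

state = "sunny"
for _ in range(5):
    state = next_state(state)
    print(state)
```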


A Markov chain is also known as a discrete-time Markov chain (DTMC) or, more generally, a Markov process.

Techopedia Explains Markov Chain

Markov chains are primarily used to predict the future state of a variable or system based on its current state, applying probabilistic methods to determine the next state. They are commonly represented as directed graphs, where each node is a possible state and each edge is labeled with the probability of transitioning from one state to another.
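
To illustrate the prediction step, the sketch below encodes the same kind of two-state model as a transition matrix and computes the probability distribution over states several steps ahead. The numbers are hypothetical.

```python
import numpy as np

# Row i holds the probabilities of moving from state i to each
# other state, so every row sums to 1 (hypothetical values).
P = np.array([
    [0.8, 0.2],   # from "sunny"
    [0.4, 0.6],   # from "rainy"
])

# Starting distribution: certainly "sunny" today.
v = np.array([1.0, 0.0])

# The distribution k steps ahead is v multiplied by P, k times.
for k in range(1, 4):
    v = v @ P
    print(f"step {k}: sunny={v[0]:.3f}, rainy={v[1]:.3f}")
```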

Markov chains have several applications in computing and Internet technologies. For example, the PageRank® algorithm used by Google Search models the Web as a Markov chain in order to calculate the rank of a particular Web page. Markov chains are also used to predict user behavior on a website based on users' previous interactions with it.
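
As a rough illustration of the idea (not Google's actual implementation), the following sketch runs power iteration on a tiny, made-up link graph: a "random surfer" follows outgoing links as a Markov chain, and the stationary distribution of that chain gives PageRank-style scores.

```python
import numpy as np

# Toy link graph: page -> pages it links to (hypothetical).
links = {0: [1, 2], 1: [2], 2: [0]}
n = 3
damping = 0.85  # standard damping factor from the original PageRank paper

# Column-stochastic matrix: M[j, i] is the probability of moving
# from page i to page j by following a random outgoing link.
M = np.zeros((n, n))
for i, outs in links.items():
    for j in outs:
        M[j, i] = 1.0 / len(outs)

# Power iteration on the random-surfer Markov chain.
rank = np.full(n, 1.0 / n)
for _ in range(50):
    rank = (1 - damping) / n + damping * (M @ rank)

print(rank)  # approximates the stationary distribution, i.e. the scores
```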


