Markov model

From Simple English Wikipedia, the free encyclopedia
A Markov model. The numbers are the probabilities that it moves from one "state" to another. States are letters in this picture.

A Markov model predicts what happens next using only what came immediately before. For example, if it usually snows after it rains, you might guess that it will snow tomorrow if it is raining today. A Markov model is part of probability theory.[1] Mostly, it tries to predict "states" from previous states. Some Markov models include the Markov chain and the Markov decision process.

Markov chain

The Markov chain is a simple Markov model.[1] It has "the Markov property," which means that the next state depends only on the current state, not on the states that came before it.
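The weather example above can be written as a small Markov chain. This is only a sketch: the states and probabilities below are invented for illustration, not taken from the article.

```python
import random

# Invented transition probabilities between weather states.
# Each row must add up to 1.
transitions = {
    "rain": {"rain": 0.3, "snow": 0.6, "sun": 0.1},
    "snow": {"snow": 0.5, "rain": 0.2, "sun": 0.3},
    "sun":  {"sun": 0.7, "rain": 0.2, "snow": 0.1},
}

def next_state(state):
    """Pick the next state using only the current state (the Markov property)."""
    row = transitions[state]
    return random.choices(list(row), weights=list(row.values()))[0]

# Walk the chain for five steps starting from "rain".
state = "rain"
path = [state]
for _ in range(5):
    state = next_state(state)
    path.append(state)
print(path)
```

Because `next_state` looks only at the current state, the chain "forgets" everything that happened earlier, which is exactly the Markov property.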

Hidden Markov model

An example of a hidden Markov model (sometimes called HMM). The hidden states are at the top. The numbers show the probabilities of moving from one state to another and of each state producing an observation.

A hidden Markov model is a type of Markov chain whose state can only be seen partly, or through noise. In other words, the things we can observe are related to the state of the system, but they do not tell us enough to know the state for sure. Hidden Markov models are used a lot. For example, they help computers automatically turn speech audio into words.
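A hidden Markov model can be sketched like this: the true weather states are hidden, and we only see a noisy clue (whether a person carries an umbrella). All states, observations, and numbers below are invented for illustration.

```python
import random

# Invented probabilities of moving between hidden states.
hidden_transitions = {
    "rain": {"rain": 0.7, "sun": 0.3},
    "sun":  {"sun": 0.8, "rain": 0.2},
}
# Invented probabilities of each hidden state producing an observation.
emissions = {
    "rain": {"umbrella": 0.9, "no umbrella": 0.1},
    "sun":  {"umbrella": 0.2, "no umbrella": 0.8},
}

def sample(dist):
    """Draw one item from a probability table."""
    return random.choices(list(dist), weights=list(dist.values()))[0]

state = "rain"
hidden, observed = [], []
for _ in range(5):
    hidden.append(state)
    observed.append(sample(emissions[state]))  # the only part we can "see"
    state = sample(hidden_transitions[state])

print(observed)  # visible to an observer
print(hidden)    # the true states, normally unknown
```

The observer only gets the `observed` list, and has to guess the `hidden` list from it; that guessing problem is what hidden Markov model algorithms solve.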

Markov decision process

A Markov decision process is a type of Markov chain. It also goes from one state to another, but at each step an agent chooses an action. The current state and the chosen action together decide the probabilities of the next state, and the agent gets a reward.[2][3]
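A tiny sketch of a Markov decision process, assuming invented states ("cold", "warm"), actions ("heat", "wait"), probabilities, and rewards that are not from the article:

```python
import random

# (state, action) -> probability table for the next state (invented numbers).
transitions = {
    ("cold", "heat"): {"warm": 0.9, "cold": 0.1},
    ("cold", "wait"): {"cold": 1.0},
    ("warm", "heat"): {"warm": 1.0},
    ("warm", "wait"): {"cold": 0.5, "warm": 0.5},
}
# Invented reward for ending up in each state.
rewards = {"cold": -1, "warm": 1}

def step(state, action):
    """Apply an action: sample the next state and return it with its reward."""
    dist = transitions[(state, action)]
    nxt = random.choices(list(dist), weights=list(dist.values()))[0]
    return nxt, rewards[nxt]

# Follow a fixed policy ("always heat") for four steps.
state, total = "cold", 0
for _ in range(4):
    state, r = step(state, "heat")
    total += r
print(state, total)
```

The difference from a plain Markov chain is the action argument: the agent's choice, not just the current state, shapes where the process goes next.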

Uses

Markov chains are used to predict the future. For example, they have been used to forecast oil prices[4] and wind power.[5]

References

  1. 1.0 1.1 Gagniuc, Paul A. (2017). Markov Chains: From Theory to Implementation and Experimentation. USA, NJ: John Wiley & Sons. pp. 1–256. ISBN 978-1-119-38755-8.
  2. Fine, S.; Singer, Y. (1998). "The hierarchical hidden markov model: Analysis and applications". Machine Learning. 32 (1): 41–62. doi:10.1023/A:1007469218079.
  3. Bui, H. H.; Venkatesh, S.; West, G. (2002). "Policy recognition in the abstract hidden markov model". Journal of Artificial Intelligence Research. 17: 451–499. doi:10.1613/jair.839. hdl:10536/DRO/DU:30044252.
  4. de Souza e Silva, E.G.; Legey, L.F.L.; de Souza e Silva, E.A. (2010). "Forecasting oil price trends using wavelets and hidden Markov models". Energy Economics. 32.
  5. Carpinone, A; Giorgio, M; Langella, R.; Testa, A. (2015). "Markov chain modeling for very-short-term wind power forecasting". Electric Power Systems Research. 122: 152–158. doi:10.1016/j.epsr.2014.12.025.