=== A simple weather model ===
The probabilities of weather conditions (modeled as either rainy or sunny), given the weather on the preceding day, can be represented by a [[Stochastic matrix|transition matrix]]:
: <math> P = \begin{bmatrix} 0.9 & 0.1 \\ 0.5 & 0.5 \end{bmatrix} </math>
The matrix ''P'' represents the weather model in which a sunny day is 90% likely to be followed by another sunny day, and a rainy day is 50% likely to be followed by another rainy day. The columns can be labelled "sunny" and "rainy", and the rows can be labelled in the same order.

[[File:Markov Chain weather model matrix as a graph.png|thumbnail|The above matrix as a graph.]]

(''P'')<sub>''i j''</sub> is the probability that, if a given day is of type ''i'', it will be followed by a day of type ''j''. Notice that the rows of ''P'' sum to 1: this is because ''P'' is a [[stochastic matrix]].

==== Predicting the weather ====
The weather on day 0 (today) is known to be sunny. This is represented by an initial state vector in which the "sunny" entry is 100%, and the "rainy" entry is 0%:
: <math> \mathbf{x}^{(0)} = \begin{bmatrix} 1 & 0 \end{bmatrix} </math>
The weather on day 1 (tomorrow) can be predicted by multiplying the state vector from day 0 by the transition matrix:
: <math> \mathbf{x}^{(1)} = \mathbf{x}^{(0)} P = \begin{bmatrix} 1 & 0 \end{bmatrix} \begin{bmatrix} 0.9 & 0.1 \\ 0.5 & 0.5 \end{bmatrix} = \begin{bmatrix} 0.9 & 0.1 \end{bmatrix} </math>
Thus, there is a 90% chance that day 1 will also be sunny. The weather on day 2 (the day after tomorrow) can be predicted in the same way, from the state vector we computed for day 1:
: <math> \mathbf{x}^{(2)} = \mathbf{x}^{(1)} P = \mathbf{x}^{(0)} P^2 = \begin{bmatrix} 1 & 0 \end{bmatrix} \begin{bmatrix} 0.9 & 0.1 \\ 0.5 & 0.5 \end{bmatrix}^2 = \begin{bmatrix} 0.86 & 0.14 \end{bmatrix} </math>
or
: <math> \mathbf{x}^{(2)} = \mathbf{x}^{(1)} P = \begin{bmatrix} 0.9 & 0.1 \end{bmatrix} \begin{bmatrix} 0.9 & 0.1 \\ 0.5 & 0.5 \end{bmatrix} = \begin{bmatrix} 0.86 & 0.14 \end{bmatrix} </math>
General rules for day ''n'' are:
: <math> \mathbf{x}^{(n)} = \mathbf{x}^{(n-1)} P </math>
: <math> \mathbf{x}^{(n)} = \mathbf{x}^{(0)} P^n </math>
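The recurrence above is straightforward to reproduce numerically. The following sketch uses the [[NumPy]] library (an implementation choice made here purely for illustration; the example itself does not prescribe any software) to propagate the initial state vector through the transition matrix:

<syntaxhighlight lang="python">
import numpy as np

# Transition matrix from the example: rows are today's weather (sunny, rainy),
# columns are tomorrow's weather, and each row sums to 1.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

x = np.array([1.0, 0.0])   # day 0 is known to be sunny

# x^(n) = x^(n-1) P, applied day by day
for day in range(1, 3):
    x = x @ P
    print(day, x)          # day 1: [0.9 0.1], day 2: [0.86 0.14]

# Equivalently, x^(n) = x^(0) P^n using a matrix power:
x2 = np.array([1.0, 0.0]) @ np.linalg.matrix_power(P, 2)   # [0.86 0.14]
</syntaxhighlight>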
==== Steady state of the weather ====
In this example, predictions for the weather on more distant days change less and less on each subsequent day and tend towards a [https://www.math.drexel.edu/~jwd25/LM_SPRING_07/lectures/Markov.html steady state vector]. This vector represents the probabilities of sunny and rainy weather on all days, and is independent of the initial weather. The steady state vector is defined as:
:<math> \mathbf{q} = \lim_{n \to \infty} \mathbf{x}^{(n)} </math>
but converges to a strictly positive vector only if ''P'' is a regular transition matrix (that is, there is at least one ''P''<sup>''n''</sup> with all non-zero entries).

Since '''q''' is independent of initial conditions, it must be unchanged when transformed by ''P''.<ref name=":1">{{Cite book |last=Van Kampen |first=N.G. |url=https://archive.org/details/stochasticproces00kamp_024 |title=Stochastic Processes in Physics and Chemistry |publisher=North Holland Elsevier |year=2007 |isbn=978-0-444-52965-7 |location=NL |pages=[https://archive.org/details/stochasticproces00kamp_024/page/n79 73]–95 |url-access=limited}}</ref> This makes it an [[eigenvector]] (with [[eigenvalue]] 1), and means it can be derived from ''P''.

In layman's terms, the steady-state vector is the vector that remains unchanged when multiplied by ''P''.<ref>{{cite web |url=https://bloomingtontutors.com/blog/going-steady-state-with-markov-processes |title=Going steady (state) with Markov processes |publisher=Bloomington Tutors}}</ref> For the weather example, we can use this to set up a matrix equation:
:<math> \begin{align} P & = \begin{bmatrix} 0.9 & 0.1 \\ 0.5 & 0.5 \end{bmatrix} \\ \mathbf{q} P & = \mathbf{q} & & \text{(} \mathbf{q} \text{ is unchanged by } P \text{.)} \\ & = \mathbf{q}I \\ \mathbf{q} (P - I) & = \mathbf{0} \\ \mathbf{q} \left( \begin{bmatrix} 0.9 & 0.1 \\ 0.5 & 0.5 \end{bmatrix} - \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \right) & = \mathbf{0} \\ \mathbf{q} \begin{bmatrix} -0.1 & 0.1 \\ 0.5 & -0.5 \end{bmatrix} & = \mathbf{0} \\ \begin{bmatrix} q_1 & q_2 \end{bmatrix} \begin{bmatrix} -0.1 & 0.1 \\ 0.5 & -0.5 \end{bmatrix} & = \begin{bmatrix} 0 & 0 \end{bmatrix} \\ -0.1 q_1 + 0.5 q_2 &= 0 \end{align} </math>
and since '''q''' is a probability vector we also know that
:<math> q_1 + q_2 = 1. </math>
Solving this pair of simultaneous equations gives the steady state vector:
:<math> \begin{bmatrix} q_1 & q_2 \end{bmatrix} = \begin{bmatrix} 0.833 & 0.167 \end{bmatrix} </math>
In conclusion, in the long term about 83.3% of days are sunny. Not all Markov processes have a steady state vector. In particular, the transition matrix must be '''regular'''. Otherwise, the state vectors will oscillate over time without converging.
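The steady state can be computed in the same numerical framework. The sketch below (again using NumPy; the least-squares formulation is only one of several possible ways to solve the system, chosen here for brevity) solves '''q'''(''P'' − ''I'') = '''0''' together with the normalisation ''q''<sub>1</sub> + ''q''<sub>2</sub> = 1:

<syntaxhighlight lang="python">
import numpy as np

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
n = P.shape[0]

# q (P - I) = 0 is transposed into (P - I)^T q^T = 0, and a row of ones is
# appended so that the solution also satisfies q1 + q2 = 1.
A = np.vstack([(P - np.eye(n)).T, np.ones(n)])
b = np.concatenate([np.zeros(n), [1.0]])

q, *_ = np.linalg.lstsq(A, b, rcond=None)
print(q)                      # approximately [0.8333 0.1667]
print(np.allclose(q @ P, q))  # True: q is unchanged by P
</syntaxhighlight>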