===A non-Markov example===
Suppose that there is a coin purse containing five coins worth 25¢ (quarters), five coins worth 10¢ (dimes), and five coins worth 5¢ (nickels), and that coins are drawn from the purse one by one at random and set on a table. If <math>X_n</math> represents the total value of the coins on the table after {{mvar|n}} draws, with <math>X_0 = 0</math>, then the sequence <math>\{X_n : n\in\mathbb{N}\}</math> is ''not'' a Markov process.

To see why this is the case, suppose that the first six draws produce all five nickels and one quarter, so that <math>X_6 = \$0.50</math>. If we know not just <math>X_6</math> but the earlier values as well, then we can determine which coins have been drawn, and we know that the next coin will not be a nickel; hence <math>X_7 \geq \$0.60</math> with probability 1. But if we know only the value <math>X_6</math>, the six coins might instead have been four dimes and two nickels, in which case it would certainly be possible to draw another nickel next. Thus, our predictions about <math>X_7</math> are influenced by our knowledge of the values prior to <math>X_6</math>.

However, it is possible to model this scenario as a Markov process. Instead of defining <math>X_n</math> to represent the ''total value'' of the coins on the table, we could define <math>X_n</math> to represent the ''count'' of each coin type on the table. For instance, <math>X_6 = 1,0,5</math> could be defined to represent the state in which there are one quarter, zero dimes, and five nickels on the table after six draws. This new model has <math>6\times 6\times 6=216</math> possible states, one for each combination of counts (from 0 to 5) of the three coin types. (Not all of these states are reachable within six draws.)

Suppose that the first draw results in the state <math>X_1 = 0,1,0</math>. The probability of reaching a given <math>X_2</math> now depends on <math>X_1</math>; for example, the state <math>X_2 = 1,0,1</math> is not possible. Likewise, the probabilities for the third draw depend on which coins are already on the table, and that information is captured entirely by the current state, not by how it was reached. In this way, the likelihood of the state <math>X_n = i,j,k</math> depends exclusively on the outcome of the state <math>X_{n-1}= \ell,m,p</math>.
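The contrast can be checked numerically. The sketch below is an illustrative Python simulation (not part of the article's sources; the function name <code>estimate</code> and the number of runs are arbitrary choices). It estimates the probability that the seventh coin is a nickel, conditioned on each of the two six-draw histories whose total value is 50¢. The probability differs between the histories, so the total value alone does not form a Markov state, whereas the coin-count state determines it.

<syntaxhighlight lang="python">
import random
from collections import defaultdict

# Illustrative sketch only: the function name and run count are arbitrary.
# Coin purse: five quarters (25¢), five dimes (10¢), five nickels (5¢).
PURSE = [25] * 5 + [10] * 5 + [5] * 5

def estimate(num_runs=200_000, seed=0):
    """Estimate P(seventh coin is a nickel) for each six-draw history
    whose total value is 50¢."""
    rng = random.Random(seed)
    hits = defaultdict(int)    # history -> runs where the 7th coin was a nickel
    trials = defaultdict(int)  # history -> runs with that history
    for _ in range(num_runs):
        coins = PURSE.copy()
        rng.shuffle(coins)
        first_six, seventh = coins[:6], coins[6]
        if sum(first_six) != 50:
            continue
        # Summarize the history as (quarters, dimes, nickels) drawn so far;
        # only (1, 0, 5) and (0, 4, 2) give a total of 50¢ after six draws.
        history = (first_six.count(25), first_six.count(10), first_six.count(5))
        trials[history] += 1
        hits[history] += (seventh == 5)
    for history in sorted(trials):
        p = hits[history] / trials[history]
        print(f"history {history}: P(7th coin is a nickel) ≈ {p:.3f}")

estimate()
</syntaxhighlight>

With the count representation, conditioning on <math>X_6 = 1,0,5</math> versus <math>X_6 = 0,4,2</math> already determines the answer (0 and 1/3, respectively), which is exactly the Markov property the augmented state restores.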