Examples of Markov chains
=== Random walk Markov chains ===
{{See also|Random walk}}

==== A center-biased random walk ====
Consider a [[random walk]] on the number line where, at each step, the position (call it ''x'') may change by +1 (to the right) or −1 (to the left) with probabilities:

: <math>P_{\mathrm{move~left}} = \dfrac{1}{2} + \dfrac{1}{2} \left( \dfrac{x}{c+|x|} \right)</math>
: <math>P_{\mathrm{move~right}} = 1 - P_{\mathrm{move~left}}</math>

(where ''c'' is a constant greater than 0)

For example, if the constant, ''c'', equals 1, the probabilities of a move to the left at positions ''x'' = −2, −1, 0, 1, 2 are given by <math>\dfrac{1}{6}, \dfrac{1}{4}, \dfrac{1}{2}, \dfrac{3}{4}, \dfrac{5}{6}</math> respectively. The random walk has a centering effect that weakens as ''c'' increases. Since the probabilities depend only on the current position (the value of ''x'') and not on any prior positions, this biased random walk satisfies the definition of a Markov chain.

==== Gambling ====
{{See also|Gambler's ruin}}

Suppose that one starts with $10 and repeatedly wagers $1 on a fair coin toss, either indefinitely or until all of the money is lost. If <math>X_n</math> represents the number of dollars one has after ''n'' tosses, with <math>X_0 = 10</math>, then the sequence <math>\{X_n : n \in \mathbb{N}\}</math> is a Markov process. If one knows that one has $12 now, then one would expect, with even odds, to have either $11 or $13 after the next toss. This prediction is not improved by the added knowledge that one started with $10, then went up to $11, down to $10, up to $11, and then to $12. The fact that the prediction is not improved by knowledge of earlier tosses showcases the [[Markov property]], the memoryless property of a stochastic process.<ref name=":3">{{Cite book|title=Stochastic differential equations: an introduction with applications|last=Øksendal, B. K. (Bernt Karsten), 1945-|date=2003|publisher=Springer|isbn=3540047581|edition=6th|location=Berlin|oclc=52203046}}</ref>
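The transition rule of the center-biased walk above is simple enough to check directly. The following sketch (not part of the article; the function names `p_move_left` and `step` are illustrative) reproduces the worked probabilities for ''c'' = 1 using exact rational arithmetic, and then draws a short sample path whose next step depends only on the current position, as the Markov property requires:

```python
from fractions import Fraction
import random

def p_move_left(x, c=1):
    """P(step of -1) at position x: 1/2 + (1/2) * x / (c + |x|), with c > 0."""
    return Fraction(1, 2) + Fraction(1, 2) * Fraction(x, c + abs(x))

# Reproduce the worked values for c = 1 at x = -2, -1, 0, 1, 2.
print([p_move_left(x) for x in range(-2, 3)])
# [Fraction(1, 6), Fraction(1, 4), Fraction(1, 2), Fraction(3, 4), Fraction(5, 6)]

def step(x, c=1, rng=random):
    """One transition of the chain: -1 with probability p_move_left(x), else +1.

    Note the next state is computed from x alone -- no history is consulted.
    """
    return x - 1 if rng.random() < p_move_left(x, c) else x + 1

# A short sample path starting at the origin (seeded for reproducibility).
rng = random.Random(0)
x = 0
path = [x]
for _ in range(10):
    x = step(x, rng=rng)
    path.append(x)
print(path)
```

Because `p_move_left` exceeds 1/2 for positive ''x'' and falls below 1/2 for negative ''x'', long sample paths tend to hover near the origin, which is the centering effect described above.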