==Gaussian random walk==
A random walk having a step size that varies according to a [[normal distribution]] is used as a model for real-world time series data such as financial markets.

Here, the step size is the inverse cumulative normal distribution <math>\Phi^{-1}(z,\mu,\sigma)</math>, where 0 ≤ ''z'' ≤ 1 is a uniformly distributed random number, and μ and σ are the mean and standard deviation of the normal distribution, respectively.

If μ is nonzero, the random walk will vary about a linear trend. If v<sub>s</sub> is the starting value of the random walk, the expected value after ''n'' steps will be v<sub>s</sub> + ''n''μ.

For the special case where μ is equal to zero, after ''n'' steps, the translation distance's probability distribution is given by ''N''(0, ''n''σ<sup>2</sup>), where ''N''() is the notation for the normal distribution, ''n'' is the number of steps, and σ is from the inverse cumulative normal distribution as given above.

Proof: The Gaussian random walk can be thought of as the sum of a sequence of ''n'' independent and identically distributed random variables X<sub>i</sub>, each drawn from the inverse cumulative normal distribution with mean zero and the same σ as the original inverse cumulative normal distribution:
:<math>Z = \sum_{i=1}^n {X_i}.</math>
The distribution of the sum of two independent normally distributed random variables, {{math|1=''Z'' = ''X'' + ''Y''}}, is given by
<math display="block">\mathcal{N}(\mu_X + \mu_Y, \sigma_X^2 + \sigma_Y^2)</math>
[[Sum of normally distributed random variables|(see here)]].
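The two claims above — that the expected value after ''n'' steps is v<sub>s</sub> + ''n''μ, and that the variance grows as ''n''σ<sup>2</sup> — can be checked with a short Monte Carlo sketch. The step count, walk count, and parameter values below are illustrative assumptions, not taken from the text:

```python
# Monte Carlo sketch (parameter values are illustrative assumptions):
# after n steps, the mean is v_s + n*mu and the standard deviation
# of the translation distance is sigma*sqrt(n).
import numpy as np

rng = np.random.default_rng(seed=0)
n_steps, n_walks = 1_000, 20_000
mu, sigma, v_s = 0.05, 1.0, 10.0

# Drawing N(mu, sigma^2) steps directly is equivalent to applying the
# inverse cumulative normal Phi^{-1}(z, mu, sigma) to uniform z in [0, 1].
steps = rng.normal(mu, sigma, size=(n_walks, n_steps))
finals = v_s + steps.sum(axis=1)

print(finals.mean())  # close to v_s + n*mu = 60.0
print(finals.std())   # close to sigma*sqrt(n) ~ 31.6
```

Note the drift μ only shifts the mean of the final position; the spread around the trend line is the same σ√n as in the zero-mean case.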
In our case, {{math|1=μ<sub>X</sub> = μ<sub>Y</sub> = 0}} and {{math|1=σ<sup>2</sup><sub>X</sub> = σ<sup>2</sup><sub>Y</sub> = σ<sup>2</sup>}} yield
<math display="block">\mathcal{N}(0, 2\sigma^2).</math>
By induction, for ''n'' steps we have <math>Z \sim \mathcal{N}(0, n \sigma^2).</math>

For steps distributed according to any distribution with zero mean and a finite variance (not necessarily just a normal distribution), the [[root mean square]] translation distance after ''n'' steps is (see [[Bienaymé's identity]])
:<math>\sqrt{\operatorname{Var}(S_n)} = \sqrt{E[S_n^2]} = \sigma \sqrt{n}.</math>
But for the Gaussian random walk, this is just the standard deviation of the translation distance's distribution after ''n'' steps. Hence, if μ is equal to zero, and since the root mean square (RMS) translation distance is one standard deviation, there is a 68.27% probability that the translation distance after ''n'' steps will fall between <math>\pm \sigma \sqrt{n}</math>. Likewise, there is a 50% probability that the translation distance after ''n'' steps will fall between <math>\pm 0.6745 \sigma \sqrt{n}</math>.

===Number of distinct sites===
The number of distinct sites <math>S(t)</math> visited by a single random walker has been studied extensively for square and cubic lattices and for fractals.<ref name="WeissRubin1982">{{cite book|last1=Weiss|first1=George H.|title=Advances in Chemical Physics|last2=Rubin|first2=Robert J.|chapter=Random Walks: Theory and Selected Applications|volume=52|year=1982|pages=363–505|doi=10.1002/9780470142769.ch5|isbn=978-0-470-14276-9}}</ref><ref name="BlumenKlafter1986">{{cite book| last1=Blumen|first1=A.
|title=Optical Spectroscopy of Glasses|last2=Klafter|first2=J.|last3=Zumofen|first3=G.|chapter=Models for Reaction Dynamics in Glasses|volume=1|year=1986|pages=199–265|doi=10.1007/978-94-009-4650-7_5|bibcode=1986PCMLD...1..199B|series=Physics and Chemistry of Materials with Low-Dimensional Structures|isbn=978-94-010-8566-3}}</ref> This quantity is useful for the analysis of problems of trapping and kinetic reactions. It is also related to the vibrational density of states,<ref name="AlexanderOrbach1982">{{cite journal|last1=Alexander|first1=S.|last2=Orbach|first2=R.|title=Density of states on fractals: " fractons "|journal=Journal de Physique Lettres|volume=43|issue=17|year=1982|pages=625–631|doi=10.1051/jphyslet:019820043017062500|s2cid=67757791|url=https://hal.archives-ouvertes.fr/jpa-00232103/file/ajp-jphyslet_1982_43_17_625_0.pdf}}</ref><ref name="RammalToulouse1983">{{cite journal|last1=Rammal|first1=R.|last2=Toulouse|first2=G.|title=Random walks on fractal structures and percolation clusters|journal=Journal de Physique Lettres|volume=44|issue=1|year=1983|pages=13–22|doi=10.1051/jphyslet:0198300440101300|url=https://hal.archives-ouvertes.fr/jpa-00232136/document}}</ref> diffusion-reaction processes,<ref>{{cite journal|last=Smoluchowski|first=M.V.|journal=Z. Phys. Chem.|number=29|pages=129–168|year=1917|title=Versuch einer mathematischen Theorie der Koagulationskinetik kolloider Lösungen}},{{cite book|last=Rice|first=S.A.|title=Diffusion-Limited Reactions|url=https://books.google.com/books?id=sWiyspAjelsC&pg=PP2|access-date=13 August 2013|series=Comprehensive Chemical Kinetics|volume=25|date=1 March 1985|publisher=Elsevier|isbn=978-0-444-42354-2}}</ref> and the spread of populations in ecology.<ref name="Skellam1951">{{cite journal|last1=Skellam|first1=J. G.|title=Random Dispersal in Theoretical Populations|journal=Biometrika|volume=38|issue=1/2|year=1951|pages=196–218|doi=10.2307/2332328|jstor=2332328|pmid=14848123}}</ref><ref>{{cite journal|last1=Skellam|first1=J. G.|title=Studies in Statistical Ecology: I. Spatial Pattern|journal=Biometrika|volume=39|issue=3/4|year=1952|pages=346–362|doi=10.2307/2334030|jstor=2334030}}</ref>

===Information rate===
The [[information rate]] of a Gaussian random walk with respect to the squared error distance, i.e., its quadratic [[rate distortion function]], is given parametrically by<ref>{{Cite journal|doi=10.1109/TIT.1970.1054423|title=Information rates of Wiener processes|year=1970|last1=Berger|first1=T.|journal=IEEE Transactions on Information Theory|volume=16|issue=2|pages=134–139}}</ref>
<math display="block"> R(D_\theta) = \frac{1}{2} \int_0^1 \max\{0, \log_2\left(S(\varphi)/\theta \right) \} \, d\varphi, </math>
<math display="block"> D_\theta = \int_0^1 \min\{S(\varphi),\theta\} \, d\varphi, </math>
where <math>S(\varphi) = \left(2 \sin (\pi \varphi/2) \right)^{-2}</math>.

Therefore, it is impossible to encode <math>{\{Z_n\}_{n=1}^N}</math> using a [[binary code]] of fewer than <math>NR(D_\theta)</math> [[bit]]s and recover it with expected mean squared error less than <math>D_\theta</math>. On the other hand, for any <math>\varepsilon>0</math>, there exists an <math>N \in \mathbb N</math> large enough and a [[binary code]] of no more than <math>2^{N R(D_{\theta})}</math> distinct elements such that the expected mean squared error in recovering <math>{\{Z_n\}_{n=1}^N}</math> from this code is at most <math>D_\theta - \varepsilon</math>.