The concept of a '''random sequence''' is essential in [[probability theory]] and [[statistics]]. It generally relies on the notion of a [[sequence]] of [[random variable]]s, and many statistical discussions begin with the words "let ''X''<sub>1</sub>,...,''X<sub>n</sub>'' be independent random variables...". Yet as [[D. H. Lehmer]] stated in 1951: "A random sequence is a vague notion... in which each term is unpredictable to the uninitiated and whose digits pass a certain number of tests traditional with statisticians".<ref>"What is meant by the word Random" in ''Mathematics and common sense'' by Philip J. Davis 2006 {{ISBN|1-56881-270-1}} pages 180–182</ref>

[[Probability axioms|Axiomatic probability theory]] ''deliberately'' avoids defining a random sequence.<ref>''Inevitable Randomness in Discrete Mathematics'' by József Beck 2009 {{ISBN|0-8218-4756-2}} page 44</ref> Traditional probability theory does not state whether a specific sequence is random; it generally proceeds to discuss the properties of random variables and stochastic sequences assuming some definition of randomness. The [[Nicolas Bourbaki|Bourbaki school]] considered the statement "let us consider a random sequence" an [[abuse of terminology|abuse of language]].<ref>''Algorithms: main ideas and applications'' by Vladimir Andreevich Uspenskiĭ, Alekseĭ Lʹvovich Semenov 1993 Springer {{ISBN|0-7923-2210-X}} page 166</ref>

==Early history==
[[Émile Borel]] was one of the first mathematicians to formally address randomness, in 1909.<ref>E. Borel, ''Les probabilités dénombrables et leurs applications arithmétiques'' Rend. Circ. Mat. Palermo 27 (1909) 247–271</ref> In 1919 [[Richard von Mises]] gave the first definition of [[algorithmic randomness]], inspired by the law of large numbers, although he used the term ''collective'' rather than random sequence.
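Von Mises' idea can be illustrated on a finite prefix of a sequence. The following sketch (illustrative only; the function name is invented for this example) checks the overall frequency of zeros and shows why a global frequency test alone is insufficient: the alternating sequence 0101... is globally balanced, yet a trivial selection rule extracts a completely biased sub-sequence.

```python
# Illustrative sketch (not from the literature): checking frequency
# stability on a finite prefix of a binary sequence.

def frequency_of_zeros(bits):
    """Fraction of zeros in a finite prefix of the sequence."""
    return bits.count(0) / len(bits)

# The alternating sequence 0101... is globally balanced:
alternating = [i % 2 for i in range(10_000)]
print(frequency_of_zeros(alternating))  # 0.5

# ...but the selection rule "take every even-indexed term" (0-based)
# extracts the sub-sequence 000..., which is maximally biased, so the
# sequence fails von Mises' stability-under-selection requirement.
selected = [b for i, b in enumerate(alternating) if i % 2 == 0]
print(frequency_of_zeros(selected))  # 1.0
```

The point of the example is that randomness in von Mises' sense is a property of the sequence under ''all'' admissible selection rules, not just of its overall frequencies.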
Using the concept of the [[impossibility of a gambling system]], von Mises defined an infinite sequence of zeros and ones as random if it has the ''frequency stability property'': the frequency of zeros tends to 1/2, and every sub-sequence we can select from it by a "proper" method of selection is also unbiased.<ref>Laurent Bienvenu "Kolmogorov Loveland Stochasticity" in STACS 2007: 24th Annual Symposium on Theoretical Aspects of Computer Science by Wolfgang Thomas {{ISBN|3-540-70917-7}} page 260</ref> The sub-sequence selection criterion imposed by von Mises is important: although 0101010101... is not biased, by selecting the odd positions we get 000000..., which is not random. Von Mises never fully formalized his definition of a proper selection rule for sub-sequences, but in 1940 [[Alonzo Church]] defined it as any [[recursion#Functional recursion|recursive function]] which, having read the first ''N'' elements of the sequence, decides whether to select element number ''N''&nbsp;+&nbsp;1. Church was a pioneer in the field of computable functions, and his definition relied on the [[Church–Turing thesis]] for computability.<ref>{{cite journal | last1 = Church | first1 = Alonzo | author-link = Alonzo Church | year = 1940 | title = On the Concept of Random Sequence | doi = 10.1090/S0002-9904-1940-07154-X| journal = Bull. Amer. Math. Soc. | volume = 46 | issue = 2| pages = 130–136 | doi-access = free }}</ref> This definition is often called ''Mises–Church randomness''.

==Modern approaches==
During the 20th century various technical approaches to defining random sequences were developed, and three distinct paradigms can now be identified. In the mid-1960s, [[A. N. Kolmogorov]] and [[D. W. Loveland]] independently proposed a more permissive selection rule.<ref>A. N. Kolmogorov, ''Three approaches to the quantitative definition of information'' Problems of Information and Transmission, 1(1):1–7, 1965.</ref><ref>D. W. Loveland, ''A new interpretation of von Mises' concept of random sequence'' Z. Math. Logik Grundlagen Math 12 (1966) 279–294</ref> In their view Church's recursive-function definition was too restrictive in that it read the elements in order. Instead they proposed a rule based on a partially computable process which, having read ''any'' ''N'' elements of the sequence, decides whether to select another element which has not yet been read. This definition is often called ''Kolmogorov–Loveland stochasticity''. But this method was considered too weak by [[Alexander Shen]], who showed that there is a Kolmogorov–Loveland stochastic sequence which does not conform to the general notion of randomness.

In 1966 [[Per Martin-Löf]] introduced a new notion which is now generally considered the most satisfactory notion of [[algorithmic randomness]]. His original definition involved measure theory, but it was later shown that it can be expressed in terms of [[Kolmogorov complexity]]. Kolmogorov's definition of a random string was that a string is random if it has no description shorter than itself via a [[universal Turing machine]].<ref>''An introduction to Kolmogorov complexity and its applications'' by Ming Li, P. M. B. Vitányi 1997 {{ISBN|0387948686}} pages 149–151</ref>

Three basic paradigms for dealing with random sequences have now emerged:<ref>R. Downey, ''Some Recent Progress in Algorithmic Randomness'' in Mathematical foundations of computer science 2004: by Jiří Fiala, Václav Koubek 2004 {{ISBN|3-540-22823-3}} page 44</ref>

:* The ''frequency / measure-theoretic'' approach. This approach started with the work of Richard von Mises and Alonzo Church. In the 1960s Per Martin-Löf noticed that the sets coding such frequency-based stochastic properties are a special kind of [[measure zero]] sets, and that a more general and smooth definition can be obtained by considering all effectively measure zero sets.
:* The ''complexity / compressibility'' approach. This paradigm was championed by A. N. Kolmogorov, with contributions from [[Leonid Levin]] and [[Gregory Chaitin]]. For finite sequences, Kolmogorov defines randomness of a binary string of length ''n'' as the entropy (or [[Kolmogorov complexity]]) normalized by the length ''n''. In other words, if the Kolmogorov complexity of the string is close to ''n'', it is very random; if the complexity is far below ''n'', it is not so random. The dual concept of randomness is compressibility: the more random a sequence is, the less compressible it is, and vice versa.
:* The ''predictability'' approach. This paradigm is due to [[Claus P. Schnorr]] and uses a slightly different definition of constructive [[Martingale (probability theory)|martingales]] than the martingales used in traditional probability theory.<ref>{{cite journal | last1 = Schnorr | first1 = C. P. | year = 1971 | title = A unified approach to the definition of a random sequence | journal = Mathematical Systems Theory | volume = 5 | issue = 3| pages = 246–258 | doi=10.1007/bf01694181| s2cid = 8931514 }}</ref> Schnorr showed how the existence of a selective betting strategy implies the existence of a selection rule for a biased sub-sequence. If one only requires a recursive martingale to succeed on a sequence, instead of constructively succeeding on it, one gets the concept of recursive randomness.{{explain|reason="Recursive randomness" is not defined or mentioned in any of the cited sources.|date=December 2020}} [[Yongge Wang]] showed<ref>Yongge Wang: Randomness and Complexity. PhD Thesis, 1996. http://webpages.uncc.edu/yonwang/papers/IPL97.pdf</ref><ref>{{cite journal | last1 = Wang | first1 = Yongge | year = 1999 | title = A separation of two randomness concepts | journal = Information Processing Letters | volume = 69 | issue = 3| pages = 115–118 | doi=10.1016/S0020-0190(98)00202-6| citeseerx = 10.1.1.46.199 }}</ref> that the recursive randomness concept is different from Schnorr's randomness concept.{{explain|reason=So then what is "Schnorr's randomness concept"?|date=December 2020}}

In most cases, theorems relating the three paradigms (often equivalence) have been proven.<ref>Wolfgang Merkle, ''Kolmogorov Loveland Stochasticity'' in Automata, languages and programming: 29th international colloquium, ICALP 2002, by Peter Widmayer et al. {{ISBN|3-540-43864-5}} page 391</ref>

==See also==
* [[Randomness]]
* [[History of randomness]]
* [[Random number generator]]
* [[Seven states of randomness]]
* [[Statistical randomness]]

==References==
* Sergio B. Volchan [http://www.maa.org/programs/maa-awards/writing-awards/what-is-a-random-sequence ''What Is a Random Sequence?''] ''The American Mathematical Monthly'', Vol. 109, 2002, pp. 46–63

==Notes==
{{Reflist}}

==External links==
* {{springer|title=Random sequence|id=p/r077350}}
* [https://www.youtube.com/watch?v=H2lJLXS3AYM Video] on frequency stability. Why humans can't "guess" randomly
* [http://www.ciphersbyritter.com/RES/RANDTEST.HTM#vonNeumann63 Randomness tests by Terry Ritter]

{{DEFAULTSORT:Random Sequence}}
[[Category:Sequences and series]]
[[Category:Statistical randomness]]