== History ==
{{Main|History of probability}}
{{Further|History of statistics}}

The scientific study of probability is a modern development of mathematics. [[Gambling]] shows that there has been an interest in quantifying the ideas of probability throughout history, but exact mathematical descriptions arose much later. There are reasons for the slow development of the mathematics of probability. Whereas games of chance provided the impetus for the mathematical study of probability, fundamental issues{{NoteTag|In the context of the book that this is quoted from, it is the theory of probability and the logic behind it that governs the phenomena of such things compared to rash predictions that rely on pure luck or mythological arguments such as gods of luck helping the winner of the game.}} are still obscured by superstitions.<ref>[[John E. Freund|Freund, John.]] (1973) ''Introduction to Probability''. Dickenson {{isbn|978-0-8221-0078-2}} (p. 1)</ref>

According to [[Richard Jeffrey]], "Before the middle of the seventeenth century, the term 'probable' (Latin ''probabilis'') meant ''approvable'', and was applied in that sense, univocally, to opinion and to action. A probable action or opinion was one such as sensible people would undertake or hold, in the circumstances."<ref name="Jeffrey">Jeffrey, R.C., ''Probability and the Art of Judgment,'' Cambridge University Press (1992), pp. 54–55. {{isbn|0-521-39459-7}}</ref> However, in legal contexts especially, 'probable' could also apply to propositions for which there was good evidence.<ref name="Franklin">Franklin, J. (2001) ''The Science of Conjecture: Evidence and Probability Before Pascal,'' Johns Hopkins University Press. (pp. 22, 113, 127)</ref>

[[File:Cardano.jpg|thumb|140px|[[Gerolamo Cardano]] (16th century)]]
[[File:Christiaan Huygens-painting.jpeg|thumb|140px|[[Christiaan Huygens]] published one of the first books on probability (17th century).]]

The sixteenth-century [[Italians|Italian]] polymath [[Gerolamo Cardano]] demonstrated the efficacy of defining [[odds]] as the ratio of favourable to unfavourable outcomes (which implies that the probability of an event is given by the ratio of favourable outcomes to the total number of possible outcomes<ref>{{cite web| url = http://www.columbia.edu/~pg2113/index_files/Gorroochurn-Some%20Laws.pdf| title = ''Some laws and problems in classical probability and how Cardano anticipated them'' Gorroochurn, P. ''Chance'' magazine 2012}}</ref>). Aside from the elementary work by Cardano, the doctrine of probabilities dates to the correspondence of [[Pierre de Fermat]] and [[Blaise Pascal]] (1654). [[Christiaan Huygens]] (1657) gave the earliest known scientific treatment of the subject.<ref>{{citation |url=http://www.secondmoment.org/articles/probability.php |publisher=Second Moment |access-date=2008-05-23 |title=A Brief History of Probability |last=Abrams |first=William |archive-date=24 July 2017 |archive-url=https://web.archive.org/web/20170724052656/http://www.secondmoment.org/articles/probability.php |url-status=dead }}</ref> [[Jakob Bernoulli]]'s ''[[Ars Conjectandi]]'' (posthumous, 1713) and [[Abraham de Moivre]]'s ''[[The Doctrine of Chances|Doctrine of Chances]]'' (1718) treated the subject as a branch of mathematics.<ref>{{Cite book | last1 = Ivancevic | first1 = Vladimir G. | last2 = Ivancevic | first2 = Tijana T. | title = Quantum leap: from Dirac and Feynman, across the universe, to human body and mind | year = 2008 | publisher = World Scientific | location = Singapore; Hackensack, NJ | isbn = 978-981-281-927-7 | page = 16 }}</ref> See [[Ian Hacking]]'s ''The Emergence of Probability''<ref name=Emergence/> and [[James Franklin (philosopher)|James Franklin's]] ''The Science of Conjecture''<ref>{{cite book|last1=Franklin|first1=James|title=The Science of Conjecture: Evidence and Probability Before Pascal|date=2001|publisher=Johns Hopkins University Press|isbn=978-0-8018-6569-5}}</ref> for histories of the early development of the very concept of mathematical probability.

The [[theory of errors]] may be traced back to [[Roger Cotes]]'s ''Opera Miscellanea'' (posthumous, 1722), but a memoir prepared by [[Thomas Simpson]] in 1755 (printed 1756) first applied the theory to the discussion of errors of observation.<ref>{{Cite journal|last=Shoesmith|first=Eddie|date=November 1985|title=Thomas Simpson and the arithmetic mean|journal=Historia Mathematica|language=en|volume=12|issue=4|pages=352–355|doi=10.1016/0315-0860(85)90044-8|doi-access=free}}</ref> The reprint (1757) of this memoir lays down the axioms that positive and negative errors are equally probable, and that certain assignable limits define the range of all errors. Simpson also discusses continuous errors and describes a probability curve.

The first two laws of error that were proposed both originated with [[Pierre-Simon Laplace]]. The first law was published in 1774, and stated that the frequency of an error could be expressed as an exponential function of the numerical magnitude of the error{{snd}}disregarding sign. The second law of error was proposed in 1778 by Laplace, and stated that the frequency of the error is an exponential function of the square of the error.<ref name=Wilson1923>Wilson EB (1923) "First and second laws of error". [[Journal of the American Statistical Association]], 18, 143</ref> The second law of error is called the normal distribution or the Gauss law. "It is difficult historically to attribute that law to Gauss, who in spite of his well-known precocity had probably not made this discovery before he was two years old."<ref name=Wilson1923 /> [[Daniel Bernoulli]] (1778) introduced the principle of the maximum product of the probabilities of a system of concurrent errors.

[[File:Bendixen - Carl Friedrich Gauß, 1828.jpg|thumb|140px|Carl Friedrich Gauss]]

[[Adrien-Marie Legendre]] (1805) developed the [[method of least squares]], and introduced it in his ''Nouvelles méthodes pour la détermination des orbites des comètes'' (''New Methods for Determining the Orbits of Comets'').<ref>{{cite web|last1=Seneta|first1=Eugene William|title="Adrien-Marie Legendre" (version 9)|url=http://statprob.com/encyclopedia/AdrienMarieLegendre.html|website=StatProb: The Encyclopedia Sponsored by Statistics and Probability Societies|access-date=27 January 2016|url-status=dead|archive-url=https://web.archive.org/web/20160203070724/http://statprob.com/encyclopedia/AdrienMarieLegendre.html|archive-date=3 February 2016}}</ref> In ignorance of Legendre's contribution, an Irish-American writer, [[Robert Adrain]], editor of "The Analyst" (1808), first deduced the law of facility of error,
<math display="block">\phi(x) = ce^{-h^2 x^2},</math>
where <math>h</math> is a constant depending on the precision of observation, and <math>c</math> is a scale factor ensuring that the area under the curve equals 1.
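Requiring unit area over the whole real line determines this scale factor explicitly: since the [[Gaussian integral]] gives <math>\int_{-\infty}^{\infty} e^{-h^2 x^2}\,dx = \sqrt{\pi}/h</math>, the normalization condition
<math display="block">\int_{-\infty}^{\infty} c\,e^{-h^2 x^2}\,dx = 1</math>
yields <math>c = h/\sqrt{\pi}</math>.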
He gave two proofs, the second being essentially the same as [[John Herschel]]'s (1850).{{citation needed|date=June 2012}} [[Carl Friedrich Gauss|Gauss]] gave the first proof that seems to have been known in Europe (the third after Adrain's) in 1809. Further proofs were given by Laplace (1810, 1812), Gauss (1823), [[James Ivory (mathematician)|James Ivory]] (1825, 1826), Hagen (1837), [[Friedrich Bessel]] (1838), [[William Fishburn Donkin|W.F. Donkin]] (1844, 1856), and [[Morgan Crofton]] (1870). Other contributors were [[Robert Leslie Ellis|Ellis]] (1844), [[Augustus De Morgan|De Morgan]] (1864), [[James Whitbread Lee Glaisher|Glaisher]] (1872), and [[Giovanni Schiaparelli]] (1875). [[Christian August Friedrich Peters|Peters]]'s (1856) formula{{clarify|date=June 2012}} for ''r'', the [[probable error]] of a single observation, is well known.

In the nineteenth century, authors on the general theory included [[Laplace]], [[Sylvestre Lacroix]] (1816), Littrow (1833), [[Adolphe Quetelet]] (1853), [[Richard Dedekind]] (1860), Helmert (1872), [[Hermann Laurent]] (1873), Liagre, Didion, and [[Karl Pearson]]. [[Augustus De Morgan]] and [[George Boole]] improved the exposition of the theory.

In 1906, [[Andrey Markov]] introduced<ref>{{cite web|url = http://www.statslab.cam.ac.uk/~rrw1/markov/M.pdf|title = Markov Chains|first= Richard |last = Weber|website = Statistical Laboratory|publisher = University of Cambridge}}</ref> the notion of [[Markov chains]], which played an important role in the theory of [[stochastic process]]es and its applications. The modern theory of probability based on [[Measure (mathematics)|measure theory]] was developed by [[Andrey Kolmogorov]] in 1931.<ref>{{cite journal|last1=Vitanyi |first1= Paul M.B.|title=Andrei Nikolaevich Kolmogorov|journal=CWI Quarterly|date=1988|issue=1|pages=3–18|url=http://homepages.cwi.nl/~paulv/KOLMOGOROV.BIOGRAPHY.html|access-date=27 January 2016}}</ref>

On the geometric side, contributors to ''The Educational Times'' included Miller, Crofton, McColl, Wolstenholme, Watson, and [[Artemas Martin]].<ref>{{Cite book|last=Wilcox, Rand R.|title=Understanding and applying basic statistical methods using R|date=2016|isbn=978-1-119-06140-3|location=Hoboken, New Jersey|oclc=949759319}}</ref> See [[integral geometry]] for more information.