==Applications==

===Statistical data analysis===
See the separate Wikipedia entry on [[Bayesian statistics]], specifically the [[Bayesian statistics#Statistical modeling|statistical modeling]] section of that article.

===Computer applications===
Bayesian inference has applications in [[artificial intelligence]] and [[expert system]]s. Bayesian inference techniques have been a fundamental part of computerized [[pattern recognition]] techniques since the late 1950s.<ref>{{cite journal |last1=Fienberg |first1=Stephen E. |title=When did Bayesian inference become "Bayesian"? |journal=Bayesian Analysis |date=2006-03-01 |volume=1 |issue=1 |doi=10.1214/06-BA101 |doi-access=free}}</ref> There is also an ever-growing connection between Bayesian methods and simulation-based [[Monte Carlo method|Monte Carlo]] techniques, since complex models cannot be processed in closed form by a Bayesian analysis, while a [[graphical model]] structure ''may'' allow for efficient simulation algorithms such as [[Gibbs sampling]] and other [[Metropolis–Hastings algorithm]] schemes.<ref>{{cite book|author=Jim Albert|year=2009|title=Bayesian Computation with R, Second edition|publisher=Springer|location=New York, Dordrecht, etc.|isbn=978-0-387-92297-3}}</ref> Recently{{when|date=September 2018}} Bayesian inference has gained popularity among the [[phylogenetics]] community for these reasons; a number of applications allow many demographic and evolutionary parameters to be estimated simultaneously.

As applied to [[statistical classification]], Bayesian inference has been used to develop algorithms for identifying [[e-mail spam]]. Applications which make use of Bayesian inference for spam filtering include [[CRM114 (program)|CRM114]], [[DSPAM]], [[Bogofilter]], [[SpamAssassin]], [[SpamBayes]], [[Mozilla]], XEAMS, and others. Spam classification is treated in more detail in the article on the [[naïve Bayes classifier]].

[[Solomonoff's theory of inductive inference|Solomonoff's inductive inference]] is the theory of prediction based on observations; for example, predicting the next symbol based upon a given series of symbols. The only assumption is that the environment follows some unknown but computable [[probability distribution]]. It is a formal inductive framework that combines two well-studied principles of inductive inference: Bayesian statistics and [[Occam's Razor]].<ref>{{cite journal |doi=10.3390/e13061076 |arxiv=1105.5721 |bibcode=2011Entrp..13.1076R |title=A Philosophical Treatise of Universal Induction |journal=Entropy |volume=13 |issue=6 |pages=1076–1136 |year=2011 |last1=Rathmanner |first1=Samuel |last2=Hutter |first2=Marcus |last3=Ormerod |first3=Thomas C |s2cid=2499910 |doi-access=free}}</ref>{{rs inline|date=September 2018}} Solomonoff's universal prior probability of any prefix ''p'' of a computable sequence ''x'' is the sum of the probabilities of all programs (for a universal computer) that compute something starting with ''p''.
Given some ''p'' and any computable but unknown probability distribution from which ''x'' is sampled, the universal prior and Bayes' theorem can be used to predict the yet unseen parts of ''x'' in an optimal fashion.<ref>{{Cite journal |bibcode=2007arXiv0709.1516H |title=On Universal Prediction and Bayesian Confirmation |journal=Theoretical Computer Science |volume=384 |issue=2007 |pages=33–48 |last1=Hutter |first1=Marcus |last2=He |first2=Yang-Hui |last3=Ormerod |first3=Thomas C |year=2007 |arxiv=0709.1516 |doi=10.1016/j.tcs.2007.05.016 |s2cid=1500830}}</ref><ref>{{Cite CiteSeerX |last1=Gács |first1=Peter |last2=Vitányi |first2=Paul M. B. |date=2 December 2010 |title=Raymond J. Solomonoff 1926-2009 |citeseerx=10.1.1.186.8268}}</ref>

===Bioinformatics and healthcare applications===
Bayesian inference has been applied in different [[bioinformatics]] applications, including differential gene expression analysis.<ref name=":edgr">Robinson, Mark D.; McCarthy, Davis J.; Smyth, Gordon K. "edgeR: a Bioconductor package for differential expression analysis of digital gene expression data". ''Bioinformatics''.</ref> Bayesian inference is also used in a general cancer risk model called [[Continuous Individualized Risk Index|CIRI]] (Continuous Individualized Risk Index), in which serial measurements are incorporated to update a Bayesian model that is primarily built from prior knowledge.<ref>{{Cite web |url=https://ciri.stanford.edu/ |title=CIRI |website=ciri.stanford.edu |access-date=2019-08-11}}</ref><ref>{{Cite journal |last1=Kurtz |first1=David M. |last2=Esfahani |first2=Mohammad S. |last3=Scherer |first3=Florian |last4=Soo |first4=Joanne |last5=Jin |first5=Michael C. |last6=Liu |first6=Chih Long |last7=Newman |first7=Aaron M. |last8=Dührsen |first8=Ulrich |last9=Hüttmann |first9=Andreas |date=2019-07-25 |title=Dynamic Risk Profiling Using Serial Tumor Biomarkers for Personalized Outcome Prediction |journal=Cell |volume=178 |issue=3 |pages=699–713.e19 |doi=10.1016/j.cell.2019.06.011 |issn=1097-4172 |pmid=31280963 |pmc=7380118 |doi-access=free}}</ref>

===In the courtroom===
{{Main|Jurimetrics#Bayesian analysis of evidence}}
Bayesian inference can be used by jurors to coherently accumulate the evidence for and against a defendant, and to see whether, in totality, it meets their personal threshold for "[[beyond a reasonable doubt]]".<ref>Dawid, A. P. and Mortera, J. (1996). "Coherent Analysis of Forensic Identification Evidence". ''[[Journal of the Royal Statistical Society]]'', Series B, 58, 425–443.</ref><ref>Foreman, L. A.; Smith, A. F. M., and Evett, I. W. (1997). "Bayesian analysis of deoxyribonucleic acid profiling data in forensic identification applications (with discussion)". ''Journal of the Royal Statistical Society'', Series A, 160, 429–469.</ref><ref>Robertson, B. and Vignaux, G. A. (1995). ''Interpreting Evidence: Evaluating Forensic Science in the Courtroom''. John Wiley and Sons, Chichester. {{ISBN|978-0-471-96026-3}}.</ref> Bayes' theorem is applied successively to all evidence presented, with the posterior from one stage becoming the prior for the next. The benefit of a Bayesian approach is that it gives the juror an unbiased, rational mechanism for combining evidence. It may be appropriate to explain Bayes' theorem to jurors in [[Bayes' rule|odds form]], as [[betting odds]] are more widely understood than probabilities. Alternatively, a [[Gambling and information theory|logarithmic approach]], replacing multiplication with addition, might be easier for a jury to handle.
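The following is a minimal, illustrative sketch of the odds-form bookkeeping described above; it is not drawn from the cited sources, and the function name, prior, and likelihood ratios are hypothetical. Each independent piece of evidence contributes the logarithm of its [[likelihood ratio]] (the probability of the evidence given guilt divided by its probability given innocence); the contributions are summed, and the total is converted back into a posterior probability.

<syntaxhighlight lang="python">
import math

def posterior_probability(prior_prob, likelihood_ratios):
    """Combine a prior probability of guilt with independent likelihood ratios in log-odds form."""
    log_odds = math.log(prior_prob / (1.0 - prior_prob))
    for lr in likelihood_ratios:
        log_odds += math.log(lr)       # each piece of evidence adds its log-likelihood-ratio
    odds = math.exp(log_odds)
    return odds / (1.0 + odds)         # convert posterior odds back into a probability

# Hypothetical numbers: a uniform prior over 1,000 possible culprits,
# followed by three independent pieces of evidence.
print(posterior_probability(1 / 1000, [500.0, 20.0, 3.0]))  # roughly 0.97
</syntaxhighlight>

Because only additions are involved once logarithms are taken, the running total can be tallied piece by piece as evidence is presented, which is the appeal of the logarithmic approach.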
[[Image:Ebits2c.png|thumb|right|Adding up evidence]]
If the existence of the crime is not in doubt, only the identity of the culprit, it has been suggested that the prior should be uniform over the qualifying population.<ref>Dawid, A. P. (2001). [http://128.40.111.250/evidence/content/dawid-paper.pdf Bayes' Theorem and Weighing Evidence by Juries]. {{Webarchive|url=https://web.archive.org/web/20150701112146/http://128.40.111.250/evidence/content/dawid-paper.pdf |date=2015-07-01}}</ref> For example, if 1,000 people could have committed the crime, the prior probability of guilt would be 1/1000.

The use of Bayes' theorem by jurors is controversial. In the United Kingdom, a defence [[expert witness]] explained Bayes' theorem to the jury in ''[[Regina versus Denis John Adams|R v Adams]]''. The jury convicted, but the case went to appeal on the basis that no means of accumulating evidence had been provided for jurors who did not wish to use Bayes' theorem. The Court of Appeal upheld the conviction, but it also gave the opinion that "To introduce Bayes' Theorem, or any similar method, into a criminal trial plunges the jury into inappropriate and unnecessary realms of theory and complexity, deflecting them from their proper task."

Gardner-Medwin<ref>Gardner-Medwin, A. (2005). "What Probability Should the Jury Address?". ''[[Significance (journal)|Significance]]'', 2 (1), March 2005.</ref> argues that the criterion on which a verdict in a criminal trial should be based is ''not'' the probability of guilt, but rather the ''probability of the evidence, given that the defendant is innocent'' (akin to a [[frequentist]] [[p-value]]). He argues that if the posterior probability of guilt is to be computed by Bayes' theorem, the prior probability of guilt must be known. This will depend on the incidence of the crime, which is an unusual piece of evidence to consider in a criminal trial. Consider the following three propositions:
: ''A'' – the known facts and testimony could have arisen if the defendant is guilty.
: ''B'' – the known facts and testimony could have arisen if the defendant is innocent.
: ''C'' – the defendant is guilty.
Gardner-Medwin argues that the jury should believe both ''A'' and not-''B'' in order to convict. ''A'' and not-''B'' together imply the truth of ''C'', but the reverse is not true. It is possible that ''B'' and ''C'' are both true, but in this case he argues that a jury should acquit, even though they know that they will be letting some guilty people go free. See also [[Lindley's paradox]].

===Bayesian epistemology===
[[Bayesian epistemology]] is a movement that advocates for Bayesian inference as a means of justifying the rules of inductive logic. [[Karl Popper]] and [[David Miller (philosopher)|David Miller]] have rejected the idea of Bayesian rationalism, i.e. using Bayes' rule to make epistemological inferences:<ref>{{cite book|first=David |last=Miller|title=Critical Rationalism|url=https://books.google.com/books?id=bh_yCgAAQBAJ|isbn=978-0-8126-9197-9|year=1994|publisher=Open Court|location=Chicago}}</ref> it is prone to the same [[vicious circle]] as any other [[justificationism|justificationist]] epistemology, because it presupposes what it attempts to justify.
According to this view, a rational interpretation of Bayesian inference would see it merely as a probabilistic version of [[falsifiability|falsification]], rejecting the belief, commonly held by Bayesians, that high likelihood achieved by a series of Bayesian updates would prove the hypothesis beyond any reasonable doubt, or even with likelihood greater than 0.

===Other===
* The [[scientific method]] is sometimes interpreted as an application of Bayesian inference. In this view, Bayes' rule guides (or should guide) the updating of probabilities about [[hypothesis|hypotheses]] conditional on new observations or [[experiment]]s (a minimal sketch of such an update follows this list).<ref>Howson & Urbach (2005), Jaynes (2003)</ref>
* Bayesian inference has also been applied to treat [[stochastic scheduling]] problems with incomplete information by Cai et al. (2009).<ref name="Cai et al. 2009">{{cite journal |last1=Cai |first1=X.Q. |last2=Wu |first2=X.Y. |last3=Zhou |first3=X. |title=Stochastic scheduling subject to breakdown-repeat breakdowns with incomplete information |journal=Operations Research |date=2009 |volume=57 |issue=5 |pages=1236–1249 |doi=10.1287/opre.1080.0660}}</ref>
* [[Bayesian search theory]] is used to search for lost objects.
* [[Bayesian inference in phylogeny]]
* [[Bayesian tool for methylation analysis]]
* [[Bayesian approaches to brain function]] investigate the brain as a Bayesian mechanism.
* Bayesian inference in ecological studies.<ref>{{Cite journal |last1=Ogle |first1=Kiona |last2=Tucker |first2=Colin |last3=Cable |first3=Jessica M. |date=2014-01-01 |title=Beyond simple linear mixing models: process-based isotope partitioning of ecological processes |journal=Ecological Applications |language=en |volume=24 |issue=1 |pages=181–195 |doi=10.1890/1051-0761-24.1.181 |pmid=24640543 |bibcode=2014EcoAp..24..181O |issn=1939-5582}}</ref><ref>{{Cite journal |last1=Evaristo |first1=Jaivime |last2=McDonnell |first2=Jeffrey J. |last3=Scholl |first3=Martha A. |last4=Bruijnzeel |first4=L. Adrian |last5=Chun |first5=Kwok P. |date=2016-01-01 |title=Insights into plant water uptake from xylem-water isotope measurements in two tropical catchments with contrasting moisture conditions |journal=Hydrological Processes |volume=30 |issue=18 |language=en |pages=3210–3227 |doi=10.1002/hyp.10841 |issn=1099-1085 |bibcode=2016HyPr...30.3210E |s2cid=131588159}}</ref>
* Bayesian inference is used to estimate parameters in stochastic chemical kinetic models.<ref>{{Cite journal |last1=Gupta |first1=Ankur |last2=Rawlings |first2=James B. |date=April 2014 |title=Comparison of Parameter Estimation Methods in Stochastic Chemical Kinetic Models: Examples in Systems Biology |journal=AIChE Journal |volume=60 |issue=4 |pages=1253–1268 |doi=10.1002/aic.14409 |issn=0001-1541 |pmc=4946376 |pmid=27429455 |bibcode=2014AIChE..60.1253G}}</ref>
* Bayesian inference in [[econophysics]], for currency or for the prediction of trend changes in financial quotations.<ref>{{Cite journal |last=Fornalski |first=K.W. |title=The Tadpole Bayesian Model for Detecting Trend Changes in Financial Quotations |journal=R&R Journal of Statistics and Mathematical Sciences |date=2016 |volume=2 |issue=1 |pages=117–122 |url=http://www.rroij.com/open-access/the-tadpole-bayesian-model-for-detecting-trend-changesin-financial-quotations-.pdf}}</ref><ref>{{Cite journal |last1=Schütz |first1=N. |last2=Holschneider |first2=M. |date=2011 |title=Detection of trend changes in time series using Bayesian inference |journal=Physical Review E |volume=84 |issue=2 |page=021120 |doi=10.1103/PhysRevE.84.021120 |pmid=21928962 |arxiv=1104.3448 |bibcode=2011PhRvE..84b1120S |s2cid=11460968}}</ref>
* [[Bayesian inference in marketing]]
* [[Bayesian inference in motor learning]]
* Bayesian inference is used in [[probabilistic numerics]] to solve numerical problems.
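The item on the [[scientific method]] above can be made concrete with a minimal, hypothetical sketch; the coin hypotheses, probabilities, and observation sequence below are invented for illustration and are not taken from the cited sources. Two competing hypotheses about a coin are compared, and the probability assigned to each is updated observation by observation with Bayes' rule, the posterior after one toss serving as the prior for the next.

<syntaxhighlight lang="python">
# Hypothetical hypotheses: H_fair says P(heads) = 0.5, H_biased says P(heads) = 0.8.

def update(prior_fair, toss):
    """Return the posterior probability of H_fair after observing one toss ('H' or 'T')."""
    p_toss_fair = 0.5
    p_toss_biased = 0.8 if toss == "H" else 0.2
    numerator = p_toss_fair * prior_fair
    evidence = numerator + p_toss_biased * (1.0 - prior_fair)  # total probability of the toss
    return numerator / evidence

belief_fair = 0.5                  # prior: both hypotheses considered equally likely
for toss in "HHTHHHHT":            # hypothetical experimental observations
    belief_fair = update(belief_fair, toss)   # posterior becomes the prior for the next toss
print(belief_fair)                 # updated probability that the coin is fair
</syntaxhighlight>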