=== Mathematical ===
{{Main|Akaike information criterion}}
One justification of Occam's razor is a direct result of basic [[probability theory]]. By definition, all assumptions introduce possibilities for error; if an assumption does not improve the accuracy of a theory, its only effect is to increase the probability that the overall theory is wrong.

There have also been other attempts to derive Occam's razor from probability theory, including notable attempts made by [[Harold Jeffreys]] and [[Edwin Thompson Jaynes|E. T. Jaynes]]. The probabilistic (Bayesian) basis for Occam's razor is elaborated by [[David J. C. MacKay]] in chapter 28 of his book ''Information Theory, Inference, and Learning Algorithms'',<ref>{{Cite book |url=http://www.inference.phy.cam.ac.uk/itprnn/book.pdf |title=Information Theory, Inference, and Learning Algorithms |last=MacKay |first=David J. C. |year=2003 |bibcode=2003itil.book.....M |archive-url=https://web.archive.org/web/20120915043535/http://www.inference.phy.cam.ac.uk/itprnn/book.pdf |archive-date=15 September 2012 |url-status=live }}</ref> where he emphasizes that a prior bias in favor of simpler models is not required.

[[William H. Jefferys]] and [[James Berger (statistician)|James O. Berger]] (1991) generalize and quantify the original formulation's "assumptions" concept as the degree to which a proposition is unnecessarily accommodating to possible observable data.<ref name="Jefferys">{{Cite journal |last1=Jefferys |first1=William H. |last2=Berger |first2=James O. |year=1991 |title=Ockham's Razor and Bayesian Statistics |url=http://quasar.as.utexas.edu/papers/ockham.pdf |url-status=live |journal=[[American Scientist]] |volume=80 |issue=1 |pages=64–72 |jstor=29774559 |archive-url=https://web.archive.org/web/20050304065538/http://quasar.as.utexas.edu/papers/ockham.pdf |archive-date=4 March 2005}} (preprint available as "Sharpening Occam's Razor on a Bayesian Strop").</ref> They state, "A hypothesis with fewer adjustable parameters will automatically have an enhanced posterior probability, due to the fact that the predictions it makes are sharp."<ref name="Jefferys" /> The use of "sharp" here is not only a tongue-in-cheek reference to the idea of a razor, but also indicates that such predictions are more [[Accuracy and precision|accurate]] than competing predictions. The model they propose balances the precision of a theory's predictions against their sharpness, preferring theories that sharply make correct predictions over theories that accommodate a wide range of other possible results. This, again, reflects the mathematical relationship between key concepts in [[Bayesian inference]] (namely [[marginal probability]], [[conditional probability]], and [[posterior probability]]).

The [[bias–variance tradeoff]] is a framework that incorporates the Occam's razor principle in its balance between overfitting (associated with lower bias but higher variance) and underfitting (associated with lower variance but higher bias).<ref>{{Cite book |last1=James |first1=Gareth |last2=Witten |first2=Daniela |last3=Hastie |first3=Trevor |last4=Tibshirani |first4=Robert |display-authors=1 |date=2013 |title=An Introduction to Statistical Learning |publisher=Springer |isbn=9781461471370 |pages=105, 203–204}}</ref>
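The automatic penalty on adjustable parameters described by Jefferys and Berger can be illustrated with a toy coin-flipping comparison (an illustrative sketch, not an example taken from the cited sources). Suppose the data <math>D</math> are one particular sequence of five flips containing three heads and two tails. A fair-coin model <math>M_1</math>, which has no adjustable parameters, assigns

<math display="block">P(D \mid M_1) = (1/2)^5 = 1/32 \approx 0.031,</math>

while a model <math>M_2</math> with an adjustable bias <math>\theta</math> under a uniform prior must spread its probability over every outcome it could accommodate:

<math display="block">P(D \mid M_2) = \int_0^1 \theta^3 (1-\theta)^2 \, d\theta = \frac{3!\,2!}{6!} = \frac{1}{60} \approx 0.017.</math>

Given equal prior probabilities for the two models, Bayes' theorem assigns <math>M_1</math> the higher posterior probability, even though <math>M_2</math> contains <math>M_1</math> as a special case: the flexibility of the extra parameter is penalized automatically by the [[marginal probability|marginal likelihood]].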
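The bias–variance side of the tradeoff can likewise be demonstrated with a short polynomial-fitting experiment. The following Python sketch is illustrative and is not drawn from the cited textbook; the quadratic ground truth, noise level, and degrees compared are all arbitrary choices made for the example.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ground truth: a quadratic observed with Gaussian noise.
def f(x):
    return 1.0 - 3.0 * x + 2.0 * x ** 2

x_train = np.linspace(-1, 1, 20)
x_test = np.linspace(-1, 1, 200)
y_train = f(x_train) + rng.normal(scale=0.1, size=x_train.size)
y_test = f(x_test) + rng.normal(scale=0.1, size=x_test.size)

# Degree 1 underfits (high bias), degree 10 overfits (high variance),
# degree 2 matches the complexity of the true curve.
for degree in (1, 2, 10):
    coeffs = np.polyfit(x_train, y_train, degree)  # least-squares fit
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: train MSE = {train_mse:.4f}, "
          f"test MSE = {test_mse:.4f}")
</syntaxhighlight>

Typically the degree-1 fit shows high error on both sets (underfitting), while the degree-10 fit drives training error down but raises held-out error (overfitting); the simplest model adequate to the data performs best on the held-out set.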