{{Short description|Principle in Bayesian statistics}}
{{other uses of|Maximum entropy}}
{{more footnotes|date=September 2008}}
{{Bayesian statistics}}
The '''principle of maximum entropy''' states that the [[probability distribution]] which best represents the current state of knowledge about a system is the one with largest [[Entropy (information theory)|entropy]], in the context of precisely stated prior data (such as a [[proposition]] that expresses [[#Testable information|testable information]]). Stated another way: take precisely stated prior data or testable information about a probability distribution function, and consider the set of all trial probability distributions that would encode that prior data. According to this principle, the distribution with maximal [[information entropy]] is the best choice.
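As a simple illustrative sketch (the unconstrained case, with no testable information beyond normalization), consider a discrete distribution <math>p_1, \dots, p_n</math> and maximize the entropy
:<math>H(p) = -\sum_{i=1}^{n} p_i \log p_i</math>
subject only to <math>\textstyle\sum_i p_i = 1</math>. Introducing a Lagrange multiplier <math>\lambda</math> and setting
:<math>\frac{\partial}{\partial p_i}\left[ -\sum_j p_j \log p_j + \lambda\left(\sum_j p_j - 1\right)\right] = -\log p_i - 1 + \lambda = 0</math>
gives <math>p_i = e^{\lambda - 1}</math>, the same value for every <math>i</math>, so the maximum-entropy choice is the uniform distribution <math>p_i = 1/n</math>. Additional testable information, such as a known mean, enters as further constraints and leads to exponential-family (Gibbs) distributions rather than the uniform one.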