===Compatibility with Bayes' theorem===

Giffin and Caticha (2007) state that [[Bayes' theorem]] and the principle of maximum entropy are completely compatible and can be seen as special cases of the "method of maximum relative entropy". They state that this method reproduces every aspect of orthodox Bayesian inference methods. In addition, the new method opens the door to tackling problems that could not be addressed by either the maximum entropy principle or orthodox Bayesian methods individually. Moreover, recent contributions (Lazar 2003, and Schennach 2005) show that frequentist relative-entropy-based inference approaches (such as [[empirical likelihood]] and [[exponentially tilted empirical likelihood]] – see e.g. Owen 2001 and Kitamura 2006) can be combined with prior information to perform Bayesian posterior analysis.

Jaynes stated that Bayes' theorem was a way to calculate a probability, while maximum entropy was a way to assign a prior probability distribution.<ref name=Jaynes1988/>

It is, however, possible in concept to solve for a posterior distribution directly from a stated prior distribution using the [[Cross-entropy|principle of minimum cross-entropy]] (the principle of maximum entropy being the special case in which a [[uniform distribution (discrete)|uniform distribution]] is taken as the given prior), independently of any Bayesian considerations, by treating the problem formally as a constrained optimisation problem with the entropy functional as the objective function. For the case of given average values as testable information (averaged over the sought-after probability distribution), the sought-after distribution is formally the [[Gibbs measure|Gibbs (or Boltzmann) distribution]], the parameters of which must be solved for in order to achieve minimum cross-entropy and satisfy the given testable information.
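As a minimal sketch of this last point (the notation here – a prior <math>q_i</math>, a constrained quantity <math>f</math>, and its prescribed mean <math>F</math> – is chosen for illustration and is not fixed by the sources above): minimising the relative entropy <math>\textstyle\sum_i p_i \ln(p_i/q_i)</math> subject to <math>\textstyle\sum_i p_i = 1</math> and <math>\textstyle\sum_i p_i f(x_i) = F</math> yields the Gibbs-form solution
<math display="block">p_i = \frac{q_i\, e^{-\lambda f(x_i)}}{\sum_j q_j\, e^{-\lambda f(x_j)}},</math>
where the Lagrange multiplier <math>\lambda</math> is determined by requiring the mean-value constraint <math>\textstyle\sum_i p_i f(x_i) = F</math> to hold; taking <math>q_i</math> uniform recovers the ordinary maximum-entropy distribution.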