===Entropy===
Entropy is a measure of the uncertainty or randomness in a probability distribution. For a Bernoulli random variable <math>X</math> with success probability <math>p</math> and failure probability <math>q = 1 - p</math>, the entropy <math>H(X)</math> (in nats) is defined as:

:<math>\begin{align}
H(X) &= \mathbb{E}\left[\ln \frac{1}{P(X)}\right] = -[P(X = 0) \ln P(X = 0) + P(X = 1) \ln P(X = 1)] \\
&= -(q \ln q + p \ln p), \quad q = P(X = 0),\ p = P(X = 1)
\end{align}</math>

The entropy is maximized when <math>p = 0.5</math>, indicating the highest level of uncertainty when both outcomes are equally likely. The entropy is zero when <math>p = 0</math> or <math>p = 1</math>, where one outcome is certain (using the convention <math>0 \ln 0 = 0</math>).
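The location of this maximum can be checked by differentiating <math>H</math> with respect to <math>p</math> and setting the derivative to zero:

:<math>\frac{dH}{dp} = -\ln p + \ln(1 - p) = \ln \frac{1 - p}{p} = 0 \quad \Longrightarrow \quad p = \tfrac{1}{2}, \qquad H\!\left(\tfrac{1}{2}\right) = \ln 2 \approx 0.693 \text{ nats.}</math>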