==Background==
Roughly, given a set <math>\mathbf{X}</math> of [[independent identically distributed]] data conditioned on an unknown parameter <math>\theta</math>, a sufficient statistic is a function <math>T(\mathbf{X})</math> whose value contains all the information needed to compute any estimate of the parameter (e.g. a [[maximum likelihood]] estimate).

Due to the factorization theorem ([[#Fisher–Neyman factorization theorem|see below]]), for a sufficient statistic <math>T(\mathbf{X})</math>, the probability density can be written as <math>f_{\mathbf{X}}(x;\theta) = h(x) \, g(\theta, T(x))</math>. From this factorization, it can easily be seen that the maximum likelihood estimate of <math>\theta</math> depends on <math>\mathbf{X}</math> only through <math>T(\mathbf{X})</math>; a worked instance of the factorization is given at the end of this section. Typically, the sufficient statistic is a simple function of the data, e.g. the sum of all the data points.

More generally, the "unknown parameter" may represent a [[Euclidean vector|vector]] of unknown quantities or may represent everything about the model that is unknown or not fully specified. In such a case, the sufficient statistic may be a set of functions, called a ''jointly sufficient statistic''. Typically, there are as many functions as there are parameters. For example, for a [[Gaussian distribution]] with unknown [[mean]] and [[variance]], the jointly sufficient statistic, from which maximum likelihood estimates of both parameters can be computed, consists of two functions: the sum of all data points and the sum of all squared data points (or equivalently, the [[sample mean]] and [[sample variance]]).

In other words, '''the [[joint probability distribution]] of the data is conditionally independent of the parameter given the value of the sufficient statistic for the parameter'''. Both the statistic and the underlying parameter can be vectors.
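As a simple illustration of the factorization, consider the standard Bernoulli case. Suppose <math>X_1, \ldots, X_n</math> are [[independent identically distributed|independent]] [[Bernoulli distribution|Bernoulli]] trials with success probability <math>\theta</math>. The joint probability mass function is

<math display="block">f_{\mathbf{X}}(x;\theta) = \prod_{i=1}^n \theta^{x_i}(1-\theta)^{1-x_i} = \theta^{T(x)}\,(1-\theta)^{\,n-T(x)}, \qquad T(x) = \sum_{i=1}^n x_i.</math>

Taking <math>h(x) = 1</math> and <math>g(\theta, T(x)) = \theta^{T(x)}(1-\theta)^{\,n-T(x)}</math> exhibits the Fisher–Neyman factorization, so the number of successes <math>T(\mathbf{X}) = \sum_{i=1}^n X_i</math> is sufficient for <math>\theta</math>: once its value is known, the individual outcomes carry no further information about <math>\theta</math>.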