===Efficiency===
{{Main|Efficiency (statistics)}}

The efficiency of an estimator measures how well it estimates the quantity of interest in a "minimum error" manner. In reality, there is no single explicit best estimator; there can only be a better estimator. Whether one estimator is more efficient than another depends on the choice of a particular [[loss function]], and efficiency is reflected by two naturally desirable properties of estimators: being unbiased, <math>\operatorname{E}(\widehat{\theta}) - \theta = 0</math>, and having minimal [[mean squared error]] (MSE), <math>\operatorname{E}[(\widehat{\theta} - \theta )^2]</math>. These cannot in general both be satisfied simultaneously: a biased estimator may have a lower mean squared error than any unbiased estimator (see [[estimator bias]]). The following equation relates the mean squared error to the estimator bias:<ref name=Dekker2005 />

: <math> \operatorname{E}[(\widehat{\theta} - \theta )^2] = (\operatorname{E}(\widehat{\theta}) - \theta)^2 + \operatorname{Var}(\widehat\theta)\ </math>

The left-hand side is the mean squared error; the first term on the right is the square of the estimator bias; and the second term on the right is the variance of the estimator. The quality of an estimator can therefore be judged by comparing variances, squared biases, or MSEs: a good estimator (good efficiency) has a smaller variance, a smaller squared bias, and a smaller MSE than a bad estimator (bad efficiency). Suppose there are two estimators, where <math>\widehat\theta_1</math> is the good estimator and <math>\widehat\theta_2</math> is the bad estimator. The above relationships can then be expressed by the following formulas:

: <math>\operatorname{Var}(\widehat\theta_1) < \operatorname{Var}(\widehat\theta_2)</math>
: <math>|\operatorname{E}(\widehat\theta_1) - \theta| < \left|\operatorname{E}(\widehat\theta_2) - \theta\right|</math>
: <math>\operatorname{MSE}(\widehat\theta_1) < \operatorname{MSE}(\widehat\theta_2)</math>

Besides using these formulas, the efficiency of an estimator can also be judged graphically. For an efficient estimator, a plot of frequency vs. value shows a curve with high frequency at the center and low frequency on the two sides. For example:

[[File:Good estimator.jpg|center|thumb]]

For an inefficient estimator, the frequency vs. value plot shows a relatively flatter, more spread-out curve.

[[File:Bad estimator.jpg|center|thumb]]

To put it simply, the good estimator has a narrow curve, while the bad estimator has a wide curve. Plotting the two curves on one graph with a shared ''y''-axis makes the difference more obvious.

[[File:The comparsion between a good and a bad estimator.jpg|center|thumb|Comparison between a good and a bad estimator.]]

Among unbiased estimators, there often exists one with the lowest variance, called the minimum variance unbiased estimator ([[MVUE]]). In some cases an unbiased [[efficiency (statistics)|efficient estimator]] exists, which, in addition to having the lowest variance among unbiased estimators, satisfies the [[Cramér–Rao bound]], which is an absolute lower bound on variance for statistics of a variable. Concerning such "best unbiased estimators", see also [[Cramér–Rao bound]], [[Gauss–Markov theorem]], [[Lehmann–Scheffé theorem]], [[Rao–Blackwell theorem]].
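The bias–variance relationship above can be checked numerically. The following is a minimal simulation sketch (an illustrative example, not part of the cited material), assuming NumPy is available. It compares two estimators of the variance of a normal population: the unbiased sample variance (dividing by ''n''&nbsp;&minus;&nbsp;1) and the biased maximum-likelihood version (dividing by ''n''). For each, the empirical MSE matches the squared bias plus the variance, and the biased estimator attains the smaller MSE.

<syntaxhighlight lang="python">
# Illustrative sketch: verify MSE = bias^2 + variance for two estimators
# of a normal population's variance (assumed setup, not from the article).
import numpy as np

rng = np.random.default_rng(0)
true_var = 4.0            # variance of the N(0, 2^2) population
n, trials = 10, 100_000   # sample size and number of simulated samples

samples = rng.normal(0.0, np.sqrt(true_var), size=(trials, n))
est_unbiased = samples.var(axis=1, ddof=1)   # divide by n - 1 (unbiased)
est_biased = samples.var(axis=1, ddof=0)     # divide by n (biased, lower MSE)

for name, est in [("unbiased (n-1)", est_unbiased), ("biased (n)", est_biased)]:
    bias = est.mean() - true_var
    var = est.var()                          # variance of the estimator
    mse = np.mean((est - true_var) ** 2)     # empirical mean squared error
    # mse is (up to floating point) exactly bias**2 + var
    print(f"{name:>15}: bias^2={bias**2:.4f}  var={var:.4f}  "
          f"bias^2+var={bias**2 + var:.4f}  mse={mse:.4f}")
</syntaxhighlight>

In this sketch the biased estimator trades a small squared bias for a larger reduction in variance and so ends up with the smaller MSE, illustrating why unbiasedness and minimal MSE cannot always be achieved together.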