Lehmann–Scheffé theorem
In statistics, the Lehmann–Scheffé theorem is a prominent statement tying together the ideas of completeness, sufficiency, uniqueness, and best unbiased estimation.<ref name=Casella/> The theorem states that any estimator that is unbiased for a given unknown quantity and that depends on the data only through a complete, sufficient statistic is the unique best unbiased estimator of that quantity. The theorem is named after Erich Leo Lehmann and Henry Scheffé, who introduced it in two early papers.<ref name=LS1/><ref name=LS2/>
If <math> T </math> is a complete sufficient statistic for <math> \theta </math> and <math>\operatorname{E}[g(T)]=\tau(\theta) </math>, then <math>g(T)</math> is the uniformly minimum-variance unbiased estimator (UMVUE) of <math>\tau(\theta)</math>.
Statement
Let <math>\vec{X}= (X_1, X_2, \dots, X_n)</math> be a random sample from a distribution that has p.d.f. (or p.m.f. in the discrete case) <math>f(x;\theta)</math>, where <math>\theta \in \Omega</math> is a parameter in the parameter space. Suppose <math>Y = u(\vec{X})</math> is a sufficient statistic for θ, and let <math>\{ f_Y(y;\theta): \theta \in \Omega\}</math> be a complete family. If <math>\varphi</math> is a function such that <math>\operatorname{E}[\varphi(Y)] = \theta</math> for all <math>\theta \in \Omega</math>, then <math>\varphi(Y)</math> is the unique MVUE of θ.
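As a standard textbook illustration (not drawn from the sources cited here), let <math>X_1, \dots, X_n</math> be independent Poisson(<math>\theta</math>) random variables. Then <math>Y = \sum_{i=1}^n X_i</math> is a complete sufficient statistic for <math>\theta</math>, and since
<math>\operatorname{E}\!\left[ \tfrac{Y}{n} \right] = \theta \text{ for all } \theta > 0,</math>
the sample mean <math>\varphi(Y) = Y/n = \bar{X}</math> is the unique MVUE of <math>\theta</math>.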
Proof
By the Rao–Blackwell theorem, if <math>Z</math> is an unbiased estimator of θ then <math>\varphi(Y):= \operatorname{E}[Z\mid Y]</math> defines an unbiased estimator of θ with the property that its variance is not greater than that of <math>Z</math>.
Now we show that this function is unique. Suppose <math>W</math> is another candidate MVUE of θ. Then again <math>\psi(Y):= \operatorname{E}[W\mid Y]</math> defines an unbiased estimator of θ with the property that its variance is not greater than that of <math>W</math>. Then
<math>\operatorname{E}[\varphi(Y) - \psi(Y)] = 0 \text{ for all } \theta \in \Omega. </math>
Since <math>\{ f_Y(y;\theta): \theta \in \Omega\}</math> is a complete family,
<math>\operatorname{E}[\varphi(Y) - \psi(Y)] = 0 \text{ for all } \theta \in \Omega \implies \varphi(y) - \psi(y) = 0 \text{ for almost all } y, </math>
and therefore <math>\varphi = \psi</math> almost everywhere: <math>\varphi</math> is the unique function of Y that is unbiased for θ, and its variance is not greater than that of any other unbiased estimator. We conclude that <math>\varphi(Y)</math> is the unique MVUE.
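To see the Rao–Blackwell step in a concrete case (continuing the Poisson illustration above, which is likewise not taken from the cited sources), start from the crude unbiased estimator <math>Z = X_1</math> and condition on <math>Y = \sum_{i=1}^n X_i</math>. By symmetry of the <math>X_i</math>,
<math>\varphi(Y) = \operatorname{E}[X_1 \mid Y] = \frac{Y}{n} = \bar{X},</math>
which is exactly the unique MVUE identified in the statement above.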
Example when using a non-complete minimal sufficient statistic
An example of an improvable Rao–Blackwell improvement, when using a minimal sufficient statistic that is not complete, was provided by Galili and Meilijson in 2016. Let <math>X_1, \ldots, X_n</math> be a random sample from a scale-uniform distribution <math>X \sim U ( (1-k) \theta, (1+k) \theta),</math> with unknown mean <math>\operatorname{E}[X]=\theta</math> and known design parameter <math>k \in (0,1)</math>. In the search for "best" possible unbiased estimators for <math>\theta</math>, it is natural to consider <math>X_1</math> as an initial (crude) unbiased estimator for <math>\theta</math> and then try to improve it. Since <math>X_1</math> is not a function of <math>T = \left( X_{(1)}, X_{(n)} \right)</math>, the minimal sufficient statistic for <math>\theta</math> (where <math>X_{(1)} = \min_i X_i </math> and <math>X_{(n)} = \max_i X_i </math>), it may be improved using the Rao–Blackwell theorem as follows:
<math>\hat{\theta}_{RB} = \operatorname{E}_\theta[X_1\mid X_{(1)}, X_{(n)}] = \frac{X_{(1)}+X_{(n)}} 2.</math>
However, the following unbiased estimator can be shown to have lower variance:
<math>\hat{\theta}_{LV} = \frac 1 {k^2\frac{n-1}{n+1}+1} \cdot \frac{(1-k)X_{(1)} + (1+k) X_{(n)}} 2.</math>
In fact, it can be improved even further when using the following estimator:
<math>\hat{\theta}_\text{BAYES}=\frac{n+1} n \left[1- \frac{\frac{X_{(1)} (1+k)}{X_{(n)} (1-k)}-1}{ \left (\frac{X_{(1)} (1+k)}{X_{(n)} (1-k)}\right )^{n+1} -1} \right] \frac{X_{(n)}}{1+k}.</math>
The model is a scale model, so optimal equivariant estimators can be derived for loss functions that are invariant.
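A minimal Monte Carlo sketch (an illustration only, assuming NumPy; the values of <math>\theta</math>, <math>k</math>, <math>n</math> and the number of replications are arbitrary choices, not taken from the paper by Galili and Meilijson) can be used to compare the empirical bias and mean squared error of the three estimators above:
<syntaxhighlight lang="python">
import numpy as np

# Monte Carlo comparison of the three estimators discussed above.
# theta, k, n and the number of replications are arbitrary illustrative choices.
rng = np.random.default_rng(0)
theta, k, n, reps = 1.0, 0.5, 10, 200_000

# Draw `reps` samples of size n from U((1-k)*theta, (1+k)*theta).
x = rng.uniform((1 - k) * theta, (1 + k) * theta, size=(reps, n))
x1, xn = x.min(axis=1), x.max(axis=1)   # X_(1) and X_(n)

# Rao-Blackwellized estimator (the midrange).
rb = (x1 + xn) / 2

# Unbiased estimator with lower variance, as given above.
lv = ((1 - k) * x1 + (1 + k) * xn) / 2 / (k**2 * (n - 1) / (n + 1) + 1)

# Further-improved estimator, as given above.
r = (x1 * (1 + k)) / (xn * (1 - k))
bayes = (n + 1) / n * (1 - (r - 1) / (r ** (n + 1) - 1)) * xn / (1 + k)

for name, est in [("RB", rb), ("LV", lv), ("BAYES", bayes)]:
    bias = est.mean() - theta
    mse = np.mean((est - theta) ** 2)
    print(f"{name:6s} bias={bias:+.5f}  MSE={mse:.6f}")
</syntaxhighlight>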