Consistent estimator
== Establishing consistency ==

The notion of asymptotic consistency is very close, almost synonymous with the notion of convergence in probability. As such, any theorem, lemma, or property which establishes convergence in probability may be used to prove consistency. Many such tools exist:

* To demonstrate consistency directly from the definition one can use the inequality{{sfn|Amemiya|1985|loc=equation (3.2.5)}}
:: <math> \Pr\!\big[h(T_n-\theta)\geq h(\varepsilon)\big] \leq \frac{\operatorname{E}\big[h(T_n-\theta)\big]}{h(\varepsilon)}, </math>
: the most common choice for the function ''h'' being either the absolute value (in which case it is known as the [[Markov inequality]]) or the quadratic function (respectively [[Chebyshev's inequality]]).

* Another useful result is the [[continuous mapping theorem]]: if ''T<sub>n</sub>'' is consistent for ''θ'' and ''g''(·) is a real-valued function continuous at the point ''θ'', then ''g''(''T<sub>n</sub>'') will be consistent for ''g''(''θ''):{{sfn|Amemiya|1985|loc=Theorem 3.2.6}}
:: <math> T_n\ \xrightarrow{p}\ \theta \quad\Rightarrow\quad g(T_n)\ \xrightarrow{p}\ g(\theta) </math>

* [[Slutsky's theorem]] can be used to combine several different estimators, or an estimator with a non-random convergent sequence. If <math>T_n\ \xrightarrow{d}\ \alpha</math> and <math>S_n\ \xrightarrow{p}\ \beta</math>, then{{sfn|Amemiya|1985|loc=Theorem 3.2.7}}
:: <math>\begin{align}
  & T_n + S_n \ \xrightarrow{d}\ \alpha+\beta, \\
  & T_n S_n \ \xrightarrow{d}\ \alpha\beta, \\
  & T_n / S_n \ \xrightarrow{d}\ \alpha/\beta, \text{ provided that } \beta\neq0
\end{align}</math>

* If the estimator ''T<sub>n</sub>'' is given by an explicit formula, then most likely the formula will employ sums of random variables, and then the [[law of large numbers]] can be used: for a sequence {''X<sub>n</sub>''} of random variables and under suitable conditions,
:: <math>\frac{1}{n}\sum_{i=1}^n g(X_i) \ \xrightarrow{p}\ \operatorname{E}[\,g(X)\,]</math>

* If the estimator ''T<sub>n</sub>'' is defined implicitly, for example as the value that maximizes a certain objective function (see [[extremum estimator]]), then a more complicated argument involving [[stochastic equicontinuity]] has to be used.{{sfn|Newey|McFadden|1994|loc=Chapter 2}}
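The first two tools above can also be checked numerically. The sketch below (not part of the article) simulates the sample mean of i.i.d. draws and estimates the deviation probability Pr[|''T<sub>n</sub>'' − ''θ''| ≥ ''ε''] at increasing sample sizes, and likewise for ''g''(''T<sub>n</sub>'') with ''g''(''x'') = ''x''², illustrating the continuous mapping theorem. The exponential distribution, the tolerance ''ε'', and the sample sizes are arbitrary choices made only for this demonstration.

```python
import numpy as np

def deviation_prob(estimates, target, eps):
    """Monte Carlo estimate of Pr[|estimate - target| >= eps]."""
    return float(np.mean(np.abs(estimates - target) >= eps))

rng = np.random.default_rng(0)
theta = 2.0    # true mean of the underlying distribution (assumed for the demo)
eps = 0.1      # tolerance in the convergence-in-probability definition
reps = 1000    # Monte Carlo replications per sample size

probs_mean, probs_g = [], []
for n in (10, 100, 1000, 10000):
    x = rng.exponential(scale=theta, size=(reps, n))
    t_n = x.mean(axis=1)                                    # estimator T_n = sample mean
    probs_mean.append(deviation_prob(t_n, theta, eps))      # Pr[|T_n - theta| >= eps]
    probs_g.append(deviation_prob(t_n**2, theta**2, eps))   # same for g(T_n), g(x) = x^2

print(probs_mean)  # shrinks toward 0 as n grows: T_n is consistent for theta
print(probs_g)     # also shrinks: g(T_n) is consistent for g(theta) by continuous mapping
```

Both sequences of probabilities decay toward zero as ''n'' grows; the second decays more slowly here because ''g'' magnifies deviations near ''θ'' = 2.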