{{Short description|Statistical parameter needed for a model but not of primary interest}}
In [[statistics]], a '''nuisance parameter''' is any [[parameter]] which is unspecified<ref>{{Cite book |last1=Wackerly |first1=Dennis |url=https://books.google.com/books?id=lTgGAAAAQBAJ&dq=wackerly+math+stats&pg=PP1 |title=Mathematical Statistics with Applications |last2=Mendenhall |first2=William |last3=Scheaffer |first3=Richard L. |date=2014-10-27 |publisher=Cengage Learning |isbn=978-1-111-79878-9 |language=en}}</ref> but which must be accounted for when testing hypotheses about the parameters that are of interest.

The classic example of a nuisance parameter comes from the [[normal distribution]], a member of the [[location–scale family]]. In tests involving one or more normal distributions, the [[variance]](s) ''σ''<sup>2</sup> are often unspecified and unknown, yet the hypotheses of interest concern the mean(s). Another example is [[linear regression]] with unknown variance in the [[explanatory variable]] (the independent variable): that variance is a nuisance parameter which must be accounted for in order to derive an accurate [[interval estimate]] of the [[regression slope]], calculate [[p-values]], and test hypotheses about the slope's value; see [[regression dilution]].

Nuisance parameters are often [[scale parameter|scale parameters]], but not always; for example, in [[errors-in-variables models]] the unknown true location of each observation is a nuisance parameter. A parameter may also cease to be a "nuisance" if it becomes the object of study, or if it is estimated from data or otherwise known.

==Theoretical statistics==
The general treatment of nuisance parameters can be broadly similar between frequentist and Bayesian approaches to theoretical statistics. It relies on an attempt to partition the [[likelihood function]] into components representing information about the parameters of interest and information about the other (nuisance) parameters. This can involve ideas about [[Sufficiency (statistics)|sufficient statistics]] and [[ancillary statistic]]s. When this partition can be achieved, it may be possible to complete a Bayesian analysis for the parameters of interest by determining their joint posterior distribution algebraically. The partition also allows frequentist theory to develop general estimation approaches in the presence of nuisance parameters. If the partition cannot be achieved, it may still be possible to make use of an approximate partition.

In some special cases it is possible to formulate methods that circumvent the presence of nuisance parameters. The [[t-test]] provides a practically useful test because its test statistic does not depend on the unknown variance, only on the sample variance; it is a case where use can be made of a [[pivotal quantity]]. In other cases, however, no such circumvention is known.

==Practical statistics==
Practical approaches to statistical analysis treat nuisance parameters somewhat differently in frequentist and Bayesian methodologies.

A general approach in a frequentist analysis can be based on maximum [[likelihood-ratio test]]s. These provide both [[significance test]]s and [[confidence interval]]s for the parameters of interest which are approximately valid for moderate to large sample sizes and which take account of the presence of nuisance parameters. See [[Debabrata Basu|Basu]] (1977) for some general discussion and Spall and Garner (1990) for some discussion relative to the identification of parameters in linear dynamic (i.e., [[state space representation]]) models.
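The following minimal sketch (in Python, using NumPy and SciPy) illustrates the simplest case of the approach described above: testing a normal mean when the variance is an unknown nuisance parameter. The synthetic data, sample size, random seed and hypothesised mean ''μ''<sub>0</sub> are arbitrary choices made purely for illustration. The variance is "profiled out" by replacing it, for each candidate mean, with the value that maximises the likelihood; the resulting likelihood-ratio statistic is then compared with the one-sample ''t'' statistic, the pivotal-quantity route mentioned in the previous section.

<syntaxhighlight lang="python">
# Sketch: eliminating an unknown variance (a nuisance parameter)
# when testing H0: mu = mu0 for normally distributed data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.normal(loc=0.3, scale=2.0, size=25)   # true sigma is unknown to the analyst
n = x.size
mu0 = 0.0                                     # hypothesised mean (illustrative choice)

def profile_loglik(mu, x):
    """Normal log-likelihood with sigma^2 replaced by its MLE for the given mu."""
    s2_hat = np.mean((x - mu) ** 2)           # MLE of the nuisance parameter at this mu
    return -0.5 * x.size * (np.log(2 * np.pi * s2_hat) + 1)

# Likelihood-ratio statistic based on the profile likelihood
lr = 2 * (profile_loglik(x.mean(), x) - profile_loglik(mu0, x))

# The same hypothesis handled via a pivotal quantity: the one-sample t statistic,
# whose null distribution does not involve the unknown sigma.
t = (x.mean() - mu0) / (x.std(ddof=1) / np.sqrt(n))

# Algebraically, lr = n * log(1 + t^2 / (n - 1)), so the two routes agree.
print(lr, n * np.log(1 + t**2 / (n - 1)))
print("p-value (t-test):", 2 * stats.t.sf(abs(t), df=n - 1))
print("p-value (LR, chi-square approx.):", stats.chi2.sf(lr, df=1))
</syntaxhighlight>

In this example the ''t''-test is exact for any sample size, while the chi-squared calibration of the likelihood-ratio statistic is the large-sample approximation referred to above; the printed values show that, once the variance has been profiled out, the two statistics carry the same information about the mean.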
In [[Bayesian analysis]], a generally applicable approach creates random samples from the joint posterior distribution of all the parameters: see [[Markov chain Monte Carlo]]. Given these, the joint distribution of only the parameters of interest can be readily found by [[marginalization (probability)|marginalizing]] over the nuisance parameters. However, this approach may not always be computationally efficient if some or all of the nuisance parameters can be eliminated on a theoretical basis.

==See also==
* [[Adaptive estimator]]
* [[Likelihood function#Profile likelihood|Profile likelihood]]

==References==
{{Reflist}}
* Basu, D. (1977), "On the Elimination of Nuisance Parameters", ''Journal of the American Statistical Association'', vol. 72, pp. 355–366. {{doi|10.1080/01621459.1977.10481002}}
* Bernardo, J. M., Smith, A. F. M. (2000) ''Bayesian Theory''. Wiley. {{isbn|0-471-49464-X}}
* Cox, D. R., Hinkley, D. V. (1974) ''Theoretical Statistics''. Chapman and Hall. {{isbn|0-412-12420-3}}
* Spall, J. C. and Garner, J. P. (1990), "Parameter Identification for State-Space Models with Nuisance Parameters", ''IEEE Transactions on Aerospace and Electronic Systems'', vol. 26(6), pp. 992–998.
* Young, G. A., Smith, R. L. (2005) ''Essentials of Statistical Inference''. Cambridge University Press. {{isbn|0-521-83971-8}}

[[Category:Estimation theory]]