=====Error=====
Working from a [[null hypothesis]], two broad categories of error are recognized:
* [[Type I and type II errors#Type I error|Type I errors]], where the null hypothesis is falsely rejected, giving a "false positive".
* [[Type I and type II errors#Type II error|Type II errors]], where the null hypothesis fails to be rejected and an actual difference between populations is missed, giving a "false negative".

[[Standard deviation]] refers to the extent to which individual observations in a sample differ from a central value, such as the sample or population mean, while [[Standard error (statistics)#Standard error of the mean|standard error]] refers to an estimate of the difference between the sample mean and the population mean. A [[Errors and residuals in statistics#Introduction|statistical error]] is the amount by which an observation differs from its [[expected value]]; a [[Errors and residuals in statistics#Introduction|residual]] is the amount by which an observation differs from the value the estimator of the expected value assumes on a given sample (also called a prediction). [[Mean squared error]] is used for obtaining [[efficient estimators]], a widely used class of estimators. [[Root mean square error]] is simply the square root of the mean squared error; both are illustrated in the first sketch below.

[[File:Linear least squares(2).svg|thumb|right|A least squares fit: in red the points to be fitted, in blue the fitted line.]]
Many statistical methods seek to minimize the [[residual sum of squares]], and these are called "[[least squares|methods of least squares]]", in contrast to [[least absolute deviations]]. The latter gives equal weight to small and large errors, while the former gives more weight to large errors. The residual sum of squares is also [[Differentiable function|differentiable]], which provides a handy property for doing [[regression analysis|regression]]. Least squares applied to [[linear regression]] is called the [[ordinary least squares]] method, and least squares applied to [[nonlinear regression]] is called [[non-linear least squares]]. In a linear regression model, the non-deterministic part of the model is called the error term, disturbance, or simply noise (the second sketch below fits such a model). Both linear regression and non-linear regression are addressed in [[polynomial least squares]], which also describes the variance in a prediction of the dependent variable (''y'' axis) as a function of the independent variable (''x'' axis) and the deviations (errors, noise, disturbances) from the estimated (fitted) curve.

Measurement processes that generate statistical data are also subject to error. Many of these errors are classified as [[Random error|random]] (noise) or [[Systematic error|systematic]] ([[bias]]), but other types of error (e.g., blunders, such as when an analyst reports incorrect units) can also be important. The presence of [[missing data]] or [[censoring (statistics)|censoring]] may result in [[bias (statistics)|biased estimates]], and specific techniques have been developed to address these problems.<ref>Little, Roderick J. A.; Rubin, Donald B. (2002). ''Statistical Analysis with Missing Data''. New York: Wiley.</ref>
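The quantities above can be made concrete with a short computation. The following is a minimal sketch in Python with NumPy; the data values and the variable names (<code>sample</code>, <code>observed</code>, <code>predicted</code>) are invented for illustration, not taken from any particular study.

<syntaxhighlight lang="python">
import numpy as np

# Hypothetical sample of n = 8 observations (illustrative values only).
sample = np.array([4.1, 5.0, 3.8, 4.6, 5.3, 4.4, 4.9, 4.7])
n = sample.size

# Standard deviation: spread of individual observations around the
# sample mean (ddof=1 gives the unbiased sample estimate).
sd = sample.std(ddof=1)

# Standard error of the mean: estimated deviation of the sample mean
# from the population mean; it shrinks as n grows.
sem = sd / np.sqrt(n)

# Mean squared error and root mean squared error of a set of
# predictions against observed values (again, illustrative numbers).
observed = np.array([2.0, 3.1, 3.9, 5.2])
predicted = np.array([2.2, 2.9, 4.1, 4.8])
mse = np.mean((observed - predicted) ** 2)
rmse = np.sqrt(mse)

print(f"sd={sd:.3f}  sem={sem:.3f}  mse={mse:.3f}  rmse={rmse:.3f}")
</syntaxhighlight>

Note how the standard error depends on the sample size while the standard deviation does not: averaging more observations narrows the estimate of the mean without changing the spread of the individual observations.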
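Likewise, the least squares fit shown in the figure can be sketched numerically. This is a minimal illustration of ordinary least squares using NumPy's <code>polyfit</code>, which solves the underlying normal equations; the (x, y) data are made up for the example.

<syntaxhighlight lang="python">
import numpy as np

# Hypothetical (x, y) data roughly following y = 2x + 1 plus noise.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.2, 2.8, 5.1, 6.9, 9.2, 10.8])

# Ordinary least squares: choose the slope and intercept that
# minimize the residual sum of squares.
slope, intercept = np.polyfit(x, y, deg=1)

fitted = slope * x + intercept
residuals = y - fitted              # observation minus fitted value
rss = np.sum(residuals ** 2)        # the quantity least squares minimizes

print(f"slope={slope:.3f}  intercept={intercept:.3f}  RSS={rss:.4f}")
</syntaxhighlight>

Because the residuals are squared before summing, one large residual raises the RSS far more than several small ones, which is exactly the weighting difference between least squares and least absolute deviations described above.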