{{Short description|Type of statistical model}}
{{Distinguish|linear model of innovation}}
In [[statistics]], the term '''linear model''' refers to any model which assumes [[linearity]] in the system. The most common occurrence is in connection with regression models, and the term is often taken as synonymous with [[linear regression]] model. However, the term is also used in [[time series analysis]] with a different meaning. In each case, the designation "linear" is used to identify a subclass of models for which substantial reduction in the complexity of the related [[statistical theory]] is possible.

==Linear regression models==
{{main|Linear regression}}
For the regression case, the [[statistical model]] is as follows. Given a (random) sample <math> (Y_i, X_{i1}, \ldots, X_{ip}), \, i = 1, \ldots, n</math>, the relation between the observations <math>Y_i</math> and the [[independent variables]] <math>X_{ij}</math> is formulated as

:<math>Y_i = \beta_0 + \beta_1 \phi_1(X_{i1}) + \cdots + \beta_p \phi_p(X_{ip}) + \varepsilon_i \qquad i = 1, \ldots, n ,</math>

where <math> \phi_1, \ldots, \phi_p </math> may be [[Nonlinear system|nonlinear]] functions. In the above, the quantities <math>\varepsilon_i</math> are [[random variable]]s representing errors in the relationship. The "linear" part of the designation relates to the appearance of the [[regression coefficient]]s <math>\beta_j</math> in a linear way in the above relationship. Alternatively, one may say that the predicted values corresponding to the above model, namely

:<math>\hat{Y}_i = \beta_0 + \beta_1 \phi_1(X_{i1}) + \cdots + \beta_p \phi_p(X_{ip}) \qquad (i = 1, \ldots, n), </math>

are linear functions of the <math>\beta_j</math>.

Given that estimation is undertaken on the basis of a [[least squares]] analysis, estimates of the unknown parameters <math>\beta_j</math> are determined by minimising the sum-of-squares function

:<math>S = \sum_{i = 1}^n \varepsilon_i^2 = \sum_{i = 1}^n \left(Y_i - \beta_0 - \beta_1 \phi_1(X_{i1}) - \cdots - \beta_p \phi_p(X_{ip})\right)^2 .</math>

From this, it can readily be seen that the "linear" aspect of the model means the following:
:*the function to be minimised is a quadratic function of the <math>\beta_j</math>, for which minimisation is a relatively simple problem;
:*the derivatives of the function are linear functions of the <math>\beta_j</math>, making it easy to find the minimising values;
:*the minimising values <math>\beta_j</math> are linear functions of the observations <math>Y_i</math>;
:*the minimising values <math>\beta_j</math> are linear functions of the random errors <math>\varepsilon_i</math>, which makes it relatively easy to determine the statistical properties of the estimated values of <math>\beta_j</math>.
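A minimal Python sketch of such a least-squares fit, assuming example basis functions <math>\phi_1(x) = x</math> and <math>\phi_2(x) = x^2</math> and simulated data chosen only for illustration:

<syntaxhighlight lang="python">
import numpy as np

# Simulated example data (illustrative values only).
rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, size=50)
y = 1.0 + 2.0 * x - 0.5 * x**2 + rng.normal(scale=0.3, size=50)

# Design matrix: an intercept column plus the basis functions
# phi_1(x) = x and phi_2(x) = x**2. The model remains linear in the betas
# even though phi_2 is a nonlinear function of x.
X = np.column_stack([np.ones_like(x), x, x**2])

# The least-squares estimates minimise S, the sum of squared residuals.
beta_hat, residual_ss, rank, _ = np.linalg.lstsq(X, y, rcond=None)

y_hat = X @ beta_hat   # fitted values, linear in beta_hat
print(beta_hat)        # estimates of beta_0, beta_1, beta_2
</syntaxhighlight>

Because the fitted values are linear in the estimated coefficients, the same computation applies for any choice of (possibly nonlinear) basis functions <math>\phi_j</math>.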
==Time series models==
An example of a linear time series model is an [[autoregressive moving average model]]. Here the model for values {<math>X_t</math>} in a time series can be written in the form

:<math> X_t = c + \varepsilon_t + \sum_{i=1}^p \phi_i X_{t-i} + \sum_{i=1}^q \theta_i \varepsilon_{t-i} ,\,</math>

where again the quantities <math>\varepsilon_t</math> are random variables representing [[Innovation (signal processing)|innovations]], which are new random effects that appear at a certain time but also affect values of <math>X</math> at later times. In this instance the use of the term "linear model" refers to the structure of the above relationship in representing <math>X_t</math> as a linear function of past values of the same time series and of current and past values of the innovations.<ref>Priestley, M. B. (1988). ''Non-linear and Non-stationary Time Series Analysis''. Academic Press. {{ISBN|0-12-564911-8}}</ref> This particular aspect of the structure means that it is relatively simple to derive relations for the mean and [[covariance]] properties of the time series. Note that here the "linear" part of the term "linear model" does not refer to the coefficients <math>\phi_i</math> and <math>\theta_i</math>, as it would in the case of a regression model, which looks structurally similar.

==Other uses in statistics==
There are some other instances where "nonlinear model" is used to contrast with a linearly structured model, although the term "linear model" is not usually applied. One example of this is [[nonlinear dimensionality reduction]].

==See also==
* [[General linear model]]
* [[Generalized linear model]]
* [[Linear predictor function]]
* [[Linear system]]
* [[Linear regression]]
* [[Statistical model]]

==References==
{{Reflist}}

{{Statistics}}
{{Authority control}}

[[Category:Curve fitting]]
[[Category:Regression models]]

[[ar:نموذج الانحدار الخطي]]
[[fr:Modèle linéaire]]
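For the autoregressive moving average model described above, a minimal Python sketch that simulates an example ARMA(2, 1) series; the coefficient values and names are chosen only for this illustration:

<syntaxhighlight lang="python">
import numpy as np

# Example ARMA(p, q) coefficients (illustrative values only).
c = 0.5
phi = np.array([0.6, -0.2])   # autoregressive coefficients phi_1, phi_2
theta = np.array([0.4])       # moving-average coefficient theta_1
p, q = len(phi), len(theta)

rng = np.random.default_rng(0)
n = 200
m = max(p, q)                          # initial values needed to start the recursion
eps = rng.normal(size=n + m)           # innovations epsilon_t
x = np.zeros(n + m)

# Each X_t is a linear function of past X values and of the current
# and past innovations, which is what makes the model "linear" here.
for t in range(m, n + m):
    x[t] = (c
            + eps[t]
            + phi @ x[t - p:t][::-1]        # sum_i phi_i * X_{t-i}
            + theta @ eps[t - q:t][::-1])   # sum_i theta_i * eps_{t-i}

series = x[m:]   # drop the initial values used to start the recursion
</syntaxhighlight>

The loop makes the linear structure explicit: each <math>X_t</math> is a weighted sum of past values and innovations, which is why the mean and covariance properties of the series are straightforward to derive.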