{{Short description|Statistical model used in time series analysis}}
{{Redirect-distinguish|ARIMA|Arimaa}}
In [[time series analysis]], as used in [[statistics]] and [[econometrics]], '''autoregressive integrated moving average''' ('''ARIMA''') and '''seasonal ARIMA''' ('''SARIMA''') [[Mathematical model|models]] are generalizations of the [[autoregressive moving average]] (ARMA) model to non-stationary series and to periodic variation, respectively. All these models are fitted to [[time series]] in order to better understand the series and to predict its future values. The purpose of these generalizations is to fit the data as well as possible. Specifically, ARMA assumes that the series is [[Stationary process|stationary]], that is, its expected value is constant in time. If instead the series has a trend (but a constant variance/[[autocovariance]]), the trend is removed by "differencing",<ref>For further information on Stationarity and Differencing see https://www.otexts.org/fpp/8/1</ref> leaving a stationary series. This operation generalizes ARMA and corresponds to the "[[Order of integration|integrated]]" part of ARIMA. Analogously, periodic variation is removed by "seasonal differencing".<ref name=":1">{{cite book |last1=Hyndman |first1=Rob J |contribution-url=https://www.otexts.org/fpp/8/9 |contribution=8.9 Seasonal ARIMA models |last2=Athanasopoulos |first2=George |title=Forecasting: principles and practice |publisher=oTexts |access-date=19 May 2015}}</ref>

== Components ==
As in ARMA, the "autoregressive" ({{serif|AR}}) part of ARIMA indicates that the evolving variable of interest is [[linear regression|regressed]] on its prior values. The "moving average" ({{serif|MA}}) part indicates that the [[errors and residuals in statistics|regression error]] is a [[linear combination]] of error terms whose values occurred contemporaneously and at various times in the past.<ref>{{Cite book |last=Box |first=George E. P. |title=Time Series Analysis: Forecasting and Control |publisher=WILEY |year=2015 |isbn=978-1-118-67502-1}}</ref> The "integrated" ({{serif|I}}) part indicates that the data values have been replaced with the difference between each value and the previous value.

According to [[Wold's decomposition theorem]],<ref>{{Cite book |last=Hamilton |first=James |title=Time Series Analysis |publisher=Princeton University Press |year=1994 |isbn=9780691042893}}</ref><ref name=":5">{{Cite book |last=Papoulis |first=Athanasios |title=Probability, Random Variables, and Stochastic processes |publisher=Tata McGraw-Hill Education |year=2002}}</ref><ref name=":4">{{Cite web |last=Triacca |first=Umberto |date=19 Feb 2021 |title=The Wold Decomposition Theorem |url=http://www.phdeconomics.sssup.it/documents/Lesson11.pdf |url-status=live |archive-url=https://web.archive.org/web/20160327172444/http://www.phdeconomics.sssup.it:80/documents/Lesson11.pdf |archive-date=2016-03-27}}</ref> the ARMA model is sufficient to describe a '''regular''' (a.k.a. purely nondeterministic<ref name=":4" />) [[wide-sense stationary]] time series. This motivates making a non-stationary time series stationary, e.g. by differencing, before an ARMA model can be applied.<ref name=":2">{{cite arXiv|last1=Wang|first1=Shixiong|last2=Li|first2=Chongshou|last3=Lim|first3=Andrew|date=2019-12-18|title=Why Are the ARIMA and SARIMA not Sufficient|class=stat.AP|eprint=1904.07632}}</ref> If the time series contains a '''predictable''' sub-process (a.k.a. a pure sine or complex-valued exponential process<ref name=":5" />), the predictable component is treated as a non-zero-mean but periodic (i.e., seasonal) component in the ARIMA framework, so that it is eliminated by seasonal differencing.

==Mathematical formulation==
Non-seasonal ARIMA models are usually denoted ARIMA(''p'', ''d'', ''q''), where the [[parameter]]s ''p'', ''d'', ''q'' are non-negative integers: ''p'' is the order (number of time lags) of the [[autoregressive model]], ''d'' is the degree of differencing (the number of times the data have had past values subtracted), and ''q'' is the order of the [[moving-average model]]. Seasonal ARIMA models are usually denoted ARIMA(''p'', ''d'', ''q'')(''P'', ''D'', ''Q'')<sub>''m''</sub>, where the uppercase ''P'', ''D'', ''Q'' are the autoregressive, differencing, and moving average terms for the seasonal part of the ARIMA model and ''m'' is the number of periods in each season.<ref>{{cite web |title=Notation for ARIMA Models |url=https://support.sas.com/documentation/cdl/en/etsug/63939/HTML/default/viewer.htm#etsug_tffordet_sect016.htm |access-date=19 May 2015 |website=Time Series Forecasting System |publisher=SAS Institute}}</ref><ref name=":1" />

When two of the parameters are 0, the model may be referred to based on the non-zero parameter, dropping "{{serif|AR}}", "{{serif|I}}" or "{{serif|MA}}" from the acronym. For example, {{tmath|\text{ARIMA} (1,0,0)}} is {{math|AR(1)}}, {{tmath|\text{ARIMA}(0,1,0)}} is {{math|I(1)}}, and {{tmath|\text{ARIMA}(0,0,1)}} is {{math|MA(1)}}.

Given time series data ''X''<sub>''t''</sub>, where ''t'' is an integer index and the ''X''<sub>''t''</sub> are real numbers, an <math>\text{ARMA}(p',q)</math> model is given by

:<math>X_t-\alpha_1X_{t-1}- \dots -\alpha_{p'}X_{t-p'} = \varepsilon_t + \theta_1 \varepsilon_{t-1} + \cdots +\theta_q \varepsilon_{t-q},</math>

or equivalently by

:<math> \left( 1 - \sum_{i=1}^{p'} \alpha_i L^i \right) X_t = \left( 1 + \sum_{i=1}^q \theta_i L^i \right) \varepsilon_t \, </math>

where <math>L</math> is the [[lag operator]], the <math>\alpha_i</math> are the parameters of the autoregressive part of the model, the <math>\theta_i</math> are the parameters of the moving average part and the <math>\varepsilon_t</math> are error terms. The error terms <math>\varepsilon_t</math> are generally assumed to be [[Independent and identically-distributed random variables|independent, identically distributed]] variables sampled from a [[normal distribution]] with zero mean.

If the polynomial <math>\textstyle \left( 1 - \sum_{i=1}^{p'} \alpha_i L^i \right)</math> has a [[unit root]] (a factor <math>(1-L)</math>) of multiplicity ''d'', then it can be rewritten as:

:<math> \left( 1 - \sum_{i=1}^{p'} \alpha_i L^i \right) = \left( 1 - \sum_{i=1}^{p'-d} \varphi_i L^i \right) \left( 1 - L \right)^d. </math>

An ARIMA(''p'', ''d'', ''q'') process expresses this polynomial factorisation property with ''p'' = ''p′'' − ''d'', and is given by:

:<math> \left( 1 - \sum_{i=1}^p \varphi_i L^i \right) (1-L)^d X_t = \left( 1 + \sum_{i=1}^q \theta_i L^i \right) \varepsilon_t \, </math>

and so is a special case of an ARMA(''p'' + ''d'', ''q'') process whose autoregressive polynomial has ''d'' unit roots. (This is why no process that is accurately described by an ARIMA model with ''d'' > 0 is [[wide-sense stationary]].)

The above can be generalized as follows.

:<math> \left( 1 - \sum_{i=1}^p \varphi_i L^i\right) (1-L)^d X_t = \delta + \left( 1 + \sum_{i=1}^q \theta_i L^i \right) \varepsilon_t \,. </math>

This defines an ARIMA(''p'', ''d'', ''q'') process with '''drift''' <math> \frac{\delta}{1 - \sum \varphi_i} </math>.
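The factorisation above means that fitting an ARIMA(''p'', ''d'', ''q'') model can be viewed as differencing the series ''d'' times and fitting an ARMA(''p'', ''q'') model to the result. As an illustrative sketch only (the orders, coefficients, sample size and random seed below are arbitrary example choices, and the <code>statsmodels</code> Python package used here is just one of the implementations listed in the Software implementations section below), the following fragment simulates an ARIMA(1,1,1) process and fits the same model back to the simulated data:

<syntaxhighlight lang="python">
# Illustrative sketch: simulate an ARIMA(1,1,1) series and fit it back.
# Orders, coefficients, sample size and seed are arbitrary example values.
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)

# ARMA(1,1) on the differenced scale: (1 - 0.6 L) Y_t = (1 + 0.3 L) eps_t
ar = np.array([1.0, -0.6])   # lag polynomial 1 - 0.6 L, i.e. phi_1 = 0.6
ma = np.array([1.0, 0.3])    # lag polynomial 1 + 0.3 L, i.e. theta_1 = 0.3
y = ArmaProcess(ar, ma).generate_sample(nsample=500, distrvs=rng.standard_normal)

# Integrate once (d = 1) to obtain a realisation of an ARIMA(1,1,1) process
x = np.cumsum(y)

fit = ARIMA(x, order=(1, 1, 1)).fit()
print(fit.summary())   # estimated phi_1 and theta_1 should be near 0.6 and 0.3
</syntaxhighlight>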
==Other special forms==
The explicit identification of the factorization of the autoregressive polynomial into factors as above can be extended to other cases, firstly to apply to the moving average polynomial and secondly to include other special factors. For example, having a factor <math>( 1 - L^s)</math> in a model is one way of including a non-stationary seasonality of period ''s'' into the model; this factor has the effect of re-expressing the data as changes from ''s'' periods ago. Another example is the factor <math>\left( 1 -\sqrt{3} L + L^2 \right)</math>, which includes a (non-stationary) seasonality of period 2.{{clarify|date=January 2013}} The effect of the first type of factor is to allow each season's value to drift separately over time, whereas with the second type values for adjacent seasons move together.{{clarify|date=January 2013}}

Identification and specification of appropriate factors in an ARIMA model can be an important step in modeling, as it can allow a reduction in the overall number of parameters to be estimated while allowing the imposition on the model of types of behavior that logic and experience suggest should be there.

==Differencing==
{{Redirect|Differencing||Difference (disambiguation){{!}}Difference}}
The properties of a stationary time series do not change over time. Specifically, for a [[wide-sense stationary]] time series, the mean and the variance/[[autocovariance]] are constant over time. '''Differencing''' in statistics is a transformation applied to a non-stationary time series in order to make it [[trend stationary]] (i.e., stationary {{em|in the mean sense}}) by removing the trend or non-constant mean. However, it does not affect the non-stationarity of the variance or [[autocovariance]]. Likewise, ''seasonal differencing'' or ''[[deseasonalization]]'' is applied to a time series to remove the seasonal component.

From the perspective of signal processing, especially [[Fourier analysis|Fourier spectral analysis]] theory, the trend is the low-frequency part of the spectrum of a series, while the seasonal component is a periodic-frequency part. Differencing therefore acts as a [[High-pass filter|high-pass]] (that is, low-stop) filter and seasonal differencing as a [[comb filter]], suppressing the low-frequency trend and the periodic-frequency seasonal component, respectively, in the spectral domain (rather than directly in the time domain).<ref name=":2" />

To difference the data, the difference between consecutive observations is computed. Mathematically, this is shown as

:<math> y_t'= y_t - y_{t-1} \, </math>

It may be necessary to difference the data a second time to obtain a stationary time series, which is referred to as '''second-order differencing''':

:<math>
\begin{align}
y_t^* & = y_t' - y_{t-1}' \\
      & = (y_t - y_{t-1})-(y_{t-1} - y_{t-2}) \\
      & = y_t - 2y_{t-1} + y_{t-2}
\end{align}
</math>

Seasonal differencing involves computing the difference between an observation and the corresponding observation in the previous season, e.g., one year earlier. This is shown as:

:<math> y_t'= y_t - y_{t-m} \quad \text{where } m=\text{duration of season}. </math>

The differenced data are then used for the estimation of an [[Autoregressive–moving-average model|ARMA]] model.
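These transformations are straightforward to compute directly. As a small sketch (the data values and the season length ''m'' below are arbitrary example choices), the following Python fragment computes first, second-order and seasonal differences with NumPy:

<syntaxhighlight lang="python">
# Illustrative sketch of the differencing transformations described above.
# The data values and the season length m are arbitrary example choices.
import numpy as np

y = np.array([112.0, 118.0, 132.0, 129.0, 121.0, 135.0, 148.0, 148.0, 136.0, 119.0])

first_diff  = np.diff(y)          # y'_t = y_t - y_{t-1}
second_diff = np.diff(y, n=2)     # y*_t = y_t - 2 y_{t-1} + y_{t-2}

m = 4                             # duration of the season (e.g. quarterly data)
seasonal_diff = y[m:] - y[:-m]    # y'_t = y_t - y_{t-m}

print(first_diff)
print(second_diff)
print(seasonal_diff)
</syntaxhighlight>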
==Examples==
Some well-known special cases arise naturally or are mathematically equivalent to other popular forecasting models. For example:
* ARIMA(0, 0, 0) models [[white noise]].
* An ARIMA(0, 1, 0) model is a [[random walk]].
* An ARIMA(1, 1, 2) model is a damped Holt's model (damped-trend linear exponential smoothing).
* An ARIMA(0, 1, 1) model without constant is a [[Exponential smoothing#Basic (simple) exponential smoothing|basic exponential smoothing]] model.<ref name=":0">{{Cite web|url=http://people.duke.edu/~rnau/411arim.htm#arima010|title=Introduction to ARIMA models|website=people.duke.edu|access-date=2016-06-05}}</ref>
* An ARIMA(0, 2, 2) model is given by <math>X_t = 2X_{t-1} - X_{t-2} +(\alpha + \beta - 2) \varepsilon_{t-1} + (1-\alpha)\varepsilon_{t-2} + \varepsilon_{t}</math>, which is equivalent to Holt's linear method with additive errors, or [[Exponential smoothing#Double exponential smoothing (Holt linear)|double exponential smoothing]].<ref name=":0" />

== Choosing the order==
The orders ''p'' and ''q'' can be determined using the sample [[autocorrelation function]] (ACF), the [[partial autocorrelation function]] (PACF), and/or the extended autocorrelation function (EACF) method.<ref name=":3">{{Cite web|last=Missouri State University|title=Model Specification, Time Series Analysis|url=http://people.missouristate.edu/songfengzheng/Teaching/MTH548/Time%20Series-ch06.pdf}}</ref> Alternative approaches include information criteria such as the AIC and BIC.<ref name=":3" />

To determine the order of a non-seasonal ARIMA model, a useful criterion is the [[Akaike information criterion|Akaike information criterion (AIC)]]. It is written as

:<math> \text{AIC} = -2\log(L)+2(p+q+k), </math>

where ''L'' is the likelihood of the data, ''p'' is the order of the autoregressive part and ''q'' is the order of the moving average part. Here ''k'' indicates whether the ARIMA model includes an intercept: ''k'' = 1 if there is an intercept (''c'' ≠ 0) and ''k'' = 0 if there is none (''c'' = 0).

The corrected AIC for ARIMA models can be written as

:<math>\text{AICc}= \text{AIC}+ \frac{2(p+q+k)(p+q+k+1)}{T-p-q-k-1}.</math>

The [[Bayesian information criterion|Bayesian information criterion (BIC)]] can be written as

:<math>\text{BIC}= \text{AIC}+((\log T)-2)(p+q+k).</math>

A good model minimizes the AIC, AICc or BIC: among a range of candidate models, the one with the lowest value of the chosen criterion is preferred. The AIC and the BIC serve different purposes: the AIC aims to select the candidate model that best approximates the unknown data-generating process, in the sense of minimizing expected prediction error, whereas the BIC aims to identify the true model, assuming it is among the candidates. The BIC approach is often criticized because, for complex real-life data, no candidate model is exactly correct; it remains a useful selection method, however, because it penalizes additional parameters more heavily than the AIC does. AICc can only be used to compare ARIMA models with the same orders of differencing. For ARIMA models with different orders of differencing, [[Root-mean-square deviation|RMSE]] can be used for model comparison.

==Estimation of coefficients==
{{Empty section|date=March 2017}}

==Forecasts using ARIMA models==
The ARIMA model can be viewed as a "cascade" of two models. The first is non-stationary:

:<math> Y_t = (1-L)^d X_t </math>

while the second is [[wide-sense stationary]]:

:<math> \left( 1 - \sum_{i=1}^p \varphi_i L^i \right) Y_t = \left( 1 + \sum_{i=1}^q \theta_i L^i \right) \varepsilon_t \, . </math>

Now forecasts can be made for the process <math>Y_t</math>, using a generalization of the method of [[Autoregressive model#n-step-ahead forecasting|autoregressive forecasting]].
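Criterion-based order selection and forecasting can be combined in practice. The following is an illustrative sketch only: the simulated series, the candidate grid of orders, and the use of the <code>statsmodels</code> Python package (one of the implementations listed below) are arbitrary example choices. Candidate orders are compared by their AIC, and the selected model is then used to forecast.

<syntaxhighlight lang="python">
# Illustrative sketch: compare a small grid of ARIMA(p, 1, q) orders by AIC.
# The simulated data and the candidate grid are arbitrary example choices.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
x = np.cumsum(rng.standard_normal(300))   # a random walk, so d = 1 is a natural choice

results = {}
for p in range(3):
    for q in range(3):
        results[(p, 1, q)] = ARIMA(x, order=(p, 1, q)).fit().aic   # .bic works analogously

best_order = min(results, key=results.get)
print("order with lowest AIC:", best_order)

# Refit the selected model and forecast 10 steps ahead; get_forecast also
# returns the forecast intervals discussed in the next subsection.
best_fit = ARIMA(x, order=best_order).fit()
forecast = best_fit.get_forecast(steps=10)
print(forecast.predicted_mean)
print(forecast.conf_int(alpha=0.05))      # 95% forecast intervals
</syntaxhighlight>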
===Forecast intervals===
The forecast intervals ([[confidence interval]]s for forecasts) for ARIMA models are based on the assumptions that the residuals are uncorrelated and normally distributed. If either of these assumptions does not hold, the forecast intervals may be incorrect. For this reason, researchers plot the ACF and a histogram of the residuals to check these assumptions before producing forecast intervals.

The 95% forecast interval is <math>\hat{y}_{T+h\,\mid\, T}\pm1.96\sqrt{v_{T+h\,\mid\, T}}</math>, where <math>v_{T+h\mid T}</math> is the variance of <math>y_{T+h} \mid y_1,\dots,y_T</math>.

For <math>h=1</math>, <math>v_{T+h\,\mid\, T}=\hat{\sigma}^2</math> for all ARIMA models regardless of parameters and orders.

For ARIMA(0,0,''q''), <math>y_t=e_t+\sum_{i=1}^q\theta_ie_{t-i}</math> and

:<math>v_{T+h\,\mid\, T} = \hat{\sigma}^2 \left[1+\sum_{i=1}^{h-1}\theta_i^2\right], \text{ for } h=2,3,\ldots</math>

(with <math>\theta_i = 0</math> for <math>i > q</math>). In general, forecast intervals from ARIMA models increase as the forecast horizon increases.

==Variations and extensions==
A number of variations on the ARIMA model are commonly employed. If multiple time series are used, the <math>X_t</math> can be thought of as vectors and a VARIMA model may be appropriate. Sometimes a seasonal effect is suspected in the model; in that case, it is generally considered better to use a SARIMA (seasonal ARIMA) model than to increase the order of the AR or MA parts of the model.<ref name="ARIMAKHORDHA2018">{{cite book |last1=Swain|display-authors=et al |first1=S |title=Recent Findings in Intelligent Computing Techniques |chapter=Development of an ARIMA Model for Monthly Rainfall Forecasting over Khordha District, Odisha, India |volume=708 |pages=325–331 |doi=10.1007/978-981-10-8636-6_34 |series=Advances in Intelligent Systems and Computing |year=2018 |isbn=978-981-10-8635-9 }}</ref> If the time series is suspected to exhibit [[long-range dependence]], the ''d'' parameter may be allowed to take non-integer values in an [[autoregressive fractionally integrated moving average]] model, which is also called a Fractional ARIMA (FARIMA or ARFIMA) model.
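A seasonal ARIMA model is specified by adding the seasonal orders (''P'', ''D'', ''Q'')<sub>''m''</sub> alongside the non-seasonal ones. As an illustrative sketch only (the synthetic monthly series, the orders, and the season length ''m'' = 12 are arbitrary example choices, and the <code>statsmodels</code> Python package is one of the implementations listed below), the following fragment fits a SARIMA(1, 1, 1)(1, 1, 1)<sub>12</sub> model and forecasts one season ahead:

<syntaxhighlight lang="python">
# Illustrative sketch: fit a seasonal ARIMA(1,1,1)(1,1,1)_12 model.
# The synthetic data, orders and season length are arbitrary example choices.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(2)
n, m = 240, 12                                    # 20 "years" of synthetic monthly data
t = np.arange(n)
x = 10 + 0.05 * t + 2.0 * np.sin(2 * np.pi * t / m) + 0.3 * np.cumsum(rng.standard_normal(n))

fit = ARIMA(x, order=(1, 1, 1), seasonal_order=(1, 1, 1, m)).fit()
print(fit.summary())
print(fit.forecast(steps=m))                      # forecast one full season ahead
</syntaxhighlight>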
==Software implementations==
Various packages that apply methodology like [[Box–Jenkins method|Box–Jenkins]] parameter optimization are available to find the right parameters for the ARIMA model.
* [[EViews]]: has extensive ARIMA and SARIMA capabilities.
* [[julia language|Julia]]: contains an ARIMA implementation in the TimeModels package.<ref>[https://github.com/JuliaStats/TimeModels.jl TimeModels.jl on GitHub]</ref>
* [[Mathematica]]: includes the [http://reference.wolfram.com/mathematica/ref/ARIMAProcess.html ARIMAProcess] function.
* [[MATLAB]]: the [http://www.mathworks.com/products/econometrics/ Econometrics Toolbox] includes [http://www.mathworks.com/help/econ/arimaclass.html ARIMA models] and [http://www.mathworks.com/help/econ/regarimaclass.html regression with ARIMA errors].
* [[NCSS (statistical software)|NCSS]]: includes several procedures for ARIMA fitting and forecasting.<ref>[http://ncss.wpengine.netdna-cdn.com/wp-content/themes/ncss/pdf/Procedures/NCSS/ARIMA-Box-Jenkins.pdf ARIMA in NCSS]</ref><ref>[http://ncss.wpengine.netdna-cdn.com/wp-content/themes/ncss/pdf/Procedures/NCSS/Automatic_ARMA.pdf Automatic ARMA in NCSS]</ref><ref>[http://ncss.wpengine.netdna-cdn.com/wp-content/themes/ncss/pdf/Procedures/NCSS/Autocorrelations.pdf Autocorrelations and Partial Autocorrelations in NCSS]</ref>
* [[Python (programming language)|Python]]: the [https://www.statsmodels.org/ "statsmodels"] package includes models for time series analysis: univariate time series analysis (AR, ARIMA), vector autoregressive models (VAR and structural VAR), and descriptive statistics and process models for time series analysis.
* [[R (programming language)|R]]: the standard R ''stats'' package includes an ''arima'' function, which is documented in [http://search.r-project.org/R/library/stats/html/arima.html "ARIMA Modelling of Time Series"]. Besides the {{tmath|\text{ARIMA}(p,d,q)}} part, the function also includes seasonal factors, an intercept term, and exogenous variables (''xreg'', called "external regressors"). The package [https://cran.r-project.org/web/packages/astsa/index.html astsa] has scripts such as ''sarima'' to estimate seasonal or non-seasonal models and ''sarima.sim'' to simulate from these models. The CRAN task view on [https://cran.r-project.org/web/views/TimeSeries.html Time Series] is the reference with many more links. The [https://cran.r-project.org/web/packages/forecast/index.html "forecast"] package in [[R (programming language)|R]] can automatically select an ARIMA model for a given time series with the {{code|auto.arima()}} function (which has been reported to sometimes give questionable results)[http://freerangestats.info/blog/2015/09/30/autoarima-success-rates] and can also simulate seasonal and non-seasonal ARIMA models with its {{code|simulate.Arima()}} function.<ref>{{cite book |last1=Hyndman |first1=Rob J |contribution-url=https://www.otexts.org/fpp/8/7 |contribution=8.7 ARIMA modelling in R |last2=Athanasopoulos |first2=George |title=Forecasting: principles and practice |publisher=oTexts |access-date=19 May 2015}}</ref>
* [[Ruby (programming language)|Ruby]]: the [https://rubygems.org/gems/statsample-timeseries "statsample-timeseries"] gem is used for time series analysis, including ARIMA models and Kalman filtering.
* [[JavaScript]]: the [https://www.npmjs.com/package/arima "arima"] package includes models for time series analysis and forecasting (ARIMA, SARIMA, SARIMAX, AutoARIMA).
* [[C (programming language)|C]]: the [https://github.com/rafat/ctsa/ "ctsa"] package includes ARIMA, SARIMA, SARIMAX, AutoARIMA and multiple methods for time series analysis.
* [http://www.safetoolboxes.com SAFE TOOLBOXES]: includes [http://www.safetoolboxes.com/howto_fitarimamodel.html ARIMA modelling] and [http://www.safetoolboxes.com/howto_transformtimeseries.html regression with ARIMA errors].
* [[SAS (software)|SAS]]: includes extensive ARIMA processing in its Econometric and Time Series Analysis system: SAS/ETS.
* IBM [[SPSS]]: includes ARIMA modeling in the Professional and Premium editions of its Statistics package as well as its Modeler package. The default Expert Modeler feature evaluates a range of seasonal and non-seasonal autoregressive (''p''), integrated (''d''), and moving average (''q'') settings and seven exponential smoothing models. The Expert Modeler can also transform the target time-series data into its square root or natural log. The user also has the option to restrict the Expert Modeler to ARIMA models, or to manually enter ARIMA non-seasonal and seasonal ''p'', ''d'', and ''q'' settings without the Expert Modeler. Automatic outlier detection is available for seven types of outliers, and the detected outliers will be accommodated in the time-series model if this feature is selected.
* [[SAP AG|SAP]]: the APO-FCS package<ref>{{cite web|title=Box Jenkins model|url=http://help.sap.com/saphelp_45b/helpdata/en/35/8a524b52060634e10000009b38f9b9/content.htm|publisher=SAP|access-date=8 March 2013}}</ref> in [[SAP ERP]] from [[SAP AG|SAP]] allows creation and fitting of ARIMA models using the Box–Jenkins methodology.
* [[SQL Server Analysis Services]]: from [[Microsoft]], includes ARIMA as a data mining algorithm.
* [[Stata]]: includes ARIMA modelling (using its arima command) as of Stata 9.
* [https://statsim.com/ StatSim]: includes ARIMA models in the [https://statsim.com/forecast/ Forecast] web app.
* [[Teradata]]: Vantage has the ARIMA function as part of its machine learning engine.
* TOL (Time Oriented Language): designed to model ARIMA models (including SARIMA, ARIMAX and DSARIMAX variants) [https://web.archive.org/web/20170327171617/https://www.tol-project.org/].
* [[Scala (programming language)|Scala]]: the [https://github.com/sryza/spark-timeseries spark-timeseries] library contains an ARIMA implementation for Scala, Java and Python. The implementation is designed to run on [[Apache Spark]].
* [[PostgreSQL]]/MADlib: [https://madlib.apache.org/docs/latest/group__grp__arima.html Time Series Analysis/ARIMA].
* [[X-12-ARIMA]]: from the [[United States Census Bureau|US Bureau of the Census]].

==See also==
* [[Autocorrelation]]
* [[Autoregressive moving average model|ARMA]]
* [[Finite impulse response]]
* [[Infinite impulse response]]
* [[Partial autocorrelation]]
* [[X-13ARIMA-SEATS]]

==References==
{{More footnotes|date=May 2011}}
{{reflist}}

==Further reading==
* {{cite book |last1=Asteriou |first1=Dimitros |last2=Hall |first2=Stephen G. |title=Applied Econometrics |publisher=Palgrave MacMillan |year=2011 |edition=Second |isbn=978-0-230-27182-1 |chapter=ARIMA Models and the Box–Jenkins Methodology |pages=265–286 }}
* {{cite book |last=Mills |first=Terence C. |year=1990 |title=Time Series Techniques for Economists |publisher=Cambridge University Press |isbn=978-0-521-34339-8 |url-access=registration |url=https://archive.org/details/timeseriestechni0000mill }}
* {{cite book |last1=Percival |first1=Donald B. |first2=Andrew T. |last2=Walden |year=1993 |title=Spectral Analysis for Physical Applications |publisher=Cambridge University Press |isbn=978-0-521-35532-2 }}
* {{cite book |last1=Shumway |first1=Robert H. |last2=Stoffer |first2=David S. |year=2017 |title=Time Series Analysis and Its Applications: With R Examples |publisher=Springer |doi=10.1007/978-3-319-52452-8 |url=https://link.springer.com/book/10.1007/978-3-319-52452-8 }}
* [https://www.datacamp.com/courses/arima-models-in-r ARIMA Models in R]: an online course on fitting ARIMA models to time series data using R.
==External links==
* [http://people.duke.edu/~rnau/411arim.htm Lecture notes on ARIMA models] by Robert Nau at [[Duke University]]

{{Stochastic processes}}

[[Category:Time series models]]

[[de:ARMA-Modell#ARIMA]]