{{Short description|Type of stochastic process}}
In [[mathematics]] and [[statistics]], a '''stationary process''' (also called a '''strict/strictly stationary process''' or '''strong/strongly stationary process''') is a [[stochastic process]] whose statistical properties, such as [[mean]] and [[variance]], do not change over time. More formally, the [[joint probability distribution]] of the process remains the same when shifted in time. This implies that the process is statistically consistent across different time periods. Because many statistical procedures in [[time series analysis]] assume stationarity, non-stationary data are frequently transformed to achieve stationarity before analysis.

A common cause of non-stationarity is a trend in the mean, which can be due to either a [[unit root]] or a deterministic trend. In the case of a unit root, stochastic shocks have permanent effects, and the process is not [[Mean-reverting process|mean-reverting]]. With a deterministic trend, the process is called [[Trend-stationary process|trend-stationary]], and shocks have only transitory effects, with the variable tending towards a deterministically evolving mean. A trend-stationary process is not strictly stationary but can be made stationary by removing the trend. Similarly, processes with unit roots can be made stationary through [[differencing]]. Another type of non-stationary process, distinct from those with trends, is a [[cyclostationary process]], which exhibits cyclical variations over time.

Strict stationarity, as defined above, can be too restrictive for many applications. Therefore, other forms of stationarity, such as '''wide-sense stationarity''' or '''''N''-th-order stationarity''', are often used. The definitions for different kinds of stationarity are not consistent among different authors (see [[Stationary process#Other terminology|Other terminology]]).

==Strict-sense stationarity==

===Definition===
Formally, let <math>\left\{X_t\right\}</math> be a [[stochastic process]] and let <math>F_{X}(x_{t_1 + \tau}, \ldots, x_{t_n + \tau})</math> represent the [[cumulative distribution function]] of the [[marginal distribution|unconditional]] (i.e., with no reference to any particular starting value) [[joint distribution]] of <math>\left\{X_t\right\}</math> at times <math>t_1 + \tau, \ldots, t_n + \tau</math>. Then, <math>\left\{X_t\right\}</math> is said to be '''strictly stationary''', '''strongly stationary''' or '''strict-sense stationary''' if<ref name=KunIlPark>{{cite book | author=Park, Kun Il| title=Fundamentals of Probability and Stochastic Processes with Applications to Communications| publisher=Springer | year=2018 | isbn=978-3-319-68074-3}}</ref>{{rp|p. 155}}

{{Equation box 1
|indent =
|title=
|equation = {{NumBlk||<math> F_{X}(x_{t_1+\tau} ,\ldots, x_{t_n+\tau}) = F_{X}(x_{t_1},\ldots, x_{t_n}) \quad \text{for all } \tau,t_1, \ldots, t_n \in \mathbb{R} \text{ and for all } n \in \mathbb{N}_{>0}</math>|{{EquationRef|Eq.1}}}}
|cellpadding= 6
|border
|border colour = #0073CF
|background colour=#F5FFFA}}

Since <math>\tau</math> does not affect <math>F_X(\cdot)</math>, <math>F_{X}</math> is independent of time.

==Examples==
[[File:Stationarycomparison.png|thumb|right|390px|Two simulated time series processes, one stationary and the other non-stationary, are shown above. 
The [[Augmented Dickey-Fuller test|augmented Dickey–Fuller]] (ADF) [[test statistic]] is reported for each process; non-stationarity cannot be rejected for the second process at a 5% [[significance level]].]]

[[White noise]] is the simplest example of a stationary process.

An example of a [[Discrete-time stochastic process|discrete-time]] stationary process where the sample space is also discrete (so that the random variable may take one of ''N'' possible values) is a [[Bernoulli scheme]]. Other examples of a discrete-time stationary process with continuous sample space include some [[autoregressive]] and [[moving average model|moving average]] processes, which are both subsets of the [[autoregressive moving average model]]. Models with a non-trivial autoregressive component may be either stationary or non-stationary, depending on the parameter values; important non-stationary special cases are those where [[unit root]]s exist in the model.

====Example 1====
Let <math>Y</math> be any scalar [[random variable]], and define a time series <math>\left\{X_t\right\}</math> by
:<math>X_t=Y \qquad \text{ for all } t.</math>
Then <math>\left\{X_t\right\}</math> is a stationary time series, for which realisations consist of a series of constant values, with a different constant value for each realisation. A [[law of large numbers]] does not apply in this case, as the limiting value of an average from a single realisation takes the random value determined by <math>Y</math>, rather than taking the [[expected value]] of <math>Y</math>. The time average of <math>X_t</math> does not converge since the process is not [[Ergodic process|ergodic]].

====Example 2====
As a further example of a stationary process for which any single realisation has an apparently noise-free structure, let <math>Y</math> have a [[Uniform distribution (continuous)|uniform distribution]] on <math>[0,2\pi]</math> and define the time series <math>\left\{X_t\right\}</math> by
:<math>X_t=\cos (t+Y) \quad \text{ for } t \in \mathbb{R}. </math>
Then <math>\left\{X_t\right\}</math> is strictly stationary since (<math>t + Y</math> modulo <math>2\pi</math>) follows the same uniform distribution as <math>Y</math> for any <math>t</math>.

====Example 3====
Note that [[white noise|white noise in the weak sense]] is not necessarily strictly stationary. Let <math>\omega</math> be a random variable uniformly distributed in the interval <math>(0, 2\pi)</math> and define the time series <math>\left\{z_t\right\}</math> by
:<math>z_t=\cos(t\omega) \quad (t=1,2,\ldots).</math>
Then
:<math>
\begin{align}
\mathbb{E}(z_t) &= \frac{1}{2\pi} \int_0^{2\pi} \cos(t\omega) \,d\omega = 0,\\
\operatorname{Var}(z_t) &= \frac{1}{2\pi} \int_0^{2\pi} \cos^2(t\omega) \,d\omega = \frac{1}{2},\\
\operatorname{Cov}(z_t, z_j) &= \frac{1}{2\pi} \int_0^{2\pi} \cos(t\omega)\cos(j\omega) \,d\omega = 0 \quad \text{for all } t \neq j.
\end{align}
</math>
So <math>\{z_t\}</math> is white noise in the weak sense (the mean and cross-covariances are zero, and the variances are all the same); however, it is not strictly stationary.
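That <math>\{z_t\}</math> fails to be strictly stationary can also be seen numerically. The following minimal sketch (assuming Python with NumPy; the number of realisations is an arbitrary illustrative choice) simulates many realisations of the process and estimates its ensemble moments:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
n_real = 100_000                      # number of independent realisations
omega = rng.uniform(0.0, 2 * np.pi, size=n_real)

# z_t = cos(t * omega) for t = 1, 2, 3: one row per time index,
# one column per realisation.
t = np.arange(1, 4).reshape(-1, 1)
z = np.cos(t * omega)

print(z.mean(axis=1))   # ensemble means: all approximately 0
print(z.var(axis=1))    # ensemble variances: all approximately 1/2

# Weak white noise, yet not strictly stationary: by the double-angle
# identity, the pair (z_1, z_2) always lies on the curve y = 2x^2 - 1,
# while the time-shifted pair (z_2, z_3) does not, so the two pairs
# have different joint distributions.
print(np.allclose(z[1], 2 * z[0] ** 2 - 1))   # True
print(np.allclose(z[2], 2 * z[1] ** 2 - 1))   # False
</syntaxhighlight>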
{{clear}}

==''N''th-order stationarity==
In {{EquationNote|Eq.1}}, the distribution of <math>n</math> samples of the stochastic process must be equal to the distribution of the samples shifted in time ''for all'' <math>n</math>. ''N''-th-order stationarity is a weaker form of stationarity where this is only required for all <math>n</math> up to a certain order <math>N</math>. A random process <math>\left\{X_t\right\}</math> is said to be '''''N''-th-order stationary''' if:<ref name=KunIlPark/>{{rp|p. 152}}

{{Equation box 1
|indent =
|title=
|equation = {{NumBlk||<math> F_{X}(x_{t_1+\tau} ,\ldots, x_{t_n+\tau}) = F_{X}(x_{t_1},\ldots, x_{t_n}) \quad \text{for all } \tau,t_1, \ldots, t_n \in \mathbb{R} \text{ and for all } n \in \{1,\ldots,N\}</math>|{{EquationRef|Eq.2}}}}
|cellpadding= 6
|border
|border colour = #0073CF
|background colour=#F5FFFA}}

==Weak or wide-sense stationarity==<!-- Wide-sense stationary process redirects here -->

===Definition===
A weaker form of stationarity commonly employed in [[signal processing]] is known as '''weak-sense stationarity''', '''wide-sense stationarity (WSS)''', or '''covariance stationarity'''. WSS random processes only require that the 1st [[moment (mathematics)|moment]] (i.e. the mean) and the [[autocovariance]] do not vary with respect to time and that the 2nd moment is finite for all times. Any strictly stationary process which has a finite [[mean]] and [[covariance]] is also WSS.<ref name="Florescu2014">{{cite book|author=Ionut Florescu|title=Probability and Stochastic Processes|date=7 November 2014|publisher=John Wiley & Sons|isbn=978-1-118-59320-2}}</ref>{{rp|p. 299}}

So, a [[continuous time]] [[random process]] <math>\left\{X_t\right\}</math> which is WSS has the following restrictions on its mean function <math>m_X(t) \triangleq \operatorname E[X_t]</math> and [[autocovariance]] function <math>K_{XX}(t_1, t_2) \triangleq \operatorname E[(X_{t_1}-m_X(t_1))(X_{t_2}-m_X(t_2))]</math>:

{{Equation box 1
|indent =
|title=
|equation = {{NumBlk||<math>
\begin{align}
& m_X(t) = m_X(t + \tau) & & \text{for all } \tau,t \in \mathbb{R} \\
& K_{XX}(t_1, t_2) = K_{XX}(t_1 - t_2, 0) & & \text{for all } t_1,t_2 \in \mathbb{R} \\
& \operatorname E[|X_t|^2] < \infty & & \text{for all } t \in \mathbb{R}
\end{align}
</math>|{{EquationRef|Eq.3}}}}
|cellpadding= 6
|border
|border colour = #0073CF
|background colour=#F5FFFA}}

The first property implies that the mean function <math>m_X(t)</math> must be constant. The second property implies that the autocovariance function depends only on the ''difference'' between <math>t_1</math> and <math>t_2</math> and only needs to be indexed by one variable rather than two.<ref name=KunIlPark/>{{rp|p. 159}} Thus, instead of writing
:<math>\,\!K_{XX}(t_1 - t_2, 0)\,</math>
the notation is often abbreviated by the substitution <math>\tau = t_1 - t_2</math>:
:<math>K_{XX}(\tau) \triangleq K_{XX}(t_1 - t_2, 0)</math>
This also implies that the [[autocorrelation]] depends only on <math>\tau = t_1 - t_2</math>, that is
:<math>\,\! R_X(t_1,t_2) = R_X(t_1-t_2,0) \triangleq R_X(\tau).</math>
The third property says that the second moments must be finite for any time <math>t</math>.
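The first two properties can be checked empirically for simple processes. The sketch below (a simulation in Python with NumPy; the AR(1) coefficient and sample sizes are arbitrary illustrative choices) estimates the ensemble mean and the lag-1 autocovariance of a stationary [[autoregressive model|AR(1)]] process at two different times:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)
n_real, n_time, phi = 20_000, 200, 0.8

# Many realisations of the AR(1) process X_t = phi * X_{t-1} + e_t with
# e_t ~ N(0, 1), started from its stationary law N(0, 1 / (1 - phi^2)).
x = np.empty((n_real, n_time))
x[:, 0] = rng.normal(0.0, np.sqrt(1.0 / (1.0 - phi ** 2)), n_real)
for t in range(1, n_time):
    x[:, t] = phi * x[:, t - 1] + rng.normal(0.0, 1.0, n_real)

# The ensemble mean is the same constant at every time ...
print(x[:, 10].mean(), x[:, 150].mean())    # both near 0

# ... and the lag-1 autocovariance does not depend on where it is
# measured, only on the lag: K(t, t + 1) = phi / (1 - phi^2) ≈ 2.22.
print(np.mean(x[:, 10] * x[:, 11]),
      np.mean(x[:, 150] * x[:, 151]))       # both near 2.22
</syntaxhighlight>

(The products can be averaged directly here because the process has zero mean.)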
===Motivation===
The main advantage of wide-sense stationarity is that it places the time series in the context of [[Hilbert space]]s. Let ''H'' be the Hilbert space generated by {''x''(''t'')} (that is, the closure of the set of all linear combinations of these random variables in the Hilbert space of all square-integrable random variables on the given probability space). By the positive definiteness of the autocovariance function, it follows from [[Bochner's theorem]] that there exists a positive measure <math>\mu</math> on the real line such that ''H'' is isomorphic to the Hilbert subspace of ''L''<sup>2</sup>(''μ'') generated by {''e''<sup>−2{{pi}}''iξ⋅t''</sup>}. This then gives the following Fourier-type decomposition for a continuous-time stationary stochastic process: there exists a stochastic process <math>\omega_\lambda</math> with [[orthogonal increments]] such that, for all <math>t</math>,
:<math>X_t = \int e^{- 2 \pi i \lambda \cdot t} \, d \omega_\lambda,</math>
where the integral on the right-hand side is interpreted in a suitable (Riemann) sense. The same result holds for a discrete-time stationary process, with the spectral measure now defined on the unit circle.

When processing WSS random signals with [[linear]], [[time-invariant]] ([[LTI system theory|LTI]]) [[filter (signal processing)|filter]]s, it is helpful to think of the correlation function as a [[linear operator]]. Since it is a [[circulant matrix|circulant]] operator (it depends only on the difference between the two arguments), its eigenfunctions are the [[Fourier series|Fourier]] complex exponentials. Additionally, since the [[eigenfunction]]s of LTI operators are also [[exponential function|complex exponential]]s, LTI processing of WSS random signals is highly tractable: all computations can be performed in the [[frequency domain]]. Thus, the WSS assumption is widely employed in signal processing [[algorithm]]s.
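As an illustration of this tractability, the following sketch (Python with NumPy; the three-tap filter is an arbitrary example) filters white noise with an LTI filter and compares the empirical second-order statistics of the output with the values predicted from the impulse response alone:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
x = rng.normal(0.0, 1.0, n)            # WSS input: unit-variance white noise

h = np.array([0.25, 0.5, 0.25])        # impulse response of an LTI filter
y = np.convolve(x, h, mode="valid")    # time-domain LTI filtering

# For a WSS input the output is again WSS; for white noise the output
# autocovariance is K_y(tau) = sum_i h[i] * h[i + tau].
print(np.mean(y * y))                  # ≈ 0.25^2 + 0.5^2 + 0.25^2 = 0.375
print(np.mean(y[:-1] * y[1:]))         # ≈ 0.25*0.5 + 0.5*0.25   = 0.25

# Frequency-domain view: the output power spectral density is
# |H(f)|^2 times the input PSD (here 1), so K_y(0) is the average of
# |H(f)|^2 over frequency.
H = np.fft.rfft(h, 4096)
print(np.mean(np.abs(H) ** 2))         # ≈ 0.375 again
</syntaxhighlight>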
===Definition for complex stochastic process===
In the case where <math>\left\{X_t\right\}</math> is a complex stochastic process, the [[autocovariance]] function is defined as <math>K_{XX}(t_1, t_2) = \operatorname E[(X_{t_1}-m_X(t_1))\overline{(X_{t_2}-m_X(t_2))}]</math> and, in addition to the requirements in {{EquationNote|Eq.3}}, it is required that the pseudo-autocovariance function <math>J_{XX}(t_1, t_2) = \operatorname E[(X_{t_1}-m_X(t_1))(X_{t_2}-m_X(t_2))]</math> depends only on the time lag. In formulas, <math>\left\{X_t\right\}</math> is WSS if

{{Equation box 1
|indent =
|title=
|equation = {{NumBlk||<math>
\begin{align}
& m_X(t) = m_X(t + \tau) & & \text{for all } \tau,t \in \mathbb{R} \\
& K_{XX}(t_1, t_2) = K_{XX}(t_1 - t_2, 0) & & \text{for all } t_1,t_2 \in \mathbb{R} \\
& J_{XX}(t_1, t_2) = J_{XX}(t_1 - t_2, 0) & & \text{for all } t_1,t_2 \in \mathbb{R} \\
& \operatorname E[|X_t|^2] < \infty & & \text{for all } t \in \mathbb{R}
\end{align}
</math>|{{EquationRef|Eq.4}}}}
|cellpadding= 6
|border
|border colour = #0073CF
|background colour=#F5FFFA}}

==Joint stationarity==
The concept of stationarity may be extended to two stochastic processes.

===Joint strict-sense stationarity===
Two stochastic processes <math>\left\{X_t\right\}</math> and <math>\left\{Y_t\right\}</math> are called '''jointly strict-sense stationary''' if their joint cumulative distribution <math>F_{XY}(x_{t_1} ,\ldots, x_{t_m},y_{t_1^'} ,\ldots, y_{t_n^'})</math> remains unchanged under time shifts, i.e. if

{{Equation box 1
|indent =
|title=
|equation = {{NumBlk||<math> F_{XY}(x_{t_1} ,\ldots, x_{t_m},y_{t_1^'} ,\ldots, y_{t_n^'}) = F_{XY}(x_{t_1+\tau} ,\ldots, x_{t_m+\tau},y_{t_1^'+\tau} ,\ldots, y_{t_n^'+\tau}) \quad \text{for all } \tau,t_1, \ldots, t_m, t_1^', \ldots, t_n^' \in \mathbb{R} \text{ and for all } m,n \in \mathbb{N}</math>|{{EquationRef|Eq.5}}}}
|cellpadding= 6
|border
|border colour = #0073CF
|background colour=#F5FFFA}}

===Joint (''M'' + ''N'')th-order stationarity===
Two random processes <math>\left\{X_t\right\}</math> and <math>\left\{Y_t\right\}</math> are said to be '''jointly (''M'' + ''N'')-th-order stationary''' if:<ref name=KunIlPark/>{{rp|p. 159}}

{{Equation box 1
|indent =
|title=
|equation = {{NumBlk||<math> F_{XY}(x_{t_1} ,\ldots, x_{t_m},y_{t_1^'} ,\ldots, y_{t_n^'}) = F_{XY}(x_{t_1+\tau} ,\ldots, x_{t_m+\tau},y_{t_1^'+\tau} ,\ldots, y_{t_n^'+\tau}) \quad \text{for all } \tau,t_1, \ldots, t_m, t_1^', \ldots, t_n^' \in \mathbb{R} \text{ and for all } m \in \{1,\ldots,M\}, n \in \{1,\ldots,N\}</math>|{{EquationRef|Eq.6}}}}
|cellpadding= 6
|border
|border colour = #0073CF
|background colour=#F5FFFA}}

===Joint weak or wide-sense stationarity===
Two stochastic processes <math>\left\{X_t\right\}</math> and <math>\left\{Y_t\right\}</math> are called '''jointly wide-sense stationary''' if they are both wide-sense stationary and their cross-covariance function <math>K_{XY}(t_1, t_2) = \operatorname E[(X_{t_1}-m_X(t_1))(Y_{t_2}-m_Y(t_2))]</math> depends only on the time difference <math>\tau = t_1 - t_2</math>. This may be summarized as follows:

{{Equation box 1
|indent =
|title=
|equation = {{NumBlk||<math>
\begin{align}
& m_X(t) = m_X(t + \tau) & & \text{for all } \tau,t \in \mathbb{R} \\
& m_Y(t) = m_Y(t + \tau) & & \text{for all } \tau,t \in \mathbb{R} \\
& K_{XX}(t_1, t_2) = K_{XX}(t_1 - t_2, 0) & & \text{for all } t_1,t_2 \in \mathbb{R} \\
& K_{YY}(t_1, t_2) = K_{YY}(t_1 - t_2, 0) & & \text{for all } t_1,t_2 \in \mathbb{R} \\
& K_{XY}(t_1, t_2) = K_{XY}(t_1 - t_2, 0) & & \text{for all } t_1,t_2 \in \mathbb{R}
\end{align}
</math>|{{EquationRef|Eq.7}}}}
|cellpadding= 6
|border
|border colour = #0073CF
|background colour=#F5FFFA}}

==Relation between types of stationarity==
* If a stochastic process is ''N''-th-order stationary, then it is also ''M''-th-order stationary for all {{tmath|M \le N}}.
* If a stochastic process is second-order stationary (<math>N=2</math>) and has finite second moments, then it is also wide-sense stationary.<ref name=KunIlPark/>{{rp|p. 159}}
* If a stochastic process is wide-sense stationary, it is not necessarily second-order stationary.<ref name=KunIlPark/>{{rp|p. 159}}
* If a stochastic process is strict-sense stationary and has finite second moments, it is wide-sense stationary.<ref name="Florescu2014"/>{{rp|p. 299}}
* If two stochastic processes are jointly (''M'' + ''N'')-th-order stationary, this does not guarantee that the individual processes are ''M''-th- and ''N''-th-order stationary, respectively.<ref name=KunIlPark/>{{rp|p. 159}}

==Other terminology==
The terminology used for types of stationarity other than strict stationarity can be rather mixed. Some examples follow.
*[[Maurice Priestley|Priestley]] uses '''stationary up to order''' ''m'' if conditions similar to those given here for wide-sense stationarity apply relating to moments up to order ''m''.<ref>{{cite book |last=Priestley |first=M. B. |year=1981 |title=Spectral Analysis and Time Series |publisher=Academic Press |isbn=0-12-564922-3 }}</ref><ref>{{cite book |last=Priestley |first=M. B. |year=1988 |title=Non-linear and Non-stationary Time Series Analysis |url=https://archive.org/details/nonlinearnonstat0000prie |url-access=registration |publisher=Academic Press |isbn=0-12-564911-8 }}</ref> Thus wide-sense stationarity would be equivalent to "stationary to order 2", which is different from the definition of second-order stationarity given here.
*[[Mehrdad Honarkhah|Honarkhah]] and [[Jef Caers|Caers]] also use the assumption of stationarity in the context of multiple-point geostatistics, where higher ''n''-point statistics are assumed to be stationary in the spatial domain.<ref>{{cite journal |last1=Honarkhah |first1=M. |last2=Caers |first2=J. |year=2010 |doi=10.1007/s11004-010-9276-7 |title=Stochastic Simulation of Patterns Using Distance-Based Pattern Modeling |journal=Mathematical Geosciences |volume=42 |issue=5 |pages=487–517 |bibcode=2010MatGe..42..487H }}</ref>

==Differencing==
One way to make some time series stationary is to compute the differences between consecutive observations. This is known as [[unit root|differencing]]. Differencing can help stabilize the mean of a time series by removing changes in the level of a time series, and so eliminating trends. It can also remove seasonality, if differences are taken appropriately (e.g. differencing observations 1 year apart to remove a yearly pattern). Transformations such as logarithms can help to stabilize the variance of a time series.

One way to identify non-stationary time series is the [[Autocorrelation|ACF]] plot. Sometimes, patterns will be more visible in the ACF plot than in the original time series; however, this is not always the case.<ref>{{Cite book|url=https://www.otexts.org/fpp/8/1|chapter=8.1 Stationarity and differencing |title=Forecasting: Principles and Practice |edition=2nd |first1=Rob J. |last1=Hyndman |first2=George |last2=Athanasopoulos |publisher=OTexts |access-date=2016-05-18}}</ref>

Another approach to identifying non-stationarity is to look at the [[Laplace transform]] of a series, which will identify both exponential trends and sinusoidal seasonality (complex exponential trends). Related techniques from [[signal analysis]] such as the [[wavelet transform]] and [[Fourier transform]] may also be helpful.
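As a minimal sketch of this procedure (in Python; it assumes NumPy and the <code>statsmodels</code> package, and the simulated [[random walk]] is an arbitrary example), one can difference a unit-root process and apply the [[Augmented Dickey-Fuller test|ADF test]] before and after:

<syntaxhighlight lang="python">
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(3)

# A random walk has a unit root, so it is non-stationary ...
walk = np.cumsum(rng.normal(0.0, 1.0, 500))

# ... but its first difference recovers the white-noise increments.
diffed = np.diff(walk)

for name, series in [("level", walk), ("differenced", diffed)]:
    stat, pvalue = adfuller(series)[:2]
    print(f"{name}: ADF statistic = {stat:.2f}, p-value = {pvalue:.3f}")

# Typically the unit root cannot be rejected for the level series
# (large p-value) but is clearly rejected after differencing.
</syntaxhighlight>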
==See also==
* [[Lévy process]]
* [[Stationary ergodic process]]
* [[Wiener–Khinchin theorem]]
* [[Ergodicity]]
* [[Statistical regularity]]
* [[Autocorrelation]]
* [[Whittle likelihood]]

==References==
{{Reflist}}

==Further reading==
* {{cite book |last=Enders |first=Walter |title=Applied Econometric Time Series |location=New York |publisher=Wiley |year=2010 |edition=Third |isbn=978-0-470-50539-7 |pages=53–57 }}
* {{cite journal |last1=Jestrovic |first1=I. |last2=Coyle |first2=J. L. |last3=Sejdic |first3=E |year=2015 |doi=10.1016/j.brainres.2014.09.035 |title=The effects of increased fluid viscosity on stationary characteristics of EEG signal in healthy adults |journal=Brain Research |volume=1589 |pages=45–53 |pmid=25245522 |pmc=4253861}}
* {{cite book |last1=Hyndman |first1=Rob J. |last2=Athanasopoulos |first2=George |year=2013 |title=Forecasting: Principles and Practice |publisher=OTexts |url=https://www.otexts.org/fpp/8/1 }}

==External links==
* [https://encyclopediaofmath.org/wiki/Spectral_decomposition_of_a_random_function Spectral decomposition of a random function (Springer)]

{{Stochastic processes}}
{{Statistics|analysis}}

[[Category:Stochastic processes]]
[[Category:Signal processing]]