===Stationary case===
For a stationary Gaussian process <math>X=(X_t)_{t\in\R},</math> some conditions on its spectrum are sufficient for sample continuity, but fail to be necessary. A necessary and sufficient condition, sometimes called the Dudley–Fernique theorem, involves the function <math>\sigma</math> defined by
<math display="block"> \sigma(h) = \sqrt{ {\mathbb E} \big[ \big( X(t+h) - X(t) \big)^2 \big] } </math>
(the right-hand side does not depend on <math>t</math> due to stationarity). Continuity of <math>X</math> in probability is equivalent to continuity of <math>\sigma</math> at <math>0.</math> When convergence of <math>\sigma(h)</math> to <math>0</math> (as <math>h\to 0</math>) is too slow, sample continuity of <math>X</math> may fail. What matters is convergence of the following integral:
<math display="block"> I(\sigma) = \int_0^1 \frac{ \sigma(h) }{ h \sqrt{ \log(1/h) } } \, dh = \int_0^\infty 2\sigma( e^{-x^2}) \, dx ,</math>
the two expressions being equal according to [[integration by substitution]] with <math display="inline"> h = e^{-x^2}, </math> <math display="inline">x = \sqrt{\log(1/h)}</math> (worked out below). The first integrand need not be bounded as <math>h\to 0+,</math> thus the integral may converge (<math>I(\sigma)<\infty</math>) or diverge (<math>I(\sigma)=\infty</math>). Taking for example <math display="inline">\sigma( e^{-x^2}) = \tfrac{1}{x^a}</math> for large <math>x,</math> that is, <math display="inline">\sigma(h) = (\log(1/h))^{-a/2}</math> for small <math>h,</math> one obtains <math>I(\sigma)<\infty</math> when <math>a>1,</math> and <math>I(\sigma)=\infty</math> when <math>0 < a\le 1.</math>

In these two cases the function <math>\sigma</math> is increasing on <math>[0,\infty),</math> but in general it need not be. Moreover, the condition
{{block indent | em = 1.5 | text ={{vanchor|(*)}} there exists <math>\varepsilon > 0</math> such that <math>\sigma</math> is monotone on <math>[0,\varepsilon]</math>}}
does not follow from continuity of <math>\sigma</math> and the evident relations <math>\sigma(h) \ge 0</math> (for all <math>h</math>) and <math>\sigma(0) = 0.</math>

{{math theorem | name = Theorem 1 | math_statement = Let <math>\sigma</math> be continuous and satisfy [[#(*)|(*)]]. Then the condition <math>I(\sigma) < \infty</math> is necessary and sufficient for sample continuity of <math>X.</math>}}

Some history:<ref name=MarcusShepp72 />{{rp|424}} sufficiency was announced by [[Xavier Fernique]] in 1964, but the first proof was published by [[Richard M. Dudley]] in 1967.<ref name=Dudley67 />{{rp|Theorem 7.1}} Necessity was proved by Michael B. Marcus and [[Lawrence Shepp]] in 1970.<ref name=MarcusShepp70>{{cite journal |first1=Michael B. |last1=Marcus |first2=Lawrence A. |last2=Shepp |author-link2=Lawrence Shepp |title=Continuity of Gaussian processes |journal=[[Transactions of the American Mathematical Society]] |volume=151 |issue=2 |pages=377–391 |year=1970 |doi=10.1090/s0002-9947-1970-0264749-1 |jstor=1995502 |doi-access=free }}</ref>{{rp|380}}

There exist sample continuous processes <math>X</math> such that <math>I(\sigma)=\infty;</math> they violate condition [[#(*)|(*)]].
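
The equality of the two forms of <math>I(\sigma)</math> and the dichotomy in the example above can be checked in outline as follows. The substitution <math display="inline">h = e^{-x^2}</math> gives <math display="inline">dh = -2x e^{-x^2} \, dx</math> and <math display="inline">\sqrt{\log(1/h)} = x,</math> so
<math display="block"> \int_0^1 \frac{ \sigma(h) }{ h \sqrt{ \log(1/h) } } \, dh = \int_\infty^0 \frac{ \sigma(e^{-x^2}) }{ e^{-x^2} \, x } \big( -2 x e^{-x^2} \big) \, dx = \int_0^\infty 2 \sigma(e^{-x^2}) \, dx .</math>
When <math display="inline">\sigma(e^{-x^2}) = x^{-a}</math> for large <math>x,</math> the second form behaves at infinity like <math display="inline">\int^\infty 2 x^{-a} \, dx,</math> while the contribution of bounded <math>x</math> is finite because <math>\sigma</math> is bounded for a stationary process; hence <math>I(\sigma)<\infty</math> exactly when <math>a>1.</math>
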
An example of such a process, found by Marcus and Shepp,<ref name=MarcusShepp70 />{{rp|387}} is a random [[Lacunary function#Lacunary trigonometric series|lacunary Fourier series]]
<math display="block"> X_t = \sum_{n=1}^\infty c_n ( \xi_n \cos \lambda_n t + \eta_n \sin \lambda_n t ) ,</math>
where <math>\xi_1,\eta_1,\xi_2,\eta_2,\dots</math> are independent random variables with [[Normal distribution#Standard normal distribution|standard normal distribution]]; the frequencies <math>0<\lambda_1<\lambda_2<\dots</math> are a fast-growing sequence; and the coefficients <math>c_n>0</math> satisfy <math display="inline">\sum_n c_n < \infty.</math> The latter relation implies <math display="inline">{\mathbb E} \sum_n c_n ( |\xi_n| + |\eta_n| ) = \sum_n c_n {\mathbb E} [ |\xi_n| + |\eta_n| ] = \text{const} \cdot \sum_n c_n < \infty,</math> whence <math display="inline">\sum_n c_n ( |\xi_n| + |\eta_n| ) < \infty</math> almost surely, which ensures uniform convergence of the Fourier series almost surely, and hence sample continuity of <math>X.</math>

[[File:Autocorrelation of a random lacunary Fourier series.svg|thumb|411px|Autocorrelation of a random lacunary Fourier series]]
Its autocovariance function
<math display="block"> {\mathbb E}[X_t X_{t+h}] = \sum_{n=1}^\infty c_n^2 \cos \lambda_n h </math>
is nowhere monotone (see the picture), and so is the corresponding function <math>\sigma,</math>
<math display="block"> \sigma(h) = \sqrt{ 2 {\mathbb E}[X_t X_t] - 2 {\mathbb E}[X_t X_{t+h}] } = 2 \sqrt{ \sum_{n=1}^\infty c_n^2 \sin^2 \frac{\lambda_n h}2 } .</math>
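
In outline, these formulas follow from the independence of the coefficients and <math display="inline">{\mathbb E}[\xi_n^2] = {\mathbb E}[\eta_n^2] = 1</math>: for each <math>n,</math>
<math display="block"> {\mathbb E}\big[ ( \xi_n \cos \lambda_n t + \eta_n \sin \lambda_n t )( \xi_n \cos \lambda_n (t+h) + \eta_n \sin \lambda_n (t+h) ) \big] = \cos \lambda_n t \, \cos \lambda_n (t+h) + \sin \lambda_n t \, \sin \lambda_n (t+h) = \cos \lambda_n h ,</math>
the cross terms vanishing since <math display="inline">{\mathbb E}[\xi_n \eta_n] = 0.</math> Summing over <math>n</math> gives <math display="inline">{\mathbb E}[X_t X_{t+h}] = \sum_n c_n^2 \cos \lambda_n h,</math> and therefore
<math display="block"> \sigma^2(h) = 2\,{\mathbb E}[X_t X_t] - 2\,{\mathbb E}[X_t X_{t+h}] = 2 \sum_{n=1}^\infty c_n^2 ( 1 - \cos \lambda_n h ) = 4 \sum_{n=1}^\infty c_n^2 \sin^2 \frac{\lambda_n h}2 ,</math>
in agreement with the formula above.
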