Gaussian process
==Continuity==
For a Gaussian process, [[Continuous stochastic process#Continuity in probability|continuity in probability]] is equivalent to [[Continuous stochastic process#Mean-square continuity|mean-square continuity]]<ref>{{cite book |chapter-url=https://www.mathunion.org/fileadmin/ICM/Proceedings/ICM1974.2/ICM1974.2.ocr.pdf |title=Proceedings of the International Congress of Mathematicians |first=R. M. |last=Dudley |author-link=Richard M. Dudley |year=1975 |volume=2 |pages=143–146 |chapter=The Gaussian process and how to approach it}}</ref>{{rp|145}}<ref name = "banerjeeGelfandSmoothness">{{cite journal | first1 = Sudipto | last1 = Banerjee | first2 = Alan E. | last2 = Gelfand | title = On smoothness properties of spatial processes | journal = Journal of Multivariate Analysis | year = 2003 | volume = 84 | issue = 1 | pages = 85–100 | doi = 10.1016/S0047-259X(02)00016-7 | url = https://doi.org/10.1016/S0047-259X(02)00016-7}}</ref> and [[Continuous stochastic process#Continuity with probability one|continuity with probability one]] is equivalent to [[Continuous stochastic process#Sample continuity|sample continuity]].<ref>{{cite book |first=R. M. |last=Dudley |title=Selected Works of R.M. Dudley |chapter=Sample Functions of the Gaussian Process |date=2010 |author-link=Richard M. Dudley |journal=[[Annals of Probability]] |volume=1 |issue=1 |pages=66–103 |doi=10.1007/978-1-4419-5821-1_13 |isbn=978-1-4419-5820-4 |chapter-url=http://projecteuclid.org/euclid.aop/1176997026 }}</ref> The latter implies, but is not implied by, continuity in probability. Continuity in probability holds if and only if the [[autocovariance|mean and autocovariance]] are continuous functions.
In contrast, sample continuity was challenging even for [[#Stationarity|stationary Gaussian processes]] (as probably noted first by [[Andrey Kolmogorov]]), and more challenging for more general processes.<ref>{{cite book |last=Talagrand |first=Michel |author-link=Michel Talagrand |title=Upper and lower bounds for stochastic processes: modern methods and classical problems |year=2014 |publisher=Springer, Heidelberg |isbn=978-3-642-54074-5 |url=https://www.springer.com/gp/book/9783642540745|series=Ergebnisse der Mathematik und ihrer Grenzgebiete. 3. Folge / A Series of Modern Surveys in Mathematics }}</ref>{{rp|Sect. 2.8}} <ref>{{citation | last = Ledoux | first = Michel | editor1-last = Dobrushin | editor1-first = Roland | editor2-last = Groeneboom | editor2-first = Piet | editor3-last = Ledoux | editor3-first = Michel | contribution = Isoperimetry and Gaussian analysis | doi = 10.1007/BFb0095676 | location = Berlin | mr = 1600888 | pages = 165–294 | publisher = Springer | series = Lecture Notes in Mathematics | title = Lectures on Probability Theory and Statistics: Ecole d'Eté de Probabilités de Saint-Flour XXIV–1994 | volume = 1648 | isbn = 978-3-540-62055-6 | year = 1996}}</ref>{{rp|69,81}} <ref>{{cite book | last = Adler | first = Robert J. | title = An Introduction to Continuity, Extrema, and Related Topics for General Gaussian Processes | journal = Lecture Notes-Monograph Series | isbn = 0-940600-17-X | jstor = 4355563 | location = Hayward, California | mr = 1088478 | publisher = Institute of Mathematical Statistics | volume = 12 | year = 1990}}</ref>{{rp|80}} <ref>{{cite journal |title=Review of: Adler 1990 'An introduction to continuity...' |first=Simeon M. |last=Berman |date=1992 |journal=Mathematical Reviews|mr = 1088478}}</ref> As usual, by a sample continuous process one means a process that admits a sample continuous [[Stochastic process#Modification|modification]]. <ref name=Dudley67>{{cite journal |first=R. M. |last=Dudley |author-link=Richard M. 
Dudley |title=The sizes of compact subsets of Hilbert space and continuity of Gaussian processes |journal=Journal of Functional Analysis |volume=1 |issue= 3|pages=290–330 |year=1967|doi=10.1016/0022-1236(67)90017-1 |doi-access=free }}</ref>{{rp|292}} <ref name=MarcusShepp72>{{cite book |chapter-url=https://projecteuclid.org/euclid.bsmsp/1200514231 |title=Proceedings of the sixth Berkeley symposium on mathematical statistics and probability, vol. II: probability theory |first1=M.B. |last1=Marcus |first2=Lawrence A. |last2=Shepp |author-link2=Lawrence Shepp |year=1972 |publisher=Univ. California, Berkeley |pages=423–441 |chapter=Sample behavior of Gaussian processes|volume=6 |issue=2 }}</ref>{{rp|424}}

===Stationary case===
For a stationary Gaussian process <math>X=(X_t)_{t\in\R},</math> some conditions on its spectrum are sufficient for sample continuity, but fail to be necessary. A necessary and sufficient condition, sometimes called the Dudley–Fernique theorem, involves the function <math>\sigma</math> defined by <math display="block"> \sigma(h) = \sqrt{ {\mathbb E} \big[ (X(t+h) - X(t))^2 \big] } </math> (the right-hand side does not depend on <math>t</math> due to stationarity). Continuity of <math>X</math> in probability is equivalent to continuity of <math>\sigma</math> at <math>0.</math> When convergence of <math>\sigma(h)</math> to <math>0</math> (as <math>h\to 0</math>) is too slow, sample continuity of <math>X</math> may fail. Convergence of the following integral matters: <math display="block"> I(\sigma) = \int_0^1 \frac{ \sigma(h) }{ h \sqrt{ \log(1/h) } } \, dh = \int_0^\infty 2\sigma( e^{-x^2}) \, dx ,</math> the two forms being equal according to [[integration by substitution]] <math display="inline"> h = e^{-x^2}, </math> <math display="inline">x = \sqrt{\log(1/h)} .</math> The first integrand need not be bounded as <math>h\to 0+,</math> thus the integral may converge (<math>I(\sigma)<\infty</math>) or diverge (<math>I(\sigma)=\infty</math>).
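The change of variables above can be sanity-checked numerically. The sketch below (not from the cited sources) uses the toy choice <math>\sigma(h)=h</math>, for which the first integrand becomes <math>1/\sqrt{\log(1/h)}</math> and the second becomes the Gaussian integral <math>2e^{-x^2}</math>, so both forms of <math>I(\sigma)</math> equal <math>\sqrt\pi</math>:

```python
import math

# Midpoint-rule evaluation of both forms of I(sigma) for the toy choice
# sigma(h) = h; by the substitution h = exp(-x^2) they must agree, and
# both equal Gamma(1/2) = sqrt(pi).

def first_form(n=500_000):
    # I = int_0^1 sigma(h) / (h * sqrt(log(1/h))) dh with sigma(h) = h,
    # i.e. int_0^1 dh / sqrt(log(1/h)); integrable singularity at h = 1.
    total, dh = 0.0, 1.0 / n
    for i in range(n):
        h = (i + 0.5) * dh
        total += dh / math.sqrt(math.log(1.0 / h))
    return total

def second_form(cutoff=8.0, n=100_000):
    # I = int_0^infty 2 * sigma(exp(-x^2)) dx with sigma(h) = h,
    # i.e. int_0^infty 2 * exp(-x^2) dx, truncated where the tail is tiny.
    total, dx = 0.0, cutoff / n
    for i in range(n):
        x = (i + 0.5) * dx
        total += 2.0 * math.exp(-x * x) * dx
    return total

print(first_form(), second_form(), math.sqrt(math.pi))
```

Both printed values agree with <math>\sqrt\pi\approx 1.7725</math> up to discretization error.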
Taking for example <math display="inline">\sigma( e^{-x^2}) = \tfrac{1}{x^a}</math> for large <math>x,</math> that is, <math display="inline">\sigma(h) = (\log(1/h))^{-a/2}</math> for small <math>h,</math> one obtains <math>I(\sigma)<\infty</math> when <math>a>1,</math> and <math>I(\sigma)=\infty</math> when <math>0 < a\le 1.</math> In these two cases the function <math>\sigma</math> is increasing on <math>[0,\infty),</math> but generally it is not. Moreover, the condition {{block indent | em = 1.5 | text ={{vanchor|(*)}} there exists <math>\varepsilon > 0</math> such that <math>\sigma</math> is monotone on <math>[0,\varepsilon]</math>}} does not follow from continuity of <math>\sigma</math> and the evident relations <math>\sigma(h) \ge 0</math> (for all <math>h</math>) and <math>\sigma(0) = 0.</math> {{math theorem | name = Theorem 1 | math_statement = Let <math>\sigma</math> be continuous and satisfy [[#(*)|(*)]]. Then the condition <math>I(\sigma) < \infty</math> is necessary and sufficient for sample continuity of <math>X.</math>}} Some history.<ref name=MarcusShepp72 />{{rp|424}} Sufficiency was announced by [[Xavier Fernique]] in 1964, but the first proof was published by [[Richard M. Dudley]] in 1967.<ref name=Dudley67 />{{rp|Theorem 7.1}} Necessity was proved by Michael B. Marcus and [[Lawrence Shepp]] in 1970.<ref name=MarcusShepp70>{{cite journal |first1=Michael B. |last1=Marcus |first2=Lawrence A. |last2=Shepp |author-link2=Lawrence Shepp |title=Continuity of Gaussian processes |journal=[[Transactions of the American Mathematical Society]] |volume=151 |issue= 2|pages=377–391 |year=1970 |doi=10.1090/s0002-9947-1970-0264749-1 |jstor=1995502 |doi-access=free }}</ref>{{rp|380}} There exist sample continuous processes <math>X</math> such that <math>I(\sigma)=\infty;</math> they violate condition [[#(*)|(*)]]. 
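The threshold at <math>a=1</math> in the example above can be illustrated numerically. The sketch below (illustrative, not from the cited sources) evaluates the tail of the second form of <math>I(\sigma)</math> for <math>\sigma(e^{-x^2})=x^{-a}</math> on <math>x\ge 1</math>, i.e. <math display="inline">\int_1^X 2x^{-a}\,dx</math>: for <math>a>1</math> it converges to <math>2/(a-1)</math>, while for <math>a=1</math> it grows like <math>2\log X</math>:

```python
import math

# Tail of the second form of I(sigma) for sigma(exp(-x^2)) = x^(-a), x >= 1:
# int_1^X 2*x^(-a) dx.  Converges (to 2/(a-1)) iff a > 1.

def tail_integral(a, upper, n=200_000):
    # Midpoint rule after the substitution x = e^u, which maps the
    # integral to int_0^{log(upper)} 2*exp((1-a)*u) du.
    total, du = 0.0, math.log(upper) / n
    for i in range(n):
        u = (i + 0.5) * du
        total += 2.0 * math.exp((1.0 - a) * u) * du
    return total

print(tail_integral(2.0, 1e6))   # a > 1: close to 2/(a-1) = 2
print(tail_integral(1.0, 1e6))   # a = 1: about 2*log(1e6), unbounded in X
```

Increasing the cutoff <code>upper</code> leaves the <math>a=2</math> value essentially unchanged but keeps growing the <math>a=1</math> value, matching <math>I(\sigma)<\infty</math> for <math>a>1</math> and <math>I(\sigma)=\infty</math> for <math>a\le 1</math>.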
An example found by Marcus and Shepp<ref name=MarcusShepp70 />{{rp|387}} is a random [[Lacunary function#Lacunary trigonometric series|lacunary Fourier series]] <math display="block"> X_t = \sum_{n=1}^\infty c_n ( \xi_n \cos \lambda_n t + \eta_n \sin \lambda_n t ) ,</math> where <math>\xi_1,\eta_1,\xi_2,\eta_2,\dots</math> are independent random variables with [[Normal distribution#Standard normal distribution|standard normal distribution]]; frequencies <math>0<\lambda_1<\lambda_2<\dots</math> are a fast growing sequence; and coefficients <math>c_n>0</math> satisfy <math display="inline">\sum_n c_n < \infty.</math> The latter relation implies <math display="inline">{\mathbb E} \sum_n c_n ( |\xi_n| + |\eta_n| ) = \sum_n c_n {\mathbb E} [ |\xi_n| + |\eta_n| ] = \text{const} \cdot \sum_n c_n < \infty,</math> whence <math display="inline">\sum_n c_n ( |\xi_n| + |\eta_n| ) < \infty</math> almost surely, which ensures uniform convergence of the Fourier series almost surely, and sample continuity of <math>X.</math> [[File:Autocorrelation of a random lacunary Fourier series.svg|thumb|411px|Autocorrelation of a random lacunary Fourier series]] Its autocovariance function <math display="block"> {\mathbb E}[X_t X_{t+h}] = \sum_{n=1}^\infty c_n^2 \cos \lambda_n h </math> is nowhere monotone (see the picture), as well as the corresponding function <math>\sigma,</math> <math display="block"> \sigma(h) = \sqrt{ 2 {\mathbb E}[X_t X_t] - 2 {\mathbb E}[X_t X_{t+h}] } = 2 \sqrt{ \sum_{n=1}^\infty c_n^2 \sin^2 \frac{\lambda_n h}2 } .</math>
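The autocovariance formula can be checked by simulation. The sketch below (with illustrative choices of <math>c_n</math> and <math>\lambda_n</math>, not the specific sequences of Marcus and Shepp) draws many realizations of a truncated series and compares the empirical value of <math>{\mathbb E}[X_t X_{t+h}]</math> with <math display="inline">\sum_n c_n^2 \cos \lambda_n h</math>:

```python
import math, random

# Monte Carlo check of E[X_t X_{t+h}] = sum_n c_n^2 cos(lambda_n h) for a
# truncated random lacunary Fourier series.  The coefficient and frequency
# sequences here are illustrative choices.

random.seed(0)
c = [2.0 ** (-n) for n in range(1, 6)]        # c_n > 0 with sum c_n < infinity
lam = [4.0 ** n for n in range(1, 6)]         # rapidly growing frequencies

def draw_path():
    # One realization: fix xi_n, eta_n, then evaluate X at any time t.
    xi = [random.gauss(0.0, 1.0) for _ in c]
    eta = [random.gauss(0.0, 1.0) for _ in c]
    def X(t):
        return sum(cn * (x * math.cos(l * t) + e * math.sin(l * t))
                   for cn, l, x, e in zip(c, lam, xi, eta))
    return X

t, h, N = 0.7, 0.3, 100_000
mc = 0.0
for _ in range(N):
    X = draw_path()
    mc += X(t) * X(t + h)
mc /= N                                        # empirical E[X_t X_{t+h}]
exact = sum(cn * cn * math.cos(l * h) for cn, l in zip(c, lam))
print(mc, exact)
```

The empirical and exact values agree up to Monte Carlo error; note the result depends on <math>h</math> but not on <math>t</math>, reflecting stationarity.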