=== Uncertainty principle ===
{{Further|Uncertainty principle}}

Generally speaking, the more concentrated {{math|''f''(''x'')}} is, the more spread out its Fourier transform {{math|''f̂''(''ξ'')}} must be. In particular, the scaling property of the Fourier transform may be seen as saying: if we squeeze a function in {{mvar|x}}, its Fourier transform stretches out in {{mvar|ξ}}. It is not possible to arbitrarily concentrate both a function and its Fourier transform.

The trade-off between the compaction of a function and its Fourier transform can be formalized in the form of an [[uncertainty principle]] by viewing a function and its Fourier transform as [[conjugate variables]] with respect to the [[symplectic form]] on the [[time–frequency representation|time–frequency domain]]: from the point of view of the [[linear canonical transformation]], the Fourier transform is rotation by 90° in the time–frequency domain, and it preserves the [[Symplectic vector space|symplectic form]].

Suppose {{math|''f''(''x'')}} is an integrable and [[square-integrable]] function. Without loss of generality, assume that {{math|''f''(''x'')}} is normalized:
<math display="block">\int_{-\infty}^\infty |f(x)|^2 \,dx = 1.</math>
It follows from the [[Plancherel theorem]] that {{math|''f̂''(''ξ'')}} is also normalized.

The spread around {{math|''x'' {{=}} 0}} may be measured by the ''dispersion about zero'', defined by<ref>{{harvnb|Pinsky|2002|loc=chpt. 2.4.3 The Uncertainty Principle}}</ref>
<math display="block">D_0(f)=\int_{-\infty}^\infty x^2|f(x)|^2\,dx.</math>
In probability terms, this is the [[Moment (mathematics)|second moment]] of {{math|{{abs|''f''(''x'')}}<sup>2</sup>}} about zero.

The uncertainty principle states that, if {{math|''f''(''x'')}} is absolutely continuous and the functions {{math|''x''·''f''(''x'')}} and {{math|''f''{{′}}(''x'')}} are square integrable, then
<math display="block">D_0(f)D_0(\hat{f}) \geq \frac{1}{16\pi^2}.</math>
Equality is attained only in the case
<math display="block">\begin{align}
 f(x) &= C_1 \, e^{-\pi \frac{x^2}{\sigma^2}} \\
 \therefore \hat{f}(\xi) &= \sigma C_1 \, e^{-\pi\sigma^2\xi^2}
\end{align}</math>
where {{math|''σ'' > 0}} is arbitrary and {{math|1=''C''<sub>1</sub> = {{sfrac|{{radic|2|4}}|{{sqrt|''σ''}}}}}}, so that {{mvar|f}} is {{math|''L''<sup>2</sup>}}-normalized. In other words, {{mvar|f}} is a (normalized) [[Gaussian function]] with variance {{math|''σ''<sup>2</sup>/2{{pi}}}}, centered at zero, and its Fourier transform is a Gaussian function with variance {{math|''σ''<sup>−2</sup>/2{{pi}}}}. Gaussian functions are examples of [[Schwartz function]]s (see the discussion on tempered distributions below).

In fact, this inequality implies that
<math display="block">\left(\int_{-\infty}^\infty (x-x_0)^2|f(x)|^2\,dx\right)\left(\int_{-\infty}^\infty(\xi-\xi_0)^2\left|\hat{f}(\xi)\right|^2\,d\xi\right)\geq \frac{1}{16\pi^2}, \quad \forall x_0, \xi_0 \in \mathbb{R}.</math>

In [[quantum mechanics]], the [[momentum]] and position [[wave function]]s are Fourier transform pairs, up to a factor of the [[Planck constant]]. With this constant properly taken into account, the inequality above becomes the statement of the [[Heisenberg uncertainty principle]].<ref>{{harvnb|Stein|Shakarchi|2003|loc=chpt. 5.4 The Heisenberg uncertainty principle}}</ref>
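The equality case can be checked numerically. The following is a minimal sketch in Python with NumPy (the grid extent and the choice {{math|''σ'' {{=}} 1.7}} are arbitrary test parameters, not taken from the text), evaluating both dispersions for the Gaussian pair above by the trapezoidal rule:

<syntaxhighlight lang="python">
import numpy as np

# Gaussian equality pair from the text:
#   f(x)      = C1 * exp(-pi * x**2 / sigma**2)
#   f_hat(xi) = sigma * C1 * exp(-pi * sigma**2 * xi**2)
# with C1 = 2**(1/4) / sqrt(sigma), so that f is L^2-normalized.
sigma = 1.7                        # arbitrary sigma > 0
C1 = 2 ** 0.25 / np.sqrt(sigma)

x = np.linspace(-40, 40, 400_001)  # wide grid; both integrands decay fast
f = C1 * np.exp(-np.pi * x**2 / sigma**2)
f_hat = sigma * C1 * np.exp(-np.pi * sigma**2 * x**2)

def dispersion_about_zero(g, t):
    """D0(g) = integral of t**2 * |g(t)|**2 dt, by the trapezoidal rule."""
    return np.trapz(t**2 * np.abs(g) ** 2, t)

print(np.trapz(np.abs(f) ** 2, x))    # ~1.0: f is normalized
product = dispersion_about_zero(f, x) * dispersion_about_zero(f_hat, x)
print(product, 1 / (16 * np.pi**2))   # both ~0.0063326
</syntaxhighlight>

The printed product should agree with {{math|1/16''π''<sup>2</sup> ≈ 0.0063326}} up to quadrature error; any non-Gaussian normalized {{mvar|f}} substituted here should give a strictly larger product.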
A stronger uncertainty principle is the [[Hirschman uncertainty|Hirschman uncertainty principle]], which is expressed as
<math display="block">H\left(\left|f\right|^2\right)+H\left(\left|\hat{f}\right|^2\right)\ge \log\left(\frac{e}{2}\right)</math>
where {{math|''H''(''p'')}} is the [[differential entropy]] of the [[probability density function]] {{math|''p''(''x'')}}:
<math display="block">H(p) = -\int_{-\infty}^\infty p(x)\log\bigl(p(x)\bigr) \, dx,</math>
where the logarithms may be taken in any base, provided the same base is used consistently. As in the previous case, equality is attained for a Gaussian.
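The Hirschman bound can be checked in the same way; because the test pair is Gaussian, equality is expected. Again a sketch with the same arbitrary parameters, using natural logarithms:

<syntaxhighlight lang="python">
import numpy as np

sigma = 1.7
C1 = 2 ** 0.25 / np.sqrt(sigma)

t = np.linspace(-40, 40, 400_001)
p = (C1 * np.exp(-np.pi * t**2 / sigma**2)) ** 2          # |f(x)|^2
q = (sigma * C1 * np.exp(-np.pi * sigma**2 * t**2)) ** 2  # |f_hat(xi)|^2

def differential_entropy(density, t):
    """H(p) = -integral of p * log(p) dt, skipping points where p underflows."""
    mask = density > 1e-300
    return -np.trapz(density[mask] * np.log(density[mask]), t[mask])

lhs = differential_entropy(p, t) + differential_entropy(q, t)
print(lhs, np.log(np.e / 2))   # both ~0.30685: equality for the Gaussian pair
</syntaxhighlight>

The entropy sum matches {{math|log(''e''/2) {{=}} 1 − log 2 ≈ 0.30685}} because the Gaussian attains equality; other normalized densities give a strictly larger sum.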