==Sketch of proof==
The proof of this tight inequality depends on the so-called '''(''q'', ''p'')-norm''' of the Fourier transformation. (Establishing this norm is the most difficult part of the proof.)

From this norm, one is able to establish a lower bound on the sum of the (differential) [[Rényi entropy|Rényi entropies]], {{math|''H<sub>α</sub>({{!}}f{{!}}²) + H<sub>β</sub>({{!}}g{{!}}²)''}}, where {{math|''1/α + 1/β'' {{=}} 2}}, which generalize the Shannon entropies. For simplicity, we consider this inequality only in one dimension; the extension to multiple dimensions is straightforward and can be found in the literature cited.

===Babenko–Beckner inequality===
The '''(''q'', ''p'')-norm''' of the Fourier transform is defined to be<ref name=Bialynicki>{{Cite journal | doi = 10.1103/PhysRevA.74.052101 | title = Formulation of the uncertainty relations in terms of the Rényi entropies | journal = Physical Review A | volume = 74 | issue = 5 | page = 052101 | year = 2006 | last1 = Bialynicki-Birula | first1 = I. | arxiv = quant-ph/0608116 | bibcode = 2006PhRvA..74e2101B | s2cid = 19123961}}</ref>

:<math>\|\mathcal F\|_{q,p} = \sup_{f\in L^p(\mathbb R)} \frac{\|\mathcal Ff\|_q}{\|f\|_p},</math>

where <math>1 < p \le 2</math> and <math>\frac 1 p + \frac 1 q = 1.</math>

In 1961, Babenko<ref>K.I. Babenko. ''An inequality in the theory of Fourier integrals.'' Izv. Akad. Nauk SSSR, Ser. Mat. '''25''' (1961) pp. 531–542; English transl., Amer. Math. Soc. Transl. (2) '''44''', pp. 115–128.</ref> found this norm for ''even'' integer values of ''q''. Finally, in 1975, using [[Hermite functions]] as eigenfunctions of the Fourier transform, Beckner<ref name=Beckner/> proved that the value of this norm (in one dimension) for all ''q'' ≥ 2 is

:<math>\|\mathcal F\|_{q,p} = \sqrt{p^{1/p}/q^{1/q}}.</math>

Thus we have the '''[[Babenko–Beckner inequality]]''',

:<math>\|\mathcal Ff\|_q \le \left(p^{1/p}/q^{1/q}\right)^{1/2} \|f\|_p.</math>

===Rényi entropy bound===
From this inequality, an expression of the uncertainty principle in terms of the [[Rényi entropy]] can be derived.<ref name=Bialynicki/><ref>H.P. Heinig and M. Smith, ''Extensions of the Heisenberg–Weil inequality.'' Internat. J. Math. & Math. Sci., Vol. 9, No. 1 (1986) pp. 185–192. [http://www.hindawi.com/GetArticle.aspx?doi=10.1155/S0161171286000212]</ref>

Let <math>g=\mathcal Ff</math>, <math>2\alpha=p</math>, and <math>2\beta=q</math>, so that <math>\frac1\alpha+\frac1\beta=2</math> and <math>\frac12\le\alpha\le1\le\beta</math>. The Babenko–Beckner inequality then reads

:<math>\left(\int_{\mathbb R} |g(y)|^{2\beta}\,dy\right)^{1/2\beta} \le \frac{(2\alpha)^{1/4\alpha}}{(2\beta)^{1/4\beta}} \left(\int_{\mathbb R} |f(x)|^{2\alpha}\,dx\right)^{1/2\alpha}.</math>

Squaring both sides and taking the logarithm, we get

:<math>\frac 1\beta \log\left(\int_{\mathbb R} |g(y)|^{2\beta}\,dy\right) \le \frac 1 2 \log\frac{(2\alpha)^{1/\alpha}}{(2\beta)^{1/\beta}} + \frac 1\alpha \log \left(\int_{\mathbb R} |f(x)|^{2\alpha}\,dx\right).</math>

The condition on <math>\alpha, \beta</math> can be rewritten as

:<math>\alpha(1-\beta)+\beta(1-\alpha)=0.</math>

Assuming <math>\alpha,\beta\ne1</math>, we multiply both sides by the negative quantity

:<math>\frac{\beta}{1-\beta}=-\frac{\alpha}{1-\alpha},</math>

which reverses the inequality, to obtain

:<math>\frac {1}{1-\beta} \log\left(\int_{\mathbb R} |g(y)|^{2\beta}\,dy\right) \ge \frac\alpha{2(\alpha-1)}\log\frac{(2\alpha)^{1/\alpha}}{(2\beta)^{1/\beta}} - \frac{1}{1-\alpha} \log \left(\int_{\mathbb R} |f(x)|^{2\alpha}\,dx\right)~.</math>

Rearranging terms yields an inequality in terms of the sum of the Rényi entropies,

:<math>\frac{1}{1-\alpha} \log \left(\int_{\mathbb R} |f(x)|^{2\alpha}\,dx\right) + \frac {1}{1-\beta} \log\left(\int_{\mathbb R} |g(y)|^{2\beta}\,dy\right) \ge \frac\alpha{2(\alpha-1)}\log\frac{(2\alpha)^{1/\alpha}}{(2\beta)^{1/\beta}};</math>

:<math> H_\alpha(|f|^2) + H_\beta(|g|^2) \ge \frac 1 2 \left(\frac{\log\alpha}{\alpha-1}+\frac{\log\beta}{\beta-1}\right) - \log 2.</math>

====Right-hand side====
The two right-hand sides above agree because the constraint <math>\frac1\alpha+\frac1\beta=2</math> implies <math>\frac{\alpha}{\alpha-1}=-\frac{\beta}{\beta-1}</math>, so that

:<math>\frac\alpha{2(\alpha-1)}\log\frac{(2\alpha)^{1/\alpha}}{(2\beta)^{1/\beta}}</math>
:<math>=\frac12\left[\frac{\alpha}{\alpha-1}\log(2\alpha)^{1/\alpha} + \frac{\beta}{\beta-1}\log(2\beta)^{1/\beta}\right]</math>
:<math>=\frac12\left[\frac{\log2\alpha}{\alpha-1} + \frac{\log2\beta}{\beta-1}\right]</math>
:<math>=\frac12\left[\frac{\log\alpha}{\alpha-1} + \frac{\log\beta}{\beta-1}\right] + \frac12\log2\left[\frac{1}{\alpha-1} + \frac{1}{\beta-1}\right]</math>
:<math>=\frac12\left[\frac{\log\alpha}{\alpha-1} + \frac{\log\beta}{\beta-1}\right] + \frac12\log2\left[\frac{1}{\alpha-1} + \frac{1}{\beta-1} - \frac{\alpha}{\alpha-1} - \frac{\beta}{\beta-1}\right]</math>
:<math>=\frac12\left[\frac{\log\alpha}{\alpha-1} + \frac{\log\beta}{\beta-1}\right] + \frac12\log2\left[-2\right]</math>
:<math>=\frac12\left[\frac{\log\alpha}{\alpha-1} + \frac{\log\beta}{\beta-1}\right] - \log2.</math>
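The Gaussian <math>f(x)=e^{-\pi x^2}</math> is its own Fourier transform under the <math>e^{-2\pi ixy}</math> convention and is an extremal function for the Babenko–Beckner inequality, so it attains the Rényi bound above with equality. The following minimal Python sketch checks this numerically; the integration grid and the sampled values of ''α'' are illustrative choices, not part of the cited derivation.

<syntaxhighlight lang="python">
# Minimal numerical check (illustrative grid and alpha values) of the Rényi bound
#   H_alpha(|f|^2) + H_beta(|g|^2) >= (1/2)(log(alpha)/(alpha-1) + log(beta)/(beta-1)) - log 2
# for f(x) = exp(-pi x^2), which equals its own Fourier transform under the
# g(y) = \int exp(-2 pi i x y) f(x) dx convention and saturates the bound.
import numpy as np

x = np.linspace(-10.0, 10.0, 20001)   # integration grid (assumed wide and fine enough)
dx = x[1] - x[0]
f2 = np.exp(-2.0 * np.pi * x**2)      # |f(x)|^2
g2 = f2.copy()                        # |g(y)|^2 = |f(y)|^2 for this Gaussian

def renyi(p2, order):
    """Differential Rényi entropy of the given order for the (possibly unnormalized)
    density p2; the combined bound is invariant under rescaling of f, so
    normalization is not required."""
    return np.log(np.sum(p2**order) * dx) / (1.0 - order)

for alpha in (0.6, 0.75, 0.9):
    beta = alpha / (2.0 * alpha - 1.0)   # enforces 1/alpha + 1/beta = 2
    lhs = renyi(f2, alpha) + renyi(g2, beta)
    rhs = 0.5 * (np.log(alpha) / (alpha - 1.0) + np.log(beta) / (beta - 1.0)) - np.log(2.0)
    print(f"alpha={alpha:.2f}  beta={beta:.3f}  LHS={lhs:.6f}  RHS={rhs:.6f}")
</syntaxhighlight>

With a sufficiently wide and fine grid, the printed left- and right-hand sides agree to several decimal places for each sampled ''α'', reflecting the equality case.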
===Shannon entropy bound===
Taking the limit of this last inequality as <math>\alpha, \, \beta \to 1</math> (with the substitutions <math>\Alpha=\alpha-1</math>, <math>\Beta=\beta-1</math>) yields the less general Shannon entropy inequality,

:<math>H(|f|^2) + H(|g|^2) \ge \log\frac e 2,\quad\textrm{where}\quad g(y) \approx \int_{\mathbb R} e^{-2\pi ixy}f(x)\,dx~,</math>

valid for any base of logarithm, as long as we choose an appropriate unit of information: [[bit]], [[Nat (unit)|nat]], etc.

The constant is different, though, for a different normalization of the Fourier transform (such as the one usually used in physics, with normalization chosen so that ''ħ'' = 1), i.e.,

:<math>H(|f|^2) + H(|g|^2) \ge \log(\pi e)\quad\textrm{for}\quad g(y) \approx \frac 1{\sqrt{2\pi}}\int_{\mathbb R} e^{-ixy}f(x)\,dx~.</math>

In this case, the dilation of the Fourier transform absolute squared by a factor of 2{{mvar|π}} simply adds log(2{{mvar|π}}) to its entropy.
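Both Shannon bounds are saturated by normalized Gaussians that equal their own Fourier transforms under the respective conventions. The minimal Python sketch below, with an illustrative integration grid, evaluates the differential entropies numerically and reproduces log(''e''/2) and log(π''e''); it is a sanity check, not part of the referenced proofs.

<syntaxhighlight lang="python">
# Minimal numerical check (illustrative grid) that normalized Gaussians attain the
# Shannon bounds log(e/2) and log(pi e) under the two Fourier conventions above.
import numpy as np

x = np.linspace(-15.0, 15.0, 30001)
dx = x[1] - x[0]

def shannon(p):
    """Differential Shannon entropy, minus the integral of p*log(p), for a
    probability density sampled on the grid."""
    p = np.clip(p, 1e-300, None)      # avoid log(0) in the far tails
    return -np.sum(p * np.log(p)) * dx

# Convention g(y) = \int exp(-2 pi i x y) f(x) dx:
# f(x) = 2^(1/4) exp(-pi x^2) is normalized and equals its own transform.
p = np.sqrt(2.0) * np.exp(-2.0 * np.pi * x**2)        # |f|^2 = |g|^2
print(shannon(p) + shannon(p), np.log(np.e / 2.0))    # both ~ 0.3069

# Physics convention g(y) = (2 pi)^(-1/2) \int exp(-i x y) f(x) dx:
# f(x) = pi^(-1/4) exp(-x^2 / 2) is normalized and equals its own transform.
q = np.exp(-x**2) / np.sqrt(np.pi)                    # |f|^2 = |g|^2
print(shannon(q) + shannon(q), np.log(np.pi * np.e))  # both ~ 2.1447
</syntaxhighlight>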