===Convergence to the limit===
The central limit theorem gives only an [[asymptotic distribution]]. As an approximation for a finite number of observations, it is reasonable only near the peak of the normal distribution; a very large number of observations is required for the approximation to extend into the tails.{{citation needed|reason=Not immediately obvious, I didn't find a source via google|date=July 2016}}

The convergence in the central limit theorem is [[uniform convergence|uniform]] because the limiting cumulative distribution function is continuous. If the third central [[Moment (mathematics)|moment]] <math display="inline">\operatorname{E}\left[(X_1 - \mu)^3\right]</math> exists and is finite, then the speed of convergence is at least of order <math display="inline">1 / \sqrt{n}</math> (see the [[Berry–Esseen theorem]]). [[Stein's method]]<ref name="stein1972">{{Cite journal| last = Stein |first=C. |author-link=Charles Stein (statistician)| title = A bound for the error in the normal approximation to the distribution of a sum of dependent random variables| journal = Proceedings of the Sixth Berkeley Symposium on Mathematical Statistics and Probability| pages= 583–602| year = 1972|volume=6 |issue=2 | mr=402873 | zbl = 0278.60026| url=http://projecteuclid.org/euclid.bsmsp/1200514239 }}</ref> can be used not only to prove the central limit theorem, but also to provide bounds on the rate of convergence for selected metrics.<ref>{{Cite book| title = Normal approximation by Stein's method| publisher = Springer| year = 2011|last1=Chen |first1=L. H. Y. |last2=Goldstein |first2=L. |last3=Shao |first3=Q. M. |isbn = 978-3-642-15006-7}}</ref>

The convergence to the normal distribution is monotonic, in the sense that the [[information entropy|entropy]] of <math display="inline">Z_n</math> increases [[monotonic function|monotonically]] to that of the normal distribution.<ref name=ABBN/>

The central limit theorem applies in particular to sums of independent and identically distributed [[discrete random variable]]s. A sum of discrete random variables is still a discrete random variable, so we are confronted with a sequence of discrete random variables whose cumulative distribution function converges towards the cumulative distribution function of a continuous variable (namely that of the [[normal distribution]]). This means that if we build a [[histogram]] of the realizations of the sum of {{mvar|n}} independent identical discrete variables, the piecewise-linear curve joining the centers of the upper faces of the histogram's rectangles converges toward a Gaussian curve as {{mvar|n}} approaches infinity; for Bernoulli variables, this relation is known as the [[de Moivre–Laplace theorem]]. The [[binomial distribution]] article details such an application of the central limit theorem in the simple case of a discrete variable taking only two possible values.
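The <math display="inline">1/\sqrt{n}</math> rate and the discrete-to-continuous convergence described above can be illustrated numerically. The following sketch (not part of the article; a simulation using only Python's standard library, with the function name <code>max_cdf_gap</code> chosen for illustration) estimates the Kolmogorov distance between the standardized sum of {{mvar|n}} fair Bernoulli draws and the standard normal distribution. Consistent with the Berry–Esseen theorem, the gap is visibly smaller for larger {{mvar|n}}:

```python
# Sketch: empirical Kolmogorov distance between a standardized
# Bernoulli(1/2) sum and the standard normal (de Moivre–Laplace setting).
import math
import random

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def max_cdf_gap(n, trials=10000, seed=0):
    """Estimate sup_x |F_n(x) - Phi(x)| by simulation, where F_n is the
    empirical CDF of the standardized sum of n fair Bernoulli draws."""
    rng = random.Random(seed)
    mu, sigma = 0.5, 0.5  # mean and standard deviation of one Bernoulli(1/2)
    zs = sorted(
        (sum(rng.random() < 0.5 for _ in range(n)) - n * mu)
        / (sigma * math.sqrt(n))
        for _ in range(trials)
    )
    # Compare the empirical CDF with the normal CDF at each sample point.
    return max(abs((i + 1) / trials - normal_cdf(z)) for i, z in enumerate(zs))

# The gap shrinks as n grows, roughly like 1/sqrt(n) for skewed summands;
# for n = 4 the discreteness of the sum keeps the gap large.
print(max_cdf_gap(4), max_cdf_gap(100))
```

Because the summands here are symmetric, the discreteness of the sum (rather than skewness) dominates the error for small {{mvar|n}}; the Berry–Esseen bound is an upper bound in either case.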