Central limit theorem

In probability theory, the central limit theorem (CLT) establishes that, in some situations, when independent random variables are added, their properly normalized sum tends toward a normal distribution (informally a "bell curve") even if the original variables themselves are not normally distributed.
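To see the theorem in action, here is a minimal simulation sketch (assuming NumPy; the exponential distribution and the sample sizes are illustrative choices, not taken from the text). It standardizes sums of a clearly non-normal distribution and checks that they behave like a standard normal:

```python
# Minimal CLT simulation sketch (NumPy assumed; parameters illustrative).
import numpy as np

rng = np.random.default_rng(0)
n, trials = 1_000, 10_000            # summands per trial, number of trials

# Exponential(1) is clearly non-normal: skewed, supported on [0, inf),
# with mean mu = 1 and standard deviation sigma = 1.
x = rng.exponential(scale=1.0, size=(trials, n))

# Standardize the sums: (S_n - n*mu) / (sigma * sqrt(n)).
z = (x.sum(axis=1) - n) / np.sqrt(n)

# The standardized sums should look approximately standard normal.
print("mean ~ 0:", z.mean())
print("std  ~ 1:", z.std())
print("P(|Z| > 1.96) ~ 0.05:", np.mean(np.abs(z) > 1.96))
```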
Related articles

Normal distribution

The normal distribution is useful because of the central limit theorem.

Probability theory

Two major results in probability theory describing such behaviour are the law of large numbers and the central limit theorem.

De Moivre–Laplace theorem

The earliest version of this theorem, that the normal distribution may be used as an approximation to the binomial distribution, is now known as the de Moivre–Laplace theorem.
In probability theory, the de Moivre–Laplace theorem, which is a special case of the central limit theorem, states that the normal distribution may be used as an approximation to the binomial distribution under certain conditions.
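A quick numerical sketch of the approximation (standard library only; n = 100 and p = 0.3 are illustrative choices): the Binomial(n, p) pmf at k is compared with the N(np, np(1−p)) density evaluated at k.

```python
# De Moivre–Laplace sketch: binomial pmf vs. normal density (illustrative).
import math

n, p = 100, 0.3
mu, sigma = n * p, math.sqrt(n * p * (1 - p))

for k in (20, 25, 30, 35, 40):
    binom = math.comb(n, k) * p**k * (1 - p)**(n - k)
    normal = math.exp(-((k - mu) ** 2) / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))
    print(f"k={k}: binomial={binom:.5f}  normal approx={normal:.5f}")
```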

Stable distribution

By the classical central limit theorem, the properly normed sum of a set of random variables, each with finite variance, will tend toward a normal distribution as the number of variables increases. In contrast, the sum of a number of i.i.d. random variables with power-law tail distributions decreasing as |x|^(−α−1) where 0 < α < 2 (and therefore having infinite variance) will tend to an alpha-stable distribution with stability parameter (or index of stability) α as the number of variables grows.
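A small sketch of the contrast (NumPy assumed; the tail index α = 1.5 is an illustrative choice with 0 < α < 2): sums of heavy-tailed Pareto-type variables remain dominated by rare extreme outliers, while sums of finite-variance variables do not.

```python
# Contrast finite- vs. infinite-variance summands (parameters illustrative).
import numpy as np

rng = np.random.default_rng(1)
n, trials, alpha = 1_000, 10_000, 1.5

# NumPy's pareto(a) has density a / (1 + x)**(a + 1): for a < 2 the
# variance is infinite, so the classical CLT does not apply.
heavy_sums = rng.pareto(alpha, size=(trials, n)).sum(axis=1)
light_sums = rng.exponential(1.0, size=(trials, n)).sum(axis=1)

# A single extreme draw dominates the heavy-tailed sums.
for name, s in (("pareto", heavy_sums), ("exponential", light_sums)):
    print(name, "max/median of sums:", s.max() / np.median(s))
```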

Multivariate normal distribution

The multidimensional central limit theorem states that when scaled, sums converge to a multivariate normal distribution.
Its importance derives mainly from the multivariate central limit theorem.
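In symbols (a standard statement of the multivariate CLT, added here for reference): for i.i.d. random vectors X_i in R^k with mean vector µ and covariance matrix Σ,

```latex
\sqrt{n}\,\bigl(\bar{\mathbf{X}}_n - \boldsymbol{\mu}\bigr)
  \;\xrightarrow{d}\; \mathcal{N}_k(\mathbf{0}, \boldsymbol{\Sigma}),
\qquad
\bar{\mathbf{X}}_n = \frac{1}{n} \sum_{i=1}^{n} \mathbf{X}_i .
```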

Convergence of random variables

By the law of large numbers, the sample averages converge in probability and almost surely to the expected value µ as n → ∞.
Other forms of convergence are important in other useful theorems, including the central limit theorem.
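For the sample mean of i.i.d. variables with mean µ and finite variance σ², the modes of convergence involved can be written side by side (standard notation, added for reference):

```latex
\bar{X}_n \xrightarrow{\,P\,} \mu,
\qquad
\bar{X}_n \xrightarrow{\,\text{a.s.}\,} \mu,
\qquad
\sqrt{n}\,\frac{\bar{X}_n - \mu}{\sigma} \xrightarrow{\,d\,} \mathcal{N}(0,1).
```

The first two are the weak and strong laws of large numbers; the third is the central limit theorem, which uses convergence in distribution.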

Berry–Esseen theorem

The rate of convergence in the central limit theorem is quantified by Berry–Esseen-type results.
In probability theory, the central limit theorem states that, under certain circumstances, the probability distribution of the scaled mean of a random sample converges to a normal distribution as the sample size increases to infinity.
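The classical bound reads as follows (a standard statement, added for reference): if the X_i are i.i.d. with E[X_i] = 0, E[X_i²] = σ² > 0 and E|X_i|³ = ρ < ∞, and F_n denotes the cdf of the standardized sum, then

```latex
\sup_{x \in \mathbb{R}} \bigl| F_n(x) - \Phi(x) \bigr|
  \;\le\; \frac{C \rho}{\sigma^{3} \sqrt{n}} ,
```

where Φ is the standard normal cdf and C is an absolute constant.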

Jarl Waldemar Lindeberg

In the same setting, the Lyapunov condition can be replaced with a weaker one, introduced by Lindeberg in 1920.
Jarl Waldemar Lindeberg (4 August 1876, Helsinki – 24 December 1932, Helsinki) was a Finnish mathematician known for work on the central limit theorem.
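For reference, the Lindeberg condition in its standard form (not quoted from the text): with independent X_i having means µ_i, variances σ_i², and s_n² = σ_1² + … + σ_n², require for every ε > 0

```latex
\lim_{n \to \infty} \frac{1}{s_n^{2}} \sum_{i=1}^{n}
  \mathbb{E}\!\left[ (X_i - \mu_i)^{2}\,
  \mathbf{1}\{\, |X_i - \mu_i| > \varepsilon s_n \,\} \right] = 0 .
```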

Variance

Let X_1, …, X_n be a random sample of size n—that is, a sequence of independent and identically distributed (i.i.d.) random variables drawn from a distribution with expected value µ and finite variance σ².
This formula for the variance of the mean is used in the definition of the standard error of the sample mean, which is used in the central limit theorem.
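In symbols (standard formulas, added for reference):

```latex
\operatorname{Var}(\bar{X}_n) = \frac{\sigma^{2}}{n},
\qquad
\operatorname{SE}(\bar{X}_n) = \frac{\sigma}{\sqrt{n}} .
```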

Independent and identically distributed random variables

In its common form, the central limit theorem requires the random variables to be identically distributed. In contrast, the sum of a number of i.i.d. random variables with power-law tail distributions decreasing as |x|^(−α−1) where 0 < α < 2 will tend to an alpha-stable distribution as the number of variables grows.
The i.i.d. assumption is important in the classical form of the central limit theorem, which states that the probability distribution of the sum (or average) of i.i.d. variables with finite variance approaches a normal distribution.

Characteristic function (probability theory)

The central limit theorem has a simple proof using characteristic functions.
The characteristic function approach is particularly useful in analysis of linear combinations of independent random variables: a classical proof of the Central Limit Theorem uses characteristic functions and Lévy's continuity theorem.
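The core of that proof, in outline (a standard argument, added for reference): for i.i.d. X_i with mean 0 and variance σ², set Z_n = (X_1 + … + X_n)/(σ√n); the characteristic function factorizes, and a second-order Taylor expansion gives

```latex
\varphi_{Z_n}(t)
  = \left[ \varphi_{X}\!\left( \frac{t}{\sigma\sqrt{n}} \right) \right]^{n}
  = \left( 1 - \frac{t^{2}}{2n} + o\!\left( \frac{1}{n} \right) \right)^{\! n}
  \;\longrightarrow\; e^{-t^{2}/2},
```

the characteristic function of N(0, 1); Lévy's continuity theorem then converts this into convergence in distribution.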

Stein's method

Stein's method can be used not only to prove the central limit theorem, but also to provide bounds on the rates of convergence for selected metrics.
It was introduced by Charles Stein, who first published it in 1972, to obtain a bound between the distribution of a sum of an m-dependent sequence of random variables and a standard normal distribution in the Kolmogorov (uniform) metric, and hence to prove not only a central limit theorem but also bounds on the rates of convergence for the given metric.
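The starting point of Stein's method is the following characterization of the normal distribution (a standard fact, added for reference): W has the standard normal distribution if and only if

```latex
\mathbb{E}\bigl[ f'(W) \bigr] = \mathbb{E}\bigl[ W f(W) \bigr]
```

for all absolutely continuous f with E|f′(W)| < ∞. Bounding how far a given sum fails to satisfy this identity yields explicit bounds on its distance from N(0, 1).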

Binomial distribution

Nowadays, the normal approximation to the binomial distribution can be seen as a consequence of the central limit theorem, since B(n, p) is a sum of n independent, identically distributed Bernoulli variables with parameter p.

Log-normal distribution

When the logarithm of a product of random variables that take only positive values approaches a normal distribution, the product itself approaches a log-normal distribution.
This is justified by considering the central limit theorem in the log domain.
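A minimal sketch of that reasoning (NumPy assumed; the factor distribution and sizes are illustrative): the logarithm of a product is a sum of logarithms, so the CLT applies in the log domain.

```python
# Products of positive i.i.d. factors have approximately normal logs.
import numpy as np

rng = np.random.default_rng(2)
factors = rng.uniform(0.5, 2.0, size=(10_000, 200))   # positive factors

log_products = np.log(factors).sum(axis=1)   # log of each product

# Crude normality check: the skewness of a normal sample is ~ 0.
z = (log_products - log_products.mean()) / log_products.std()
print("skewness of log-products ~ 0:", np.mean(z**3))
```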

Cauchy distribution

Clearly, the normal distribution is stable, but there are also other stable distributions, such as the Cauchy distribution, for which neither the mean nor the variance is defined.
As such, Laplace's use of the Central Limit Theorem with such a distribution was inappropriate, as it assumed a finite mean and variance.
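A short sketch of the failure (NumPy assumed; sizes are illustrative): the average of n standard Cauchy variables is again standard Cauchy, so averaging never concentrates.

```python
# Sample means of Cauchy variables do not settle down as n grows.
import numpy as np

rng = np.random.default_rng(3)
for n in (10, 1_000, 10_000):
    means = rng.standard_cauchy(size=(1_000, n)).mean(axis=1)
    # For finite-variance variables this spread would shrink like 1/sqrt(n);
    # for the Cauchy it stays near 2, the IQR of the standard Cauchy itself.
    iqr = np.percentile(means, 75) - np.percentile(means, 25)
    print(f"n={n}: IQR of sample means = {iqr:.2f}")
```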

Power law

Power-law models have a fundamental role as foci of mathematical convergence, similar to the role that the normal distribution has as a focus in the central limit theorem.

Aleksandr Lyapunov

The theorem is named after Russian mathematician Aleksandr Lyapunov.
In the theory of probability, he generalised the works of Chebyshev and Markov, and proved the Central Limit Theorem under more general conditions than his predecessors.

Asymptotic distribution

The central limit theorem gives only an asymptotic distribution.
In particular, the central limit theorem provides an example where the asymptotic distribution is the normal distribution.
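Concretely (standard form, added for reference): for i.i.d. samples with mean µ and finite variance σ²,

```latex
\sqrt{n}\,(\bar{X}_n - \mu) \;\xrightarrow{d}\; \mathcal{N}(0, \sigma^{2}),
```

so for large n the sample mean is approximately N(µ, σ²/n), even though this is exact only in the limit.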

Random walk

The central limit theorem may be established for the simple random walk on a crystal lattice (an infinite-fold abelian covering graph over a finite graph), and is used for the design of crystal structures.
The central limit theorem and the law of the iterated logarithm describe important aspects of the behavior of simple random walks on the integers ℤ. In particular, the former entails that as n increases, the distribution of the walk's position approaches a normal distribution.
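A minimal sketch for the walk on ℤ (NumPy assumed; sizes are illustrative): with ±1 steps of equal probability, the endpoint S_n/√n should be approximately N(0, 1).

```python
# Endpoint distribution of the simple random walk on the integers.
import numpy as np

rng = np.random.default_rng(4)
n, walks = 5_000, 2_000
steps = rng.choice([-1, 1], size=(walks, n))   # fair +/-1 steps
endpoints = steps.sum(axis=1) / np.sqrt(n)     # S_n / sqrt(n)

print("mean ~ 0:", endpoints.mean())
print("std  ~ 1:", endpoints.std())
print("P(|S_n/sqrt(n)| > 1.96) ~ 0.05:", np.mean(np.abs(endpoints) > 1.96))
```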

Law of the iterated logarithm

The law of the iterated logarithm specifies what is happening "in between" the law of large numbers and the central limit theorem.
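In symbols (a standard statement, added for reference): for i.i.d. X_i with mean 0 and variance σ², the partial sums S_n satisfy

```latex
\limsup_{n \to \infty} \frac{S_n}{\sqrt{2 n \log \log n}} = \sigma
\qquad \text{almost surely},
```

so the extreme fluctuations of S_n are of order √(2n log log n): vanishing on the scale n of the law of large numbers, but unbounded on the scale √n of the central limit theorem.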

Law of large numbers

By the law of large numbers, the sample averages converge in probability and almost surely to the expected value µ as n → ∞; the central limit theorem describes the size and distribution of the fluctuations around µ.

Alan Turing

A curious footnote to the history of the Central Limit Theorem is that a proof of a result similar to the 1922 Lindeberg CLT was the subject of Alan Turing's 1934 Fellowship Dissertation for King's College at the University of Cambridge.
In 1935, at the age of 22, he was elected a fellow of King's on the strength of a dissertation in which he proved the central limit theorem.

Central limit theorem for directional statistics

In probability theory, the central limit theorem states conditions under which the average of a sufficiently large number of independent random variables, each with finite mean and variance, will be approximately normally distributed.

Lévy's continuity theorem

The characteristic functions of the standardized sums Z_n converge pointwise to e^(−t²/2), which implies through Lévy's continuity theorem that the distribution of Z_n will approach the standard normal distribution N(0, 1) as n → ∞.
This theorem is the basis for one approach to prove the central limit theorem and it is one of the major theorems concerning characteristic functions.

Benford's law

(More technically, the central limit theorem says that multiplying more and more random variables will create a log-normal distribution with larger and larger variance, so eventually it covers many orders of magnitude almost uniformly.) To be sure of approximate agreement with Benford's law, the distribution has to be approximately invariant when scaled up by any factor up to 10; a log-normally distributed data set with wide dispersion would have this approximate property.
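A minimal sketch of that claim (NumPy assumed; the factor distribution and sizes are illustrative): data whose logs are spread over several decades should show Benford-like leading digits.

```python
# Leading digits of wide-spread log-normal-ish data vs. Benford's law.
import numpy as np

rng = np.random.default_rng(5)
# Products of positive factors, built as exp(sum of logs); the log10
# spread here covers several decades, which is what Benford needs.
products = np.exp(rng.uniform(0.0, 3.0, size=(20_000, 50)).sum(axis=1))

# Leading decimal digit: mantissa in [1, 10), truncated to an integer.
leading = (products / 10 ** np.floor(np.log10(products))).astype(int)

for d in range(1, 10):
    observed = np.mean(leading == d)
    benford = np.log10(1 + 1 / d)
    print(f"digit {d}: observed={observed:.3f}  Benford={benford:.3f}")
```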