Law of large numbers

In probability theory, the law of large numbers (LLN) is a theorem that describes the result of performing the same experiment a large number of times.

Expected value

According to the law, the average of the results obtained from a large number of trials should be close to the expected value, and will tend to become closer to the expected value as more trials are performed. Let X be a random variable with finite expected value μ and finite non-zero variance σ².
For example, the expected value of rolling a fair six-sided die is 3.5, because the average of all the numbers that come up converges to 3.5 as the number of rolls approaches infinity.
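As a rough illustration (a minimal simulation sketch, not from the article, using only Python's standard library), the running average of repeated rolls of a fair die can be watched drifting toward 3.5:

    import random

    random.seed(0)  # fixed seed so the illustrative run is reproducible
    for n in (100, 10_000, 1_000_000):
        rolls = [random.randint(1, 6) for _ in range(n)]
        # the empirical mean gets closer to the expected value 3.5 as n grows
        print(n, sum(rolls) / n)

The convergence is statistical rather than monotone: any single run can wander, but larger n makes large deviations from 3.5 increasingly unlikely.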

Probability theory

In probability theory, the law of large numbers (LLN) is a theorem that describes the result of performing the same experiment a large number of times.
Two major results in probability theory describing such behaviour are the law of large numbers and the central limit theorem.

Monte Carlo method

Another good example of the LLN at work is the Monte Carlo method.
By the law of large numbers, integrals described by the expected value of some random variable can be approximated by taking the empirical mean (a.k.a. the sample mean) of independent samples of the variable.
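To make this concrete, here is a minimal sketch (assuming nothing beyond Python's standard library) that estimates the integral of x² over [0, 1], which equals 1/3, as the empirical mean of f(U) for uniform random U:

    import random

    random.seed(1)

    def f(x):
        return x * x  # the exact integral of x^2 over [0, 1] is 1/3

    n = 100_000
    # by the LLN, the sample mean of f at uniform random points
    # approximates E[f(U)], which is the integral
    estimate = sum(f(random.random()) for _ in range(n)) / n
    print(estimate)

The error of such an estimate typically shrinks like 1/√n, which is what makes Monte Carlo integration practical in high dimensions.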

Ars Conjectandi

It took him over 20 years to develop a sufficiently rigorous mathematical proof which was published in his Ars Conjectandi (The Art of Conjecturing) in 1713.
The seminal work consolidated, apart from many combinatorial topics, many central ideas in probability theory, such as the very first version of the law of large numbers: indeed, it is widely regarded as the founding work of that subject.

Theorem

In probability theory, the law of large numbers (LLN) is a theorem that describes the result of performing the same experiment a large number of times.

Almost surely

In particular, the proportion of heads after n flips will almost surely converge to 1/2 as n approaches infinity.
Some examples of the use of this concept include the strong and uniform versions of the law of large numbers, and the continuity of the paths of Brownian motion.
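A small coin-flipping sketch (illustrative only, not from the article) shows this behaviour empirically:

    import random

    random.seed(2)
    for n in (10, 1_000, 100_000):
        heads = sum(random.random() < 0.5 for _ in range(n))
        # the proportion of heads settles near 1/2 as n grows
        print(n, heads / n)

"Almost surely" means the set of infinite flip sequences whose proportion of heads fails to converge to 1/2 has probability zero, even though such sequences exist.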

Jacob Bernoulli

A special form of the LLN (for a binary random variable) was first proved by Jacob Bernoulli.
However, his most important contribution was in the field of probability, where he derived the first version of the law of large numbers in his work Ars Conjectandi.

Cauchy distribution

For instance, the average of the results from a Cauchy distribution, or from some Pareto distributions (α < 1), does not converge as more observations are collected, because these distributions have heavy tails.
Various results in probability theory about expected values, such as the strong law of large numbers, fail to hold for the Cauchy distribution.
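The failure can be seen numerically. The sketch below (an illustration, not from the article) draws standard Cauchy variates via the inverse-CDF transform tan(π(U − 1/2)) and shows that the running average does not settle down:

    import math
    import random

    random.seed(3)

    def standard_cauchy():
        # inverse-CDF method: if U is uniform on (0, 1),
        # then tan(pi * (U - 1/2)) is standard Cauchy
        return math.tan(math.pi * (random.random() - 0.5))

    for n in (1_000, 100_000, 1_000_000):
        avg = sum(standard_cauchy() for _ in range(n)) / n
        # the average of n standard Cauchy variates is itself standard Cauchy,
        # so these numbers keep jumping around instead of converging
        print(n, avg)

The Cauchy distribution has no finite mean, so the hypothesis of the law of large numbers simply does not hold.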

Variance

Suppose each of the random variables has finite expected value μ and finite non-zero variance σ². Based on the assumption of finite variance for all of them and no correlation between them, the variance of the average of n such variables shrinks as n grows (see the short calculation below).
That calculation also makes clear that the sample mean of correlated variables does not generally converge to the population mean, even though the law of large numbers states that the sample mean will converge for independent variables.
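Under those assumptions the calculation is short; a sketch in the notation used above:

    \operatorname{Var}(\bar X_n)
      = \operatorname{Var}\!\Big(\frac{1}{n}\sum_{i=1}^{n} X_i\Big)
      = \frac{1}{n^{2}}\sum_{i=1}^{n}\operatorname{Var}(X_i)
      = \frac{n\sigma^{2}}{n^{2}}
      = \frac{\sigma^{2}}{n}.

The second equality is exactly where "no correlation" is used: with correlated variables the cross-covariance terms remain, and the variance of the sample mean need not shrink to zero.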

Chebyshev's inequality

Chebyshev's inequality bounds the probability that a random variable deviates from its mean by more than a given amount, using only its variance.
For example, it can be used to prove the weak law of large numbers, as sketched below.
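A short sketch of that standard argument: Chebyshev's inequality states

    P\big(|Y - \mathbb{E}[Y]| \ge \varepsilon\big) \le \frac{\operatorname{Var}(Y)}{\varepsilon^{2}}.

Applying it to the sample mean of n uncorrelated variables with common mean μ and variance σ², whose variance is σ²/n,

    P\big(|\bar X_n - \mu| \ge \varepsilon\big) \le \frac{\sigma^{2}}{n\varepsilon^{2}} \longrightarrow 0 \quad (n \to \infty),

which is precisely convergence in probability of the sample mean to μ, i.e. the weak law of large numbers.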

Pafnuty Chebyshev

After Bernoulli and Poisson published their efforts, other mathematicians also contributed to refinement of the law, including Chebyshev, Markov, Borel, Cantelli and Kolmogorov and Khinchin.
The Chebyshev inequality is used to prove the weak law of large numbers.

Asymptotic equipartition property

The asymptotic equipartition property says that, for a long sequence of independent, identically distributed draws, the observed sequence almost surely falls in a "typical set" of outcomes whose elements all have roughly the same probability. (This is a consequence of the law of large numbers and ergodic theory.) Although there are individual outcomes which have a higher probability than any outcome in this set, the vast number of outcomes in the set almost guarantees that the outcome will come from the set.

Convergence of random variables

For interpretation of these modes, see Convergence of random variables.
This result is known as the weak law of large numbers.

Central limit theorem

By the law of large numbers, the sample averages converge in probability and almost surely to the expected value μ as n → ∞.
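The central limit theorem then describes the size and shape of the remaining fluctuations: under the same finite-variance assumptions,

    \sqrt{n}\,\big(\bar X_n - \mu\big) \xrightarrow{\;d\;} \mathcal{N}(0, \sigma^{2}),

so the sample mean deviates from μ by an amount of order σ/√n, and those rescaled deviations are asymptotically normal.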

Law of averages

While there is a real theorem that a random variable will reflect its underlying probability over a very large sample, the law of averages typically assumes that unnatural short-term "balance" must occur.

Characteristic function (probability theory)

By Taylor's theorem for complex functions, the characteristic function of any random variable, X, with finite mean μ, can be written as φ_X(t) = 1 + itμ + o(t) as t → 0.
This theorem is frequently used to prove the law of large numbers, and the central limit theorem.
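A sketch of that proof of the weak law (the standard characteristic-function argument): for independent, identically distributed copies of X with mean μ,

    \varphi_X(t) = 1 + it\mu + o(t) \quad (t \to 0),

    \varphi_{\bar X_n}(t) = \Big[\varphi_X\!\big(\tfrac{t}{n}\big)\Big]^{n}
      = \Big[1 + \frac{it\mu}{n} + o\!\big(\tfrac{t}{n}\big)\Big]^{n}
      \longrightarrow e^{it\mu},

and e^{itμ} is the characteristic function of the constant μ, so by Lévy's continuity theorem the sample mean converges in distribution, and hence in probability, to μ.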

Law of the iterated logarithm

The law of the iterated logarithm operates “in between” the law of large numbers and the central limit theorem.
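Concretely, for independent, identically distributed variables with mean μ, finite variance σ², and partial sums S_n, the Hartman–Wintner form of the law states

    \limsup_{n\to\infty} \frac{S_n - n\mu}{\sqrt{2n\log\log n}} = \sigma \quad \text{almost surely},

so S_n/n − μ is almost surely of order √(2 log log n / n), sitting between the o(1) guarantee of the strong law of large numbers and the 1/√n scale governed by the central limit theorem.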

Random variable

Let X be a random variable with finite expected value μ and finite non-zero variance σ².
A significant theme in mathematical statistics consists of obtaining convergence results for certain sequences of random variables; for instance the law of large numbers and the central limit theorem.

Regression toward the mean

Similarly, the law of large numbers states that in the long term, the average will tend towards the expected value, but makes no statement about individual trials.

Average

According to the law, the average of the results obtained from a large number of trials should be close to the expected value, and will tend to become closer to the expected value as more trials are performed.

Roulette

For example, while a casino may lose money in a single spin of the roulette wheel, its earnings will tend towards a predictable percentage over a large number of spins.
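As a worked example (assuming a single-zero, European-style wheel, which the article does not specify), an even-money bet of one unit wins with probability 18/37 and loses with probability 19/37, so its expected value is

    \frac{18}{37}(+1) + \frac{19}{37}(-1) = -\frac{1}{37} \approx -2.7\%,

and by the law of large numbers the casino's average take per unit wagered on such bets converges to roughly 2.7% over many spins, even though individual spins are unpredictable.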

Gambler's fallacy

There is no principle that a small number of observations will coincide with the expected value or that a streak of one value will immediately be "balanced" by the others (see the gambler's fallacy).

Probability

For example, a single roll of a fair, six-sided die produces one of the numbers 1, 2, 3, 4, 5, or 6, each with equal probability.

Sample mean and covariance

According to the law of large numbers, if a large number of six-sided dice are rolled, the average of their values (sometimes called the sample mean) is likely to be close to 3.5, with the precision increasing as more dice are rolled.
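A small sketch (illustrative only, assuming nothing beyond Python's standard library) of how that precision improves with the number of dice:

    import random
    import statistics

    random.seed(4)

    def mean_of_dice(k):
        # sample mean of k fair six-sided dice
        return sum(random.randint(1, 6) for _ in range(k)) / k

    for k in (10, 100, 10_000):
        means = [mean_of_dice(k) for _ in range(200)]
        # the spread of the sample means around 3.5 shrinks roughly like 1/sqrt(k)
        print(k, round(statistics.stdev(means), 4))

The printed standard deviations shrink like σ/√k, matching the behaviour discussed under Variance above.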