Variance

In probability theory and statistics, variance is the expectation of the squared deviation of a random variable from its mean.

Random variable

In probability theory and statistics, variance is the expectation of the squared deviation of a random variable from its mean.
In this case, the structure of the real numbers makes it possible to define quantities such as the expected value and variance of a random variable, its cumulative distribution function, and the moments of its distribution.

Algorithms for calculating variance

A key difficulty in the design of good algorithms for this problem is that formulas for the variance may involve sums of squares, which can lead to numerical instability as well as to arithmetic overflow when dealing with large values. Numerically stable alternatives exist.
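One standard numerically stable approach is Welford's one-pass algorithm. Below is a minimal Python sketch (the helper name online_variance is ours): it maintains a running mean and a running sum of squared deviations, so no large sum of squares is ever formed.

```python
def online_variance(data):
    """Welford's online algorithm: numerically stable one-pass sample variance."""
    n = 0
    mean = 0.0
    m2 = 0.0  # running sum of squared deviations from the current mean
    for x in data:
        n += 1
        delta = x - mean
        mean += delta / n
        m2 += delta * (x - mean)  # note: uses the updated mean
    return m2 / (n - 1) if n > 1 else float("nan")

# A large common offset overflows or cancels catastrophically in the naive
# sum-of-squares formula, but the one-pass update above stays accurate.
print(online_variance([1e9 + x for x in (4.0, 7.0, 13.0, 16.0)]))  # 30.0
```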

Statistics

In probability theory and statistics, variance is the expectation of the squared deviation of a random variable from its mean.
Fisher's most important publications were his 1918 seminal paper The Correlation between Relatives on the Supposition of Mendelian Inheritance, which was the first to use the statistical term, variance, his classic 1925 work Statistical Methods for Research Workers and his 1935 The Design of Experiments, where he developed rigorous design of experiments models.

Covariance

The variance is the square of the standard deviation, the second central moment of a distribution, and the covariance of the random variable with itself; it is often represented by \sigma^2, s^2, or \operatorname{Var}(X).
The variance is a special case of the covariance in which the two variables are identical (that is, in which one variable always takes the same value as the other): \operatorname{Var}(X) = \operatorname{Cov}(X, X).
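This identity is easy to check numerically; the following sketch, assuming numpy is available, compares the covariance of a sample with itself against the sample variance.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=1000)

# Cov(X, X) equals Var(X): the covariance matrix of (x, x) carries
# the sample variance of x in every entry.
print(np.cov(x, x)[0, 1])   # covariance of x with itself
print(np.var(x, ddof=1))    # sample variance (same value)
```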

Normal distribution

In the case that Y_i are independent observations from a normal distribution, Cochran's theorem shows that s^2 follows a scaled chi-squared distribution: (n - 1)s^2 / \sigma^2 \sim \chi^2_{n-1}.
In its most general form, under some conditions (which include finite variance), it states that averages of samples of observations of random variables independently drawn from the same distribution converge in distribution to the normal, that is, they become normally distributed when the number of observations is sufficiently large.
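A quick simulation sketch of this convergence (numpy, with illustrative parameters of our choosing): averages of exponential draws, a skewed distribution with finite variance, lose their skewness as the number of observations grows.

```python
import numpy as np

rng = np.random.default_rng(1)

# Averages of n i.i.d. exponential draws look increasingly normal as n grows.
for n in (1, 5, 50):
    means = rng.exponential(scale=1.0, size=(10_000, n)).mean(axis=1)
    z = (means - means.mean()) / means.std()
    # empirical skewness shrinks toward 0, the value for a normal distribution
    print(n, (z**3).mean())
```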

Exponential distribution

The exponential distribution with parameter λ is a continuous distribution whose probability density function is given by f(x) = \lambda e^{-\lambda x} for x ≥ 0. The variance of X is given by \operatorname{Var}(X) = 1/\lambda^2.
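As a sanity check, the simulation below, with an arbitrary rate λ = 2.5, compares the empirical variance of exponential draws against 1/λ².

```python
import numpy as np

rng = np.random.default_rng(2)
lam = 2.5
x = rng.exponential(scale=1 / lam, size=1_000_000)

print(x.var())     # simulated variance
print(1 / lam**2)  # theoretical value 1/λ² = 0.16
```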

Cauchy distribution

If a distribution does not have a finite expected value, as is the case for the Cauchy distribution, then the variance cannot be finite either.
The Cauchy distribution is often used in statistics as the canonical example of a "pathological" distribution since both its expected value and its variance are undefined.
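The simulation sketch below illustrates this pathology: running means of standard Cauchy draws never settle down, unlike for any distribution with a finite mean.

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.standard_cauchy(size=1_000_000)

# Running means of Cauchy samples keep jumping no matter how many draws
# are averaged, because the expected value (and hence the variance) does
# not exist.
running_mean = np.cumsum(x) / np.arange(1, len(x) + 1)
print(running_mean[[999, 99_999, 999_999]])
```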

Descriptive statistics

Variance has a central role in statistics, where some ideas that use it include descriptive statistics, statistical inference, hypothesis testing, goodness of fit, and Monte Carlo sampling.
Measures of central tendency include the mean, median and mode, while measures of variability include the standard deviation (or variance), the minimum and maximum values of the variables, kurtosis and skewness.
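For illustration, a minimal numpy/scipy sketch computing these descriptive measures on a small made-up sample (the data values are arbitrary).

```python
import numpy as np
from scipy import stats

data = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])

print("mean:", data.mean(), "median:", np.median(data))
print("variance:", data.var(ddof=1), "std:", data.std(ddof=1))
print("min/max:", data.min(), data.max())
# scipy's kurtosis is excess kurtosis (0 for a normal distribution)
print("skewness:", stats.skew(data), "kurtosis:", stats.kurtosis(data))
```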

Geometric distribution

The expected value of a geometrically distributed random variable X (the expected number of trials needed to obtain the first success) is 1/p, and the variance is (1 − p)/p^2.
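A quick check by simulation, using numpy's geometric sampler (which counts trials up to and including the first success) with an arbitrary p:

```python
import numpy as np

rng = np.random.default_rng(4)
p = 0.3
x = rng.geometric(p, size=1_000_000)  # number of trials to first success

print(x.mean(), 1 / p)          # both ≈ 3.333
print(x.var(), (1 - p) / p**2)  # both ≈ 7.778
```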

Expected value

In probability theory and statistics, variance is the expectation of the squared deviation of a random variable from its mean.
The law of large numbers demonstrates (under fairly mild conditions) that, as the size of the sample gets larger, the variance of this estimate gets smaller.
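The shrinking variance of this estimate, \operatorname{Var}(\bar{X}) = \sigma^2/n, is easy to see by simulation; a minimal sketch with arbitrary parameters:

```python
import numpy as np

rng = np.random.default_rng(5)
sigma2 = 4.0  # population variance

# The variance of the sample mean is σ²/n, so it shrinks as n grows.
for n in (10, 100, 1000):
    means = rng.normal(0, np.sqrt(sigma2), size=(20_000, n)).mean(axis=1)
    print(n, means.var(), sigma2 / n)
```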

Poisson distribution

If N has a Poisson distribution, then E(N) = Var(N), so a single observed count n estimates both.
The positive real number λ is equal to the expected value of X and also to its variance: \lambda = \operatorname{E}(X) = \operatorname{Var}(X).
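A short sanity check by simulation, with an arbitrary λ:

```python
import numpy as np

rng = np.random.default_rng(6)
lam = 3.7
n = rng.poisson(lam, size=1_000_000)

print(n.mean(), n.var())  # both ≈ λ = 3.7
```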

Central moment

The variance is the square of the standard deviation, the second central moment of a distribution, and the covariance of the random variable with itself, and it is often represented by \sigma^2, s^2, or.
For the cases n = 2, 3, 4, which are of most interest because of the relations to variance, skewness, and kurtosis, respectively, this formula becomes (noting that \mu = \mu'_1 and \mu'_0 = 1):
\mu_2 = \mu'_2 - \mu^2,
\mu_3 = \mu'_3 - 3\mu\mu'_2 + 2\mu^3,
\mu_4 = \mu'_4 - 4\mu\mu'_3 + 6\mu^2\mu'_2 - 3\mu^4.
The n = 2 case is the familiar identity \operatorname{Var}(X) = \operatorname{E}(X^2) - \operatorname{E}(X)^2.
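The n = 2 identity can be checked numerically; the sketch below, on an arbitrary gamma-distributed sample, computes the variance both from raw moments and directly.

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.gamma(shape=2.0, scale=1.5, size=1_000_000)

raw1 = x.mean()        # μ'₁, the mean
raw2 = (x**2).mean()   # μ'₂, the second raw moment

# Second central moment (variance) from raw moments: μ₂ = μ'₂ − μ²
print(raw2 - raw1**2)
print(x.var())         # same value, computed directly
```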

Central limit theorem

This formula for the variance of the mean is used in the definition of the standard error of the sample mean, which is used in the central limit theorem.
Let X_1, \ldots, X_n be a random sample of size n, that is, a sequence of independent and identically distributed (i.i.d.) random variables drawn from a distribution with expected value µ and finite variance \sigma^2.
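A minimal sketch of the standard error computation, s/\sqrt{n}, the square root of \operatorname{Var}(\bar{X}) = \sigma^2/n, on a simulated sample:

```python
import numpy as np

rng = np.random.default_rng(8)
sample = rng.normal(loc=10.0, scale=2.0, size=400)

# standard error of the mean = s / √n
se = sample.std(ddof=1) / np.sqrt(len(sample))
print(se)  # ≈ 2/√400 = 0.1
```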

Uniform distribution (continuous)

For a random variable following this distribution on the interval [a, b], the expected value is m_1 = (a + b)/2 and the variance is (b − a)^2/12.
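A simulation check with arbitrary endpoints a = 2, b = 10:

```python
import numpy as np

rng = np.random.default_rng(9)
a, b = 2.0, 10.0
x = rng.uniform(a, b, size=1_000_000)

print(x.mean(), (a + b) / 2)     # both ≈ 6.0
print(x.var(), (b - a)**2 / 12)  # both ≈ 5.333
```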

Irénée-Jules Bienaymé

This statement is called the Bienaymé formula and was discovered in 1853.
In particular, he formulated the Bienaymé–Chebyshev inequality concerning the law of large numbers and the Bienaymé formula for the variance of a sum of uncorrelated random variables.
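The Bienaymé formula can be verified by simulation; the sketch below sums three independent (hence uncorrelated) samples from arbitrary distributions and compares the variance of the sum with the sum of the variances.

```python
import numpy as np

rng = np.random.default_rng(10)
n = 1_000_000

# Independent (hence uncorrelated) draws from three different distributions
x = rng.normal(0, 1, n)
y = rng.exponential(2.0, n)
z = rng.uniform(-1, 1, n)

# Bienaymé: Var(X + Y + Z) = Var(X) + Var(Y) + Var(Z) for uncorrelated terms
print((x + y + z).var())
print(x.var() + y.var() + z.var())
```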

Analysis of variance

A similar formula is applied in analysis of variance, where the corresponding formula is \mathit{MS}_{\text{total}} = \mathit{MS}_{\text{between}} + \mathit{MS}_{\text{within}}; here \mathit{MS} refers to the mean of the squares.
The ANOVA is based on the law of total variance, where the observed variance in a particular variable is partitioned into components attributable to different sources of variation.
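A minimal sketch of this partition on synthetic data (group means and sizes are arbitrary): the total sum of squares splits exactly into between-group and within-group components.

```python
import numpy as np

rng = np.random.default_rng(11)

# Three groups with different means: SS_total = SS_between + SS_within
groups = [rng.normal(mu, 1.0, size=200) for mu in (0.0, 1.0, 3.0)]
allx = np.concatenate(groups)
grand = allx.mean()

ss_total = ((allx - grand)**2).sum()
ss_between = sum(len(g) * (g.mean() - grand)**2 for g in groups)
ss_within = sum(((g - g.mean())**2).sum() for g in groups)

print(ss_total, ss_between + ss_within)  # equal up to rounding
```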

Standard error

This formula for the variance of the mean is used in the definition of the standard error of the sample mean, which is used in the central limit theorem.
This forms a distribution of different means, and this distribution has its own mean and variance.

Uncorrelatedness (probability theory)

One reason for the use of the variance in preference to other measures of dispersion is that the variance of the sum (or the difference) of uncorrelated random variables is the sum of their variances: \operatorname{Var}(X \pm Y) = \operatorname{Var}(X) + \operatorname{Var}(Y).
Uncorrelated random variables have a Pearson correlation coefficient of zero, except in the trivial case when either variable has zero variance (is a constant).
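A classic illustration, sketched below: with X standard normal, Y = X² is fully dependent on X yet uncorrelated with it (since E(X³) = 0), so the variances still add.

```python
import numpy as np

rng = np.random.default_rng(12)
x = rng.normal(size=1_000_000)
y = x**2  # dependent on x, yet uncorrelated with it

print(np.corrcoef(x, y)[0, 1])           # Pearson coefficient ≈ 0
print((x + y).var(), x.var() + y.var())  # variances still add
```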

Pareto distribution

An example is a Pareto distribution whose index k satisfies 1 < k ≤ 2: the mean exists, but the variance is infinite.

Moment (mathematics)

The second moment of a random variable attains its minimum value when taken around the first moment (i.e., the mean) of the random variable, i.e. \operatorname{argmin}_c \operatorname{E}\left[(X - c)^2\right] = \operatorname{E}(X). Conversely, if a continuous function \varphi satisfies \operatorname{argmin}_c \operatorname{E}[\varphi(X - c)] = \operatorname{E}(X) for all random variables X, then it is necessarily of the form \varphi(x) = a x^2 + b, where a > 0.
If the function is a probability distribution, then the zeroth moment is the total probability (i.e. one), the first moment is the mean, the second central moment is the variance, the third standardized moment is the skewness, and the fourth standardized moment is the kurtosis.

Law of total variance

The general formula for variance decomposition or the law of total variance is: If X and Y are two random variables, and the variance of X exists, then \operatorname{Var}(X) = \operatorname{E}[\operatorname{Var}(X \mid Y)] + \operatorname{Var}(\operatorname{E}[X \mid Y]).
In probability theory, the law of total variance, variance decomposition formula, conditional variance formula, or law of iterated variances, also known as Eve's law, states that if X and Y are random variables on the same probability space, and the variance of Y is finite, then \operatorname{Var}(Y) = \operatorname{E}[\operatorname{Var}(Y \mid X)] + \operatorname{Var}(\operatorname{E}[Y \mid X]).
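A numerical check of Eve's law on a simple two-regime model (parameters are arbitrary; X is a simulated fair coin, so the outer expectation and variance are taken over two equally likely values):

```python
import numpy as np

rng = np.random.default_rng(13)
n = 1_000_000

# Two-stage model: X picks a regime, Y is drawn with X-dependent mean/spread.
x = rng.integers(0, 2, size=n)              # X ∈ {0, 1}, each with probability 1/2
y = rng.normal(loc=3.0 * x, scale=1.0 + x)  # mean 0 or 3, sd 1 or 2

# Eve's law: Var(Y) = E[Var(Y | X)] + Var(E[Y | X]).
# Plain averages over k ∈ {0, 1} are valid because X is (nearly) fair.
within = np.mean([y[x == k].var() for k in (0, 1)])   # E[Var(Y | X)]
between = np.var([y[x == k].mean() for k in (0, 1)])  # Var(E[Y | X])
print(y.var(), within + between)  # equal up to sampling noise
```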

Conditional variance

The conditional expectation of X given Y, and the conditional variance, may be understood as follows.
In probability theory and statistics, a conditional variance is the variance of a random variable given the value(s) of one or more other variables.

Invariant (mathematics)

Variance is invariant with respect to changes in a location parameter: adding a constant a to every value leaves the variance unchanged, \operatorname{Var}(X + a) = \operatorname{Var}(X).

Probability density function

The exponential distribution with parameter λ is a continuous distribution whose probability density function is given by f(x) = \lambda e^{-\lambda x} for x ≥ 0. If the random variable X has a probability density function f(x), and F(x) is the corresponding cumulative distribution function, then \operatorname{Var}(X) = \int (x - \mu)^2 f(x) \, dx, where \mu = \int x f(x) \, dx is the expected value.
For instance, the above expression allows one to determine statistical characteristics of such a discrete variable (such as its mean, variance, and kurtosis), starting from the formulas given for a continuous probability distribution.
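For instance, the variance can be obtained directly from a density by numerical integration; a minimal sketch using the exponential density and scipy.integrate.quad, with an arbitrary λ = 2:

```python
import numpy as np
from scipy import integrate

lam = 2.0
f = lambda t: lam * np.exp(-lam * t)  # exponential pdf on [0, ∞)

# mean μ = ∫ t f(t) dt, variance = ∫ (t − μ)² f(t) dt
mu, _ = integrate.quad(lambda t: t * f(t), 0, np.inf)
var, _ = integrate.quad(lambda t: (t - mu)**2 * f(t), 0, np.inf)

print(mu, var)  # 0.5 and 0.25, i.e. 1/λ and 1/λ²
```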