# Cumulant

In probability theory and statistics, the cumulants of a probability distribution are a set of quantities that provide an alternative to the moments of the distribution.
## Related Articles

### Variance

The first cumulant is the mean, the second cumulant is the variance, and the third cumulant is the same as the third central moment.
The variance is also the second cumulant of the probability distribution that generates X. The variance is typically designated as \operatorname{Var}(X), \sigma^2_X, or simply \sigma^2 (pronounced "sigma squared").
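As a minimal sketch of this correspondence (assuming SymPy is available, and using a small hypothetical discrete pmf), the first three derivatives of K(t) = \log \operatorname{E}[e^{tX}] at t = 0 reproduce the mean, the variance, and the third central moment:

```python
# Sketch: check that the first three derivatives of K(t) = log E[e^{tX}]
# at t = 0 give the mean, variance, and third central moment.
# The three-point pmf below is a hypothetical example.
import sympy as sp

t = sp.symbols('t')
pmf = {0: sp.Rational(1, 2), 1: sp.Rational(1, 4), 3: sp.Rational(1, 4)}

M = sum(p * sp.exp(x * t) for x, p in pmf.items())   # moment-generating function
K = sp.log(M)                                        # cumulant-generating function

kappa = [sp.diff(K, t, n).subs(t, 0).simplify() for n in (1, 2, 3)]

mu = sum(p * x for x, p in pmf.items())              # mean
central = lambda n: sum(p * (x - mu)**n for x, p in pmf.items())

assert kappa[0] == mu          # first cumulant = mean
assert kappa[1] == central(2)  # second cumulant = variance
assert kappa[2] == central(3)  # third cumulant = third central moment
```

The identity holds for any distribution with a moment-generating function; the specific pmf here is only for illustration.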

### Normal distribution

Furthermore, the third and higher-order cumulants of a normal distribution are zero, and it is the only distribution with this property.
The normal distribution is the only absolutely continuous distribution whose cumulants beyond the first two (i.e., other than the mean and variance) are zero.
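This vanishing is immediate from the normal CGF. A short check (assuming SymPy): since the MGF of N(\mu, \sigma^2) is \exp(\mu t + \sigma^2 t^2/2), the CGF is the quadratic \mu t + \sigma^2 t^2/2, so every derivative of order three or more is identically zero:

```python
# Sketch: the CGF of N(mu, sigma^2) is K(t) = mu*t + sigma^2*t^2/2 (the log
# of its known MGF), so all cumulants beyond the second vanish.
import sympy as sp

t, mu, sigma = sp.symbols('t mu sigma')
K = mu * t + sigma**2 * t**2 / 2   # log of the normal MGF

cumulants = [sp.diff(K, t, n).subs(t, 0) for n in range(1, 7)]
# -> [mu, sigma**2, 0, 0, 0, 0]
```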

### Geometric distribution

* The geometric distributions (number of failures before one success, with probability p of success on each trial).
Then the cumulants \kappa_n of the probability distribution of Y satisfy a common recursion (see Cumulants of some discrete probability distributions).
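As a sketch for the geometric case (failures before the first success, assuming SymPy), the MGF is p/(1 - (1-p)e^t), and differentiating its logarithm at t = 0 yields the familiar mean and variance:

```python
# Sketch: cumulants of the geometric distribution (number of failures before
# the first success, success probability p) from its CGF.
import sympy as sp

t, p = sp.symbols('t p', positive=True)
K = sp.log(p / (1 - (1 - p) * sp.exp(t)))   # CGF of the geometric distribution

kappa1 = sp.simplify(sp.diff(K, t, 1).subs(t, 0))   # mean: (1-p)/p
kappa2 = sp.simplify(sp.diff(K, t, 2).subs(t, 0))   # variance: (1-p)/p**2

assert sp.simplify(kappa1 - (1 - p) / p) == 0
assert sp.simplify(kappa2 - (1 - p) / p**2) == 0
```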

### Moment (mathematics)

In probability theory and statistics, the cumulants of a probability distribution are a set of quantities that provide an alternative to the moments of the distribution.
The kurtosis is defined to be the normalised fourth central moment minus 3 (equivalently, it is the fourth cumulant divided by the square of the variance).
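A small check of the cumulant form of excess kurtosis, \kappa_4/\kappa_2^2 (assuming SymPy, and using the standard Laplace distribution, whose MGF is 1/(1 - t^2) for |t| < 1):

```python
# Sketch: excess kurtosis equals kappa_4 / kappa_2**2. For the standard
# Laplace distribution the MGF is 1/(1 - t**2), so the CGF is -log(1 - t**2)
# and the ratio can be computed exactly.
import sympy as sp

t = sp.symbols('t')
K = -sp.log(1 - t**2)                   # CGF of the standard Laplace distribution

kappa2 = sp.diff(K, t, 2).subs(t, 0)    # variance = 2
kappa4 = sp.diff(K, t, 4).subs(t, 0)    # fourth cumulant = 12

excess_kurtosis = kappa4 / kappa2**2    # -> 3
```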

### Poisson distribution

* The Poisson distributions.
All of the cumulants of the Poisson distribution are equal to the expected value λ. The nth factorial moment of the Poisson distribution is \lambda^n.
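This is visible directly in the Poisson CGF. A sketch (assuming SymPy): K(t) = \lambda(e^t - 1), so every derivative at t = 0 is \lambda:

```python
# Sketch: the CGF of Poisson(lam) is K(t) = lam*(e^t - 1); each derivative
# at t = 0 equals lam, so all cumulants coincide with the expected value.
import sympy as sp

t, lam = sp.symbols('t lam', positive=True)
K = lam * (sp.exp(t) - 1)   # CGF of the Poisson distribution

cumulants = [sp.diff(K, t, n).subs(t, 0) for n in range(1, 6)]
assert all(c == lam for c in cumulants)
```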

### Uniform distribution (continuous)

The cumulants of the uniform distribution on the interval [−1, 0] are \kappa_n = B_n/n, where B_n is the nth Bernoulli number.
For n ≥ 2, the nth cumulant of the uniform distribution on the interval [−1/2, 1/2] is B_n/n, where B_n is the nth Bernoulli number.
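A sketch of the cross-check for n ≥ 2 (assuming SymPy, and that its series expansion handles the removable singularity at t = 0): the MGF of the uniform distribution on [−1, 0] is (1 - e^{-t})/t, and the series coefficients of its logarithm recover B_n/n:

```python
# Sketch: verify kappa_n = B_n / n (n >= 2) from the CGF of Uniform[-1, 0],
# whose MGF is (1 - e^{-t}) / t.
import sympy as sp

t = sp.symbols('t')
K = sp.log((1 - sp.exp(-t)) / t)            # CGF of Uniform[-1, 0]
ser = sp.series(K, t, 0, 6).removeO()

for n in (2, 3, 4, 5):
    kappa_n = ser.coeff(t, n) * sp.factorial(n)   # n-th cumulant from the series
    assert kappa_n == sp.bernoulli(n) / n
```

For instance \kappa_2 = B_2/2 = 1/12, the variance of a unit-length uniform distribution.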

### Negative binomial distribution

* The negative binomial distributions, (number of failures before n successes with probability p of success on each trial).
See Cumulants of some discrete probability distributions.

### Natural exponential family

The natural exponential family of a distribution may be realized by shifting or translating K(t), and adjusting it vertically so that it always passes through the origin: if f is the pdf with cgf K(t) = \log M(t), and f|\theta is its natural exponential family, then f(x\mid\theta) = \frac{1}{M(\theta)} e^{\theta x} f(x), and K(t\mid\theta) = K(t+\theta) - K(\theta).
The cumulant generating function is by definition the logarithm of the MGF, so it is K(t) = \log \operatorname{E}\left[e^{tX}\right].
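A sketch of this exponential tilting (assuming SymPy): if K is the CGF of a distribution, the natural-exponential-family member at parameter \theta has CGF K(t+\theta) - K(\theta). Applied to Poisson(\lambda), tilting yields another Poisson, with rate \lambda e^{\theta}:

```python
# Sketch: tilting the Poisson CGF K(t) = lam*(e^t - 1) by theta gives the
# CGF of Poisson(lam * e^theta), illustrating K(t|theta) = K(t+theta) - K(theta).
import sympy as sp

t, theta, lam = sp.symbols('t theta lam', positive=True)
K = lam * (sp.exp(t) - 1)                            # CGF of Poisson(lam)

K_tilted = K.subs(t, t + theta) - K.subs(t, theta)
K_target = (lam * sp.exp(theta)) * (sp.exp(t) - 1)   # CGF of Poisson(lam*e^theta)

assert sp.expand(K_tilted - K_target) == 0
```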

### Index of dispersion

Introducing the variance-to-mean ratio \varepsilon = \mu^{-1}\sigma^2 = \kappa_1^{-1}\kappa_2 classifies these discrete distributions.
This can be considered analogous to the classification of conic sections by eccentricity; see Cumulants of particular probability distributions for details.
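The classification can be sketched with the closed-form variance-to-mean ratios of three families (plain Python, no libraries; parameter names are illustrative):

```python
# Sketch: variance-to-mean ratios illustrating the conic-section analogy —
# below 1 ("elliptical"), equal to 1 ("parabolic"), above 1 ("hyperbolic").
def vmr_binomial(n, p):            # mean n*p, variance n*p*(1-p)
    return 1 - p                   # ratio < 1: elliptical

def vmr_poisson(lam):              # mean and variance are both lam
    return 1.0                     # ratio = 1: parabolic

def vmr_negative_binomial(r, p):   # mean r(1-p)/p, variance r(1-p)/p**2
    return 1 / p                   # ratio > 1: hyperbolic

assert vmr_binomial(10, 0.3) < 1 < vmr_negative_binomial(5, 0.3)
assert vmr_poisson(4.2) == 1.0
```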

### Characteristic function (probability theory)

Some writers prefer to define the cumulant-generating function as the natural logarithm of the characteristic function, which is sometimes also called the second characteristic function.
The logarithm of a characteristic function is a cumulant generating function, which is useful for finding cumulants; some instead define the cumulant generating function as the logarithm of the moment-generating function, and call the logarithm of the characteristic function the second cumulant generating function.

### Faà di Bruno's formula

The explicit expression for the n-th moment in terms of the first n cumulants, and vice versa, can be obtained by using Faà di Bruno's formula for higher derivatives of composite functions.
These coefficients also arise in the Bell polynomials, which are relevant to the study of cumulants.
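The moment/cumulant conversion that Faà di Bruno's formula encodes can also be run as a simple recursion, \mu'_n = \sum_{k=1}^{n} \binom{n-1}{k-1} \kappa_k \, \mu'_{n-k}. A sketch in plain Python, checked against the standard normal (whose raw moments are 1, 0, 1, 0, 3, 0, 15):

```python
# Sketch: raw moments from cumulants via the recursion
# mu'_n = sum_{k=1}^{n} C(n-1, k-1) * kappa_k * mu'_{n-k}, with mu'_0 = 1.
from math import comb

def raw_moments(kappa, n_max):
    """Raw moments mu'_0..mu'_{n_max} from cumulants kappa[0] = kappa_1, ..."""
    mu = [1]  # mu'_0 = 1
    for n in range(1, n_max + 1):
        mu.append(sum(comb(n - 1, k - 1) * kappa[k - 1] * mu[n - k]
                      for k in range(1, n + 1)))
    return mu

# Standard normal: kappa_1 = 0, kappa_2 = 1, all higher cumulants 0.
moments = raw_moments([0, 1, 0, 0, 0, 0], 6)
# -> [1, 0, 1, 0, 3, 0, 15]
```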

### Eccentricity (mathematics)

Note the analogy to the classification of conic sections by eccentricity: circles \varepsilon = 0, ellipses 0 < \varepsilon < 1, parabolas \varepsilon = 1, hyperbolas \varepsilon > 1.
Classification of discrete distributions by variance-to-mean ratio; see cumulants of some discrete probability distributions for details.

### Conic section

Note the analogy to the classification of conic sections by eccentricity: circles \varepsilon = 0, ellipses 0 < \varepsilon < 1, parabolas \varepsilon = 1, hyperbolas \varepsilon > 1.
Variance-to-mean ratio: The variance-to-mean ratio classifies several important families of discrete probability distributions: the constant distribution as circular (eccentricity 0), binomial distributions as elliptical, Poisson distributions as parabolic, and negative binomial distributions as hyperbolic. This is elaborated at cumulants of some discrete probability distributions.

### Harold Hotelling

Stephen Stigler has said that the name cumulant was suggested to Fisher in a letter from Harold Hotelling.
Hotelling suggested that Fisher use the English word "cumulants" for Thiele's Danish "semi-invariants".

### Thorvald N. Thiele

Cumulants were first introduced in 1889 by Thorvald N. Thiele, who called them semi-invariants.
Thiele was the first to propose a mathematical theory of Brownian motion, introduced cumulants and likelihood functions, and was considered by Ronald Fisher to be one of the greatest statisticians of all time.

### Ursell function

In statistical mechanics, cumulants are also known as Ursell functions, after a 1927 publication by Harold Ursell.
In statistical mechanics, an Ursell function or connected correlation function is a cumulant of a random variable.

### Standardized moment

Cumulants can be written as functions of the standardized central moments.
For skewness and kurtosis, alternative definitions exist, which are based on the third and fourth cumulant respectively.

### Central moment

The first cumulant is the mean, the second cumulant is the variance, and the third cumulant is the same as the third central moment.
A related functional that shares the translation-invariance and homogeneity properties with the nth central moment, but continues to have this additivity property even when n ≥ 4, is the nth cumulant \kappa_n(X).
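A sketch of this additivity (plain Python with exact fractions; the two small pmfs are hypothetical): for independent X and Y, \kappa_4(X+Y) = \kappa_4(X) + \kappa_4(Y), whereas the fourth central moment is not additive. Here \kappa_4 = \mu_4 - 3\mu_2^2.

```python
# Sketch: the fourth cumulant kappa_4 = mu_4 - 3*mu_2**2 is additive over
# independent sums; the fourth central moment mu_4 is not.
from fractions import Fraction as F
from itertools import product

X = {0: F(1, 2), 1: F(1, 4), 2: F(1, 4)}   # hypothetical pmf of X
Y = {0: F(1, 3), 2: F(2, 3)}               # hypothetical pmf of Y

def central_moment(pmf, n):
    mean = sum(p * x for x, p in pmf.items())
    return sum(p * (x - mean)**n for x, p in pmf.items())

def kappa4(pmf):
    return central_moment(pmf, 4) - 3 * central_moment(pmf, 2)**2

# Exact distribution of the independent sum X + Y (discrete convolution).
Z = {}
for (x, px), (y, py) in product(X.items(), Y.items()):
    Z[x + y] = Z.get(x + y, F(0)) + px * py

assert kappa4(Z) == kappa4(X) + kappa4(Y)                        # cumulants add
assert central_moment(Z, 4) != central_moment(X, 4) + central_moment(Y, 4)
```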

### Bell polynomials

where B_{n,k} are incomplete (or partial) Bell polynomials.
The complete Bell polynomial B_n(\kappa_1, \ldots, \kappa_n) is the nth raw moment of a probability distribution whose first n cumulants are \kappa_1, \ldots, \kappa_n.

### Moment-generating function

The cumulant-generating function K(t) is the natural logarithm of the moment-generating function: K(t) = \log M(t) = \log \operatorname{E}\left[e^{tX}\right].
Cumulant-generating function: The cumulant-generating function is defined as the logarithm of the moment-generating function; some instead define the cumulant-generating function as the logarithm of the characteristic function, while others call this latter the second cumulant-generating function.

### Edgeworth series

A distribution with given cumulants can be approximated through an Edgeworth series.
The Gram–Charlier A series (named in honor of Jørgen Pedersen Gram and Carl Charlier), and the Edgeworth series (named in honor of Francis Ysidro Edgeworth) are series that approximate a probability distribution in terms of its cumulants.
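As a sketch of the leading correction term (plain Python; `edgeworth1` is an illustrative helper, not a library function): for a standardized density, the first Edgeworth correction uses the third cumulant, f(x) \approx \varphi(x)\,(1 + \tfrac{\kappa_3}{6} He_3(x)), where He_3(x) = x^3 - 3x is the third probabilist's Hermite polynomial:

```python
# Sketch: one-term Edgeworth approximation around the standard normal density.
from math import exp, pi, sqrt

def phi(x):
    """Standard normal density."""
    return exp(-x * x / 2) / sqrt(2 * pi)

def edgeworth1(x, kappa3):
    """One-term Edgeworth approximation for a zero-mean, unit-variance law."""
    he3 = x**3 - 3 * x                      # probabilist's Hermite polynomial He_3
    return phi(x) * (1 + kappa3 / 6 * he3)

# With kappa3 = 0 the correction vanishes and the normal density is returned.
assert edgeworth1(1.0, 0.0) == phi(1.0)
```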

### Anders Hald

The history of cumulants is discussed by Anders Hald.
Anders Hald, "The early history of the cumulants and the Gram–Charlier series", International Statistical Review, 68(2) (2000): 137–153.

### Law of total variance

The law of total expectation and the law of total variance generalize naturally to conditional cumulants.
For higher cumulants, a generalization known as the law of total cumulance exists.
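The n = 2 case of these conditional-cumulant laws is the familiar identity \operatorname{Var}(X) = \operatorname{E}[\operatorname{Var}(X\mid Y)] + \operatorname{Var}(\operatorname{E}[X\mid Y]). A sketch with a small hypothetical two-group mixture, checked exactly with fractions:

```python
# Sketch: exact check of Var(X) = E[Var(X|Y)] + Var(E[X|Y]) for a tiny mixture.
from fractions import Fraction as F

# Y picks a group; within each group X has a small conditional pmf (hypothetical).
groups = {  # y: (P(Y = y), conditional pmf of X given Y = y)
    'a': (F(1, 3), {0: F(1, 2), 2: F(1, 2)}),
    'b': (F(2, 3), {1: F(3, 4), 5: F(1, 4)}),
}

def mean(pmf):
    return sum(p * x for x, p in pmf.items())

def var(pmf):
    m = mean(pmf)
    return sum(p * (x - m)**2 for x, p in pmf.items())

# Marginal pmf of X.
marginal = {}
for py, pmf in groups.values():
    for x, px in pmf.items():
        marginal[x] = marginal.get(x, F(0)) + py * px

e_cond_var = sum(py * var(pmf) for py, pmf in groups.values())
cond_means = {y: mean(pmf) for y, (py, pmf) in groups.items()}
var_cond_mean = var({cond_means[y]: py for y, (py, pmf) in groups.items()})

assert var(marginal) == e_cond_var + var_cond_mean
```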

### Umbral calculus

Further connection between cumulants and combinatorics can be found in the work of Gian-Carlo Rota and Jianhong (Jackie) Shen, where links to invariant theory, symmetric functions, and binomial sequences are studied via umbral calculus.
Rota later applied umbral calculus extensively in his paper with Shen to study the various combinatorial properties of the cumulants.

### Cornish–Fisher expansion

The Cornish–Fisher expansion is an asymptotic expansion used to approximate the quantiles of a probability distribution based on its cumulants.
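A sketch of the first-order expansion (plain Python using the standard library's `statistics.NormalDist`; `cornish_fisher_quantile` is an illustrative helper): the normal quantile z_q is adjusted by the skewness \gamma_1 = \kappa_3/\kappa_2^{3/2} via w = z + (z^2 - 1)\gamma_1/6:

```python
# Sketch: one-term Cornish-Fisher quantile approximation (skewness correction).
from statistics import NormalDist

def cornish_fisher_quantile(q, mean, sd, gamma1):
    """First-order Cornish-Fisher approximation to the q-quantile."""
    z = NormalDist().inv_cdf(q)
    w = z + (z * z - 1) * gamma1 / 6    # skewness correction to the normal quantile
    return mean + sd * w

# With zero skewness this reduces to the normal quantile itself.
assert cornish_fisher_quantile(0.975, 0.0, 1.0, 0.0) == NormalDist().inv_cdf(0.975)
```

Positive skewness pushes upper-tail quantiles outward, as expected for a right-skewed distribution.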