94 Related Articles

Statistics

statistical, statistical analysis, statistician
In statistics, a statistic is sufficient with respect to a statistical model and its associated unknown parameter if "no other statistic that can be calculated from the same sample provides any additional information as to the value of the parameter".
He originated the concepts of sufficiency, ancillary statistics, Fisher's linear discriminator and Fisher information.

Kolmogorov structure function

The Kolmogorov structure function deals with individual finite data strings; the related notion there is the algorithmic sufficient statistic.
The main properties of an algorithmic sufficient statistic are stated for a statistic S that is algorithmically sufficient for a given string x.

Statistic

sample statistic, empirical, measure
Important potential properties of statistics include completeness, consistency, sufficiency, unbiasedness, minimum mean square error, low variance, robustness, and computational convenience.

Completeness (statistics)

complete, completeness, boundedly complete
If a minimal sufficient statistic exists, as is usually the case, then every complete sufficient statistic is necessarily minimal sufficient. (This statement does not exclude the pathological case in which a complete sufficient statistic exists while no minimal sufficient statistic does.)
It is closely related to the idea of identifiability, but in statistical theory it is often found as a condition imposed on a sufficient statistic from which certain optimality results are derived.
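As a concrete illustration of the definition (a standard textbook argument, not taken from the text above): for n i.i.d. Bernoulli(p) trials, the sufficient statistic T = X_1 + … + X_n is complete, because E_p[g(T)] = 0 for every p forces g ≡ 0:

```latex
E_p[g(T)] \;=\; \sum_{t=0}^{n} g(t)\binom{n}{t}p^{t}(1-p)^{n-t}
\;=\; (1-p)^{n}\sum_{t=0}^{n} g(t)\binom{n}{t}\left(\frac{p}{1-p}\right)^{t} \;=\; 0
\quad\text{for all } p\in(0,1).
```

A polynomial in p/(1-p) that vanishes on an interval has all coefficients zero, so g(t) times the binomial coefficient vanishes for every t, i.e. g ≡ 0 and T is complete.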

Ronald Fisher

Fisher, R.A. Fisher, R. A. Fisher
The concept is due to Sir Ronald Fisher in 1920.
He defined a statistic as sufficient with respect to a statistical model and its associated unknown parameter if "no other statistic that can be calculated from the same sample provides any additional information as to the value of the parameter".

Normal distribution

normally distributed, normal, Gaussian
For example, for a Gaussian distribution with unknown mean and variance, the jointly sufficient statistic, from which maximum likelihood estimates of both parameters can be estimated, consists of two functions, the sum of all data points and the sum of all squared data points (or equivalently, the sample mean and sample variance).
The sample mean is complete and sufficient for μ, and therefore, by the Lehmann–Scheffé theorem, it is the uniformly minimum-variance unbiased (UMVU) estimator.
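The data reduction described above can be sketched in plain Python; the function names here are illustrative, not from any library:

```python
def gaussian_sufficient_stats(xs):
    """Reduce a sample to the jointly sufficient statistic:
    (n, sum of data points, sum of squared data points)."""
    return len(xs), sum(xs), sum(x * x for x in xs)

def gaussian_mle(n, s1, s2):
    """Recover the maximum likelihood estimates of mean and variance
    from the sufficient statistic alone; the raw data are not needed."""
    mean = s1 / n
    var = s2 / n - mean * mean   # biased (MLE) variance
    return mean, var

xs = [2.0, 4.0, 4.0, 6.0]
n, s1, s2 = gaussian_sufficient_stats(xs)
mean, var = gaussian_mle(n, s1, s2)
```

Any other likelihood-based estimate for the Gaussian model can likewise be computed from (n, s1, s2) alone, which is exactly what joint sufficiency asserts.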

Minimum-variance unbiased estimator

best unbiased estimator, minimum variance unbiased estimator, uniformly minimum variance unbiased
In fact, the minimum-variance unbiased estimator (MVUE) for θ is
Using the Rao–Blackwell theorem one can also prove that determining the MVUE is simply a matter of finding a complete sufficient statistic for the family and conditioning any unbiased estimator on it.

Sufficient dimension reduction

In statistics, sufficient dimension reduction (SDR) is a paradigm for analyzing data that combines the ideas of dimension reduction with the concept of sufficiency.

Rao–Blackwell theorem

Rao–Blackwell, Rao–Blackwell procedure
Sufficiency finds a useful application in the Rao–Blackwell theorem, which states that if g(X) is any kind of estimator of a parameter θ, then the conditional expectation of g(X) given a sufficient statistic T(X) is typically a better estimator of θ, and is never worse.
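A minimal simulation sketch of this conditioning step, using Bernoulli(p) data: the crude unbiased estimator X_1 is conditioned on the sufficient statistic T = ΣX_i, which by symmetry gives the sample mean (the names and parameter values are illustrative):

```python
import random
from statistics import fmean, pvariance

def crude(xs):
    # unbiased for p, but throws away everything after the first observation
    return xs[0]

def rao_blackwellized(xs):
    # E[crude(X) | T = sum(xs)]: by symmetry of i.i.d. draws, the sample mean
    return sum(xs) / len(xs)

rng = random.Random(42)
p, n, reps = 0.3, 20, 4000
crude_vals, rb_vals = [], []
for _ in range(reps):
    xs = [1 if rng.random() < p else 0 for _ in range(n)]
    crude_vals.append(crude(xs))
    rb_vals.append(rao_blackwellized(xs))
# Both estimators are unbiased for p; the Rao-Blackwellized
# one has markedly smaller variance, as the theorem guarantees.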

Basu's theorem

Basu's theorem on independence of complete sufficient and ancillary statistics
In statistics, Basu's theorem states that any boundedly complete sufficient statistic is independent of any ancillary statistic.
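A simulation sketch consistent with Basu's theorem, for the normal location family with known variance: the sample mean is a boundedly complete sufficient statistic for the mean, the sample variance is ancillary, so the two should be independent and their empirical correlation near zero (the setup is illustrative):

```python
import random
from statistics import fmean, pvariance

rng = random.Random(7)
n, reps = 5, 20000
means, variances = [], []
for _ in range(reps):
    xs = [rng.gauss(0.0, 1.0) for _ in range(n)]
    m = fmean(xs)
    means.append(m)
    variances.append(sum((x - m) ** 2 for x in xs) / (n - 1))

# empirical correlation between the sample mean and the sample variance
mu_m, mu_v = fmean(means), fmean(variances)
cov = fmean([(a - mu_m) * (b - mu_v) for a, b in zip(means, variances)])
corr = cov / (pvariance(means) * pvariance(variances)) ** 0.5
```

Note this demonstrates consistency with independence, not a proof; the proof is Basu's theorem itself.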

Lehmann–Scheffé theorem

This is the sample maximum, scaled to correct for the bias, and it is the MVUE by the Lehmann–Scheffé theorem.
The theorem states that any estimator which is unbiased for a given unknown quantity and that depends on the data only through a complete, sufficient statistic is the unique best unbiased estimator of that quantity.

Sample maximum and minimum

sample maximum, sample minimum, maximum
If X_1, ..., X_n are independent and uniformly distributed on the interval [0, θ], then T(X) = max(X_1, ..., X_n) is sufficient for θ: the sample maximum is a sufficient statistic for the population maximum.
For sampling without replacement from a discrete uniform distribution with one or two unknown endpoints (so 1, 2, ..., N with N unknown, or with both M and N unknown), the sample maximum, or respectively the sample maximum and sample minimum together, is a complete sufficient statistic for the unknown endpoints; thus an unbiased estimator derived from it is the UMVU estimator.

Exponential family

exponential families, natural parameters, natural statistics
According to the Pitman–Koopman–Darmois theorem, among families of probability distributions whose domain does not vary with the parameter being estimated, only in exponential families is there a sufficient statistic whose dimension remains bounded as sample size increases.
Note that the dimension k of the random variable need not match the dimension d of the parameter vector, nor (in the case of a curved exponential family) the dimension s of the natural parameter and sufficient statistic T(x).
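The mechanism behind the Pitman–Koopman–Darmois theorem is visible in the exponential family form itself; sketched here for the one-parameter case:

```latex
f(x \mid \theta) = h(x)\,\exp\!\big(\eta(\theta)\,T(x) - A(\theta)\big)
\quad\Longrightarrow\quad
\prod_{i=1}^{n} f(x_i \mid \theta)
  = \Big(\prod_{i=1}^{n} h(x_i)\Big)\,
    \exp\!\Big(\eta(\theta)\sum_{i=1}^{n} T(x_i) - n\,A(\theta)\Big).
```

By the factorization theorem, the sum of the T(x_i) is sufficient, and its dimension stays fixed no matter how large the sample size n grows.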

Poisson distribution

Poisson, Poisson-distributed, Poissonian
If X_1, ..., X_n are independent and have a Poisson distribution with parameter λ, then the sum T(X) = X_1 + ... + X_n is a sufficient statistic for λ.
To prove sufficiency we may use the factorization theorem.
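Sufficiency can also be checked numerically: the conditional distribution of the sample given the sum must not depend on λ. A small sketch in plain Python, using exact pmf arithmetic (no libraries assumed beyond the standard math module):

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    return exp(-lam) * lam ** x / factorial(x)

def conditional_prob(xs, lam):
    """P(X_1, ..., X_n = xs | sum = T) for i.i.d. Poisson(lam):
    the joint pmf divided by the pmf of the sum, which is Poisson(n * lam)."""
    n, t = len(xs), sum(xs)
    joint = 1.0
    for x in xs:
        joint *= poisson_pmf(x, lam)
    return joint / poisson_pmf(t, n * lam)

xs = (1, 2, 3)
p1 = conditional_prob(xs, 1.0)   # conditional probability under lambda = 1
p5 = conditional_prob(xs, 5.0)   # the same conditional probability under lambda = 5
```

The two values agree, and both equal the multinomial probability 6!/(1!·2!·3!)·(1/3)^6, which is exactly what the factorization theorem predicts: given T, the sample tells us nothing more about λ.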

Ancillary statistic

conditions on ancillary information
Given a statistic T that is not sufficient, an ancillary complement is a statistic U that is ancillary and such that (T, U) is sufficient.

Statistical model

model, probabilistic model, statistical modeling

Parameter

parameters, parametric, parametrization

Sample (statistics)

sample, samples, statistical sample

Parametric family

parameterized family, parametrized family, family
In particular, a statistic is sufficient for a family of probability distributions if the sample from which it is calculated gives no more information than the statistic itself as to which of those probability distributions is the distribution of the population from which the sample was taken.

Probability distribution

distribution, continuous probability distribution, discrete probability distribution

Descriptive statistics

descriptive, descriptive statistic, statistics
Stephen Stigler noted in 1973 that the concept of sufficiency had fallen out of favor in descriptive statistics because of its strong dependence on an assumption of the distributional form (see the Pitman–Koopman–Darmois theorem), but it remained very important in theoretical work.

Independent and identically distributed random variables

independent and identically distributed, i.i.d., iid
Roughly, given a set X of independent, identically distributed data conditioned on an unknown parameter θ, a sufficient statistic is a function whose value contains all the information needed to compute any estimate of the parameter (e.g., a maximum likelihood estimate).
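The claim that the sufficient statistic carries all the information needed for estimation can be illustrated with Bernoulli data: two different samples with the same T = Σx_i yield identical likelihood functions, hence identical maximum likelihood estimates (a toy sketch; names are illustrative):

```python
from math import log

def bernoulli_loglik(xs, p):
    # computed point by point from the raw data, not from T
    return sum(log(p) if x == 1 else log(1 - p) for x in xs)

# two different samples sharing the sufficient statistic T = 2, n = 4
a = [1, 1, 0, 0]
b = [0, 1, 0, 1]
# their likelihood functions coincide at every p, so any likelihood-based
# estimate (here the MLE is T/n = 0.5) is the same for both samples
```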

Euclidean vector

vector, vectors, vector addition
More generally, the "unknown parameter" may represent a vector of unknown quantities or may represent everything about the model that is unknown or not fully specified.

Mean

mean value, population mean, average

Sample mean and covariance

sample mean, sample covariance, sample covariance matrix