Completeness (statistics)

In statistics, completeness is a property of a statistic in relation to a model for a set of observed data.
Related Articles

Statistic

In statistics, completeness is a property of a statistic in relation to a model for a set of observed data. Say T is a statistic; that is, the composition of a measurable function with a random sample X_1, ..., X_n.
Important potential properties of statistics include completeness, consistency, sufficiency, unbiasedness, minimum mean square error, low variance, robustness, and computational convenience.
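For reference, the standard definition can be stated compactly (the notation {P_θ : θ ∈ Θ} for the parametric family follows the setup used elsewhere on this page): a statistic T is complete for the family if, for every measurable function g,

\[ \operatorname{E}_\theta[g(T)] = 0 \ \text{ for all } \theta \in \Theta \quad \Longrightarrow \quad P_\theta\big(g(T) = 0\big) = 1 \ \text{ for all } \theta \in \Theta . \]

T is boundedly complete if the same implication is required only for bounded measurable g.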

Normal distribution

This example will show that, in a sample X_1, X_2 of size 2 from a normal distribution with known variance, the statistic X_1 + X_2 is complete and sufficient. Suppose (X_1, X_2) are independent, identically distributed random variables, normally distributed with expectation θ and variance 1.
The sample mean is complete and sufficient for μ, and therefore, by the Lehmann–Scheffé theorem, it is the uniformly minimum variance unbiased (UMVU) estimator of μ.

Basu's theorem

Bounded completeness occurs in Basu's theorem, which states that a statistic that is both boundedly complete and sufficient is independent of any ancillary statistic.
In statistics, Basu's theorem states that any boundedly complete sufficient statistic is independent of any ancillary statistic.
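A textbook illustration of the theorem, not drawn from the excerpts above: for an i.i.d. sample X_1, ..., X_n from N(μ, σ²) with σ² known, the sample mean is complete and sufficient for μ, while the sample variance is ancillary (its distribution does not depend on μ). Basu's theorem therefore gives the classical fact that

\[ \bar{X} = \tfrac{1}{n}\textstyle\sum_{i=1}^{n} X_i \quad \text{and} \quad S^2 = \tfrac{1}{n-1}\textstyle\sum_{i=1}^{n} (X_i - \bar{X})^2 \]

are independent.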

Lehmann–Scheffé theorem

Completeness occurs in the Lehmann–Scheffé theorem, which states that any estimator that is unbiased for a given unknown quantity and depends on the data only through a complete, sufficient statistic is the unique best unbiased estimator of that quantity.
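In symbols (a standard statement of the theorem; the estimator δ and the estimand τ(θ) are notation introduced here, not taken from the excerpt): if T is a complete sufficient statistic and

\[ \operatorname{E}_\theta[\delta(T)] = \tau(\theta) \quad \text{for all } \theta , \]

then δ(T) is, up to almost-sure equality, the unique uniformly minimum-variance unbiased estimator of τ(θ).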

Minimum-variance unbiased estimator

See also minimum-variance unbiased estimator.
Using the Rao–Blackwell theorem one can also prove that determining the MVUE is simply a matter of finding a complete sufficient statistic for the family and conditioning any unbiased estimator on it.
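The conditioning step can be written out explicitly (δ denotes an arbitrary unbiased estimator of τ(θ); the notation is introduced here for illustration):

\[ \delta^{*}(T) = \operatorname{E}[\,\delta \mid T\,], \qquad \operatorname{E}_\theta\big[(\delta^{*}(T) - \tau(\theta))^2\big] \;\le\; \operatorname{E}_\theta\big[(\delta - \tau(\theta))^2\big] \quad \text{for all } \theta . \]

If T is also complete, every unbiased δ leads to the same δ*(T) almost surely, which is why conditioning a single unbiased estimator on a complete sufficient statistic already produces the MVUE.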

Sufficient statistic

Bounded completeness occurs in Basu's theorem, which states that a statistic that is both boundedly complete and sufficient is independent of any ancillary statistic. Completeness is closely related to the idea of identifiability, but in statistical theory it is often found as a condition imposed on a sufficient statistic from which certain optimality results are derived. In the case where there exists at least one minimal sufficient statistic, a statistic which is sufficient and boundedly complete is necessarily minimal sufficient.
If there exists a minimal sufficient statistic, and this is usually the case, then every complete sufficient statistic is necessarily minimal sufficient (note that this statement does not exclude the pathological case in which a complete sufficient statistic exists while there is no minimal sufficient statistic).

Statistics

In statistics, completeness is a property of a statistic in relation to a model for a set of observed data.

Identifiability

Completeness is closely related to the idea of identifiability, but in statistical theory it is often found as a condition imposed on a sufficient statistic from which certain optimality results are derived.

Statistical theory

Completeness is closely related to the idea of identifiability, but in statistical theory it is often found as a condition imposed on a sufficient statistic from which certain optimality results are derived.

Random variable

Consider a random variable X whose probability distribution belongs to a parametric model P_θ parametrized by θ.

Parametric model

Consider a random variable X whose probability distribution belongs to a parametric model P_θ parametrized by θ.

Measurable function

Say T is a statistic; that is, the composition of a measurable function with a random sample X_1, ..., X_n.

Sampling (statistics)

Let X be a random sample of size n such that each X_i has the same Bernoulli distribution with parameter p.

Bernoulli distribution

Let X be a random sample of size n such that each X_i has the same Bernoulli distribution with parameter p.

Binomial distribution

The statistic T = X_1 + ... + X_n of X has a binomial distribution with parameters (n, p). If the parameter space for p is (0, 1), then T is a complete statistic.

Positive real numbers

First, observe that the range of r = p/(1 − p), as p varies over (0, 1), is the positive reals.

Polynomial

Also, E(g(T)) is a polynomial in r and, therefore, can only be identically 0 if all of its coefficients are 0, that is, if g(t) = 0 for all t.
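A sketch of the calculation behind this Bernoulli example, tying together the last few excerpts (writing T = X_1 + ... + X_n and r = p/(1 − p), a standard substitution not spelled out above):

\[ \operatorname{E}_p[g(T)] = \sum_{t=0}^{n} g(t)\binom{n}{t} p^{t}(1-p)^{n-t} = (1-p)^{n} \sum_{t=0}^{n} g(t)\binom{n}{t} r^{t} . \]

If this expectation is 0 for every p in (0, 1), the polynomial in r on the right vanishes on all of (0, ∞), so each coefficient g(t) \binom{n}{t} must be 0; hence g(t) = 0 for t = 0, ..., n and T is complete.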

Independence (probability theory)

Bounded completeness occurs in Basu's theorem, which states that a statistic that is both boundedly complete and sufficient is independent of any ancillary statistic. Suppose (X_1, X_2) are independent, identically distributed random variables, normally distributed with expectation θ and variance 1.

Laplace transform

As a function of θ, this is a two-sided Laplace transform of h(x), and it cannot be identically zero unless h(x) is zero almost everywhere.
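For the normal example above, the Laplace-transform step can be sketched as follows (T = X_1 + X_2 ∼ N(2θ, 2); the function h is introduced here to match the excerpt):

\[ \operatorname{E}_\theta[g(T)] = \int_{-\infty}^{\infty} g(t)\,\frac{1}{2\sqrt{\pi}}\, e^{-(t-2\theta)^2/4}\, dt = \frac{e^{-\theta^{2}}}{2\sqrt{\pi}} \int_{-\infty}^{\infty} h(t)\, e^{\theta t}\, dt , \qquad h(t) = g(t)\, e^{-t^{2}/4} . \]

The last integral is a two-sided Laplace transform of h (evaluated at −θ), so if it vanishes for every θ then h, and hence g, is zero almost everywhere; this is the sense in which X_1 + X_2 is complete.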

Raghu Raj Bahadur

(A case in which there is no minimal sufficient statistic was shown by Bahadur in 1957.)

Convex function

In other words, this statistic has a smaller expected loss for any convex loss function; in many practical applications with the squared loss function, it has a smaller mean squared error than any other estimator with the same expected value.

Expected value

In other words, this statistic has a smaller expected loss for any convex loss function; in many practical applications with the squared loss function, it has a smaller mean squared error than any other estimator with the same expected value.

Ancillary statistic

Bounded completeness occurs in Basu's theorem, which states that a statistic that is both boundedly complete and sufficient is independent of any ancillary statistic.

Necessity and sufficiency

In the case where there exists at least one minimal sufficient statistic, a statistic which is sufficient and boundedly complete is necessarily minimal sufficient.