Consistent estimator
In statistics, a consistent estimator or asymptotically consistent estimator is an estimator—a rule for computing estimates of a parameter θ₀—having the property that as the number of data points used increases indefinitely, the resulting sequence of estimates converges in probability to θ₀.
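Written out, with Tₙ denoting the estimate computed from the first n data points (the notation used for estimators later on this page), this is the standard convergence-in-probability condition:

    \forall \varepsilon > 0: \qquad \lim_{n \to \infty} \Pr\bigl(\, |T_n - \theta_0| > \varepsilon \,\bigr) = 0.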
Related Articles
Estimator
The attractiveness of different estimators can be judged by looking at their properties, such as unbiasedness, mean square error, consistency, and asymptotic distribution. The construction and comparison of estimators are the subjects of estimation theory.
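One standard fact connecting these properties, added here for orientation: mean square error decomposes into variance plus squared bias, and an estimator whose mean square error tends to zero is automatically consistent, by Markov's inequality applied to (Tₙ − θ₀)²:

    \operatorname{MSE}(T_n) = \operatorname{Var}(T_n) + \operatorname{Bias}(T_n)^2,
    \qquad
    \operatorname{MSE}(T_n) \to 0 \;\Longrightarrow\; T_n \xrightarrow{p} \theta_0.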
Statistics
While the tools of data analysis work best on data from randomized studies, they are also applied to other kinds of data—such as natural experiments and observational studies—for which a statistician would use a modified, more structured estimation method (e.g., difference-in-differences estimation and instrumental variables, among many others) that produces consistent estimators.
Convergence of random variables
For example, an estimator is called consistent if it converges in probability to the quantity being estimated.
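A minimal Monte Carlo sketch of this definition using NumPy (the exponential distribution, the tolerance ε = 0.1, and the sample sizes are illustrative choices, not from the source): it estimates Pr(|Tₙ − μ| > ε) for the sample mean Tₙ at increasing n, and the estimated probability should shrink toward zero.

    import numpy as np

    # Monte Carlo check of convergence in probability for the sample mean.
    rng = np.random.default_rng(0)
    mu = 1.0          # true mean of the Exponential(1) distribution
    eps = 0.1         # tolerance in the convergence-in-probability definition
    n_trials = 1_000  # independent replications per sample size

    for n in [10, 100, 1_000, 10_000]:
        # n_trials independent samples of size n; one sample mean per row
        samples = rng.exponential(scale=mu, size=(n_trials, n))
        t_n = samples.mean(axis=1)
        # fraction of trials where the estimate misses mu by more than eps
        p_exceed = np.mean(np.abs(t_n - mu) > eps)
        print(f"n={n:>6}: estimated Pr(|T_n - mu| > {eps}) = {p_exceed:.3f}")

Typical output shows the estimated probability decreasing toward zero as n increases, which is exactly the consistency property.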
Normal distribution
Suppose one has a sequence of observations {X₁, X₂, ...} from a normal N(μ, σ²) distribution.
From the standpoint of asymptotic theory, the sample mean is consistent; that is, it converges in probability to μ as n → ∞.
Bias of an estimator
Bias is related to consistency in that consistent estimators are convergent and asymptotically unbiased (hence converge to the correct value as the number of data points grows arbitrarily large), though individual estimators in a consistent sequence may be biased, so long as the bias converges to zero; see bias versus consistency.
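The textbook instance of "biased but consistent" is the uncorrected sample variance (it reappears under Variance and Bessel's correction below): it underestimates σ² at every finite n, but the bias is of order 1/n and vanishes in the limit:

    \hat\sigma_n^2 = \frac{1}{n} \sum_{i=1}^{n} (x_i - \bar{x}_n)^2,
    \qquad
    \operatorname{E}\bigl[\hat\sigma_n^2\bigr] = \frac{n-1}{n}\,\sigma^2,
    \qquad
    \operatorname{Bias} = -\frac{\sigma^2}{n} \to 0.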
Extremum estimator
* If an estimator Tₙ is defined implicitly, for example as a value that maximizes a certain objective function (see extremum estimator), then a more complicated argument involving stochastic equicontinuity has to be used.
If these conditions are satisfied, then Tₙ is consistent for θ₀.
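The conditions themselves are not reproduced in this excerpt. For orientation only, one standard set of sufficient conditions for the consistency of an extremum estimator θ̂ₙ = argmax over θ ∈ Θ of Q̂ₙ(θ) is the following (our formulation, after the usual textbook statement, e.g. Newey and McFadden; it need not match the elided conditions word for word):

    \text{(i) } \Theta \text{ is compact;} \quad
    \text{(ii) } Q_0 \text{ is continuous and uniquely maximized at } \theta_0; \quad
    \text{(iii) } \sup_{\theta \in \Theta} \bigl| \hat{Q}_n(\theta) - Q_0(\theta) \bigr| \xrightarrow{p} 0.

Under (i)-(iii), θ̂ₙ converges in probability to θ₀.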
Standard deviation
Important examples include the sample variance and sample standard deviation.
The uncorrected sample standard deviation is a consistent estimator (it converges in probability to the population value as the number of samples goes to infinity), and is the maximum-likelihood estimate when the population is normally distributed.
Variance
The simplest estimators for the population mean and population variance are simply the mean and variance of the sample: the sample mean and the (uncorrected) sample variance. These are consistent estimators (they converge to the correct value as the number of samples increases), but can be improved.
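A minimal simulation sketch of both claims using NumPy (normal data with σ² = 4; the sample sizes and trial count are illustrative): the uncorrected variance is visibly biased downward at small n, the Bessel-corrected version is not, and both concentrate at the true value as n grows.

    import numpy as np

    # Corrected vs. uncorrected sample variance (illustrative parameters).
    rng = np.random.default_rng(1)
    true_var = 4.0    # normal data with standard deviation 2
    n_trials = 2_000  # independent replications per sample size

    for n in [5, 50, 500, 5_000]:
        x = rng.normal(loc=0.0, scale=2.0, size=(n_trials, n))
        var_mle = x.var(axis=1, ddof=0)     # divide by n (uncorrected)
        var_bessel = x.var(axis=1, ddof=1)  # divide by n - 1 (Bessel)
        print(f"n={n:>5}: uncorrected mean = {var_mle.mean():.3f}, "
              f"corrected mean = {var_bessel.mean():.3f}  (true = {true_var})")

In typical output the uncorrected average sits near (4/5)·σ² = 3.2 at n = 5 while the corrected one is near 4; by n = 5 000 the two are practically indistinguishable, which is the consistency claim.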
Fisher consistency
The term consistency in statistics usually refers to an estimator that is asymptotically consistent.
Efficient estimator
Sample size determination
In practice one constructs an estimator as a function of an available sample of size n, and then imagines being able to keep collecting data and expanding the sample ad infinitum.
Parametric model
Suppose {pθ : θ ∈ Θ} is a family of distributions (the parametric model), and Xθ = {X₁, X₂, … : Xᵢ ~ pθ} is an infinite sample from the distribution pθ.
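In this notation, consistency reads as follows, where g(θ) denotes the quantity being estimated (notation introduced here; taking g(θ) = θ gives the case in the lead) and Tₙ is computed from the first n coordinates of Xθ:

    T_n\bigl(X^{\theta}\bigr) \xrightarrow{p} g(\theta) \quad \text{for every } \theta \in \Theta,
    \text{ i.e. } \lim_{n \to \infty} \Pr\bigl(\, \bigl|T_n(X^{\theta}) - g(\theta)\bigr| > \varepsilon \,\bigr) = 0 \ \text{ for all } \varepsilon > 0.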
Sample (statistics)

Sample mean and covariance
To estimate μ based on the first n observations, one can use the sample mean: Tₙ = (X₁ + ... + Xₙ)/n.
Sampling distribution
From the properties of the normal distribution, we know the sampling distribution of this statistic: Tₙ is itself normally distributed, with mean μ and variance σ²/n.
Cumulative distribution function
Therefore, the sequence Tₙ of sample means is consistent for the population mean μ (recalling that Φ is the cumulative distribution function of the standard normal distribution).
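The calculation behind this conclusion can be made explicit: standardizing Tₙ ~ N(μ, σ²/n) gives, for any ε > 0,

    \Pr\bigl(\, |T_n - \mu| \ge \varepsilon \,\bigr) = 2\Bigl( 1 - \Phi\Bigl( \frac{\sqrt{n}\,\varepsilon}{\sigma} \Bigr) \Bigr) \longrightarrow 0 \quad (n \to \infty),

since Φ(t) → 1 as t → ∞.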

Stochastic equicontinuity
Consistency arguments for estimators defined implicitly, as maximizers of an objective function, involve stochastic equicontinuity; see the Extremum estimator entry above.
Law of large numbers
By the law of large numbers, the sample mean of an i.i.d. sample converges in probability to the population mean whenever that mean exists; this is the basic result behind the consistency of the sample mean in the examples above.


Independent and identically distributed random variables
For example, for an i.i.d. sample {x₁, ..., xₙ} one can use Tₙ(X) = xₙ, the last observation, as an estimator of the mean E[x].
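This Tₙ is unbiased, E[Tₙ] = E[x], but it is not consistent: its sampling distribution is the underlying distribution itself, identical for every n, so the probability in the consistency definition stays at a fixed positive level (for any non-degenerate distribution and small enough ε) rather than shrinking:

    T_n(X) = x_n \sim p \ \text{ for every } n,
    \qquad
    \Pr\bigl(\, |T_n - \operatorname{E}[x]| > \varepsilon \,\bigr) = c_{\varepsilon} \not\to 0.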
Bessel's correction
Without Bessel's correction (that is, when using the sample size n instead of the degrees of freedom n-1), these are both negatively biased but consistent estimators.
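Concretely, the two variance estimators differ only in the denominator, and since n/(n − 1) → 1 the correction changes the bias but not the limit, so both are consistent for σ²:

    s_n^2 = \frac{1}{n} \sum_{i=1}^{n} (x_i - \bar{x})^2,
    \qquad
    s^2 = \frac{1}{n-1} \sum_{i=1}^{n} (x_i - \bar{x})^2 = \frac{n}{n-1}\, s_n^2.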
Degrees of freedom (statistics)
Statistical hypothesis testing