Invariant estimator

equivariant, equivariant estimation, Invariance, invariant, Pitman estimator
In statistics, the concept of being an invariant estimator is a criterion that can be used to compare the properties of different estimators for the same quantity.
Related Articles

Equivariant map

equivariant, intertwining operator, intertwiner
The term equivariant estimator is used in formal mathematical contexts that include a precise description of the way the estimator changes in response to changes to the dataset and parameterisation: this corresponds to the use of "equivariance" in more general mathematics.
In statistical inference, equivariance under statistical transformations of data is an important property of various estimation methods; see invariant estimator for details.

Estimator

estimators, estimate, estimates
In statistics, the concept of being an invariant estimator is a criterion that can be used to compare the properties of different estimators for the same quantity.

Location parameter

location, location model, shift parameter
Shift invariance: Notionally, estimates of a location parameter should be invariant to simple shifts of the data values. If all data values are increased by a given amount, the estimate should change by the same amount. When considering estimation using a weighted average, this invariance requirement immediately implies that the weights should sum to one. While the same result is often derived from a requirement for unbiasedness, the use of "invariance" does not require that a mean value exists and makes no use of any probability distribution at all.
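A minimal numerical sketch of this point (the data values, weights, and shift below are arbitrary choices for illustration): when the weights sum to one, adding a constant to every observation adds exactly that constant to the weighted average.

```python
import numpy as np

x = np.array([2.0, 3.5, 1.0, 4.2, 0.3])      # illustrative data values
w = np.array([0.1, 0.2, 0.3, 0.25, 0.15])    # weights that sum to one
shift = 3.7                                   # a constant added to every value

est_original = np.dot(w, x)
est_shifted = np.dot(w, x + shift)

# Because the weights sum to one, the estimate changes by exactly `shift`.
assert np.isclose(est_shifted, est_original + shift)
```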

Loss function

objective function, cost function, risk function
The problem is to estimate \theta given x. The estimate, denoted by a, is a function of the measurements and belongs to a set A. The quality of the result is defined by a loss function, which in turn determines a risk function, the expected loss for a given \theta.
Invariance: Choose the optimal decision rule which satisfies an invariance requirement.
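To make the relationship between a loss function and a risk function concrete, the following hedged sketch approximates, by Monte Carlo simulation, the risk of the sample mean under squared-error loss for a normal location model; the model, loss, sample size and replication count are assumptions chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def risk(estimator, theta, n=20, reps=10_000):
    """Monte Carlo approximation of the risk E[L(a, theta) | theta]
    under squared-error loss L(a, theta) = (a - theta) ** 2."""
    draws = rng.normal(loc=theta, scale=1.0, size=(reps, n))
    estimates = estimator(draws)
    return np.mean((estimates - theta) ** 2)

# Risk of the sample mean as an estimator of the location parameter theta.
print(risk(lambda d: d.mean(axis=1), theta=2.0))   # close to 1/20 = 0.05
```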

Statistics

statistical, statistical analysis, statistician
In statistics, the concept of being an invariant estimator is a criterion that can be used to compare the properties of different estimators for the same quantity.

Statistical inference

inference, inferential statistics, inferences
In statistical inference, there are several approaches to estimation theory, each of which can be used to decide immediately which estimators should be used.

Estimation theory

parameter estimation, estimation, estimated
In statistical inference, there are several approaches to estimation theory, each of which can be used to decide immediately which estimators should be used.

Bayesian inference

Bayesian, Bayesian analysis, Bayesian methods
For example, ideas from Bayesian inference would lead directly to Bayesian estimators.

Bayes estimator

Bayesian, Bayesian estimation, Bayes
For example, ideas from Bayesian inference would lead directly to Bayesian estimators.

Statistical model

model, probabilistic model, statistical modeling
However, the usefulness of these theories depends on having a fully prescribed statistical model and may also depend on having a relevant loss function to determine the estimator.

Robust statistics

robust, breakdown point, robustness
In addition to these cases where general theory does not prescribe an estimator, the concept of invariance of an estimator can be applied when seeking estimators of alternative forms, either so that the estimator is simple to apply or so that it is robust.

Bias of an estimator

unbiased, unbiased estimator, bias
For example, a requirement of invariance may be incompatible with the requirement that the estimator be mean-unbiased; on the other hand, the criterion of median-unbiasedness is defined in terms of the estimator's sampling distribution and so is invariant under many transformations.
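A brief sketch of the mechanism behind this contrast (the sample values and the logarithmic transformation are illustrative assumptions): the sample median commutes with an increasing transformation of the data, while the sample mean in general does not, which is why mean-unbiasedness is usually lost under a nonlinear reparameterisation.

```python
import numpy as np

x = np.array([0.5, 1.2, 2.0, 3.3, 9.1])   # odd-sized sample, illustrative values
h = np.log                                # a monotone (increasing) transformation

# The median commutes with monotone transformations ...
assert np.isclose(np.median(h(x)), h(np.median(x)))

# ... whereas the mean does not: these two values differ.
print(np.mean(h(x)), h(np.mean(x)))
```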

Median

average, sample median, median-unbiased estimator
For example, a requirement of invariance may be incompatible with the requirement that the estimator be mean-unbiased; on the other hand, the criterion of median-unbiasedness is defined in terms of the estimator's sampling distribution and so is invariant under many transformations.

Independent and identically distributed random variables

independent and identically distributed, i.i.d., iid
Permutation invariance: Where a set of data values can be represented by a statistical model that they are outcomes from independent and identically distributed random variables, it is reasonable to impose the requirement that any estimator of any property of the common distribution should be permutation-invariant: specifically that the estimator, considered as a function of the set of data-values, should not change if items of data are swapped within the dataset.
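The sketch below illustrates permutation invariance numerically (the exponential sample is an arbitrary choice): symmetric functions of the data, such as the sample mean and sample variance, are unchanged when the observations are reordered.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.exponential(size=10)     # an i.i.d. sample (values are illustrative)
x_perm = rng.permutation(x)      # the same values in a different order

# Permutation-invariant estimators give identical results on both orderings.
assert np.isclose(x.mean(), x_perm.mean())
assert np.isclose(x.var(ddof=1), x_perm.var(ddof=1))
```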

Weighted arithmetic mean

average, average rating, weighted average
Shift invariance: Notionally, estimates of a location parameter should be invariant to simple shifts of the data values. If all data values are increased by a given amount, the estimate should change by the same amount. When considering estimation using a weighted average, this invariance requirement immediately implies that the weights should sum to one. While the same result is often derived from a requirement for unbiasedness, the use of "invariance" does not require that a mean value exists and makes no use of any probability distribution at all.

Scale invariance

scale invariant, scale-invariant, scaling
Scale invariance: Note that this topic, the invariance of an estimator of a scale parameter, is not to be confused with the more general notion of scale invariance concerning the behaviour of systems under changes of scale (as used in physics).
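As a minimal sketch of the estimator-level notion (the sample and the rescaling factor are illustrative assumptions): a typical estimator of a scale parameter, such as the sample standard deviation, rescales by the same factor as the data.

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=50)          # illustrative sample
c = 4.2                          # an arbitrary positive rescaling factor

# Rescaling every observation by c rescales the estimate of scale by c.
assert np.isclose(np.std(c * x, ddof=1), c * np.std(x, ddof=1))
```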

Maximum likelihood estimation

maximum likelihood, maximum likelihood estimator, maximum likelihood estimate
Parameter-transformation invariance: Here, the transformation applies to the parameters alone. The concept here is that essentially the same inference should be made from data and a model involving a parameter θ as would be made from the same data if the model used a parameter φ, where φ is a one-to-one transformation of θ, φ = h(θ). According to this type of invariance, results from transformation-invariant estimators should also be related by φ = h(θ). Maximum likelihood estimators have this property when the transformation is monotonic. Though the asymptotic properties of the estimator might be invariant, the small-sample properties can be different, and a specific distribution needs to be derived.
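A hedged illustration of this property for maximum likelihood (the exponential model and the particular one-to-one map are assumptions used only for the example): the maximum likelihood estimate of a transformed parameter is the transformation of the original maximum likelihood estimate.

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.exponential(scale=2.0, size=200)   # illustrative exponential sample

# MLE of the rate parameter theta of an exponential model: 1 / sample mean.
theta_hat = 1.0 / x.mean()

# Reparameterise with the one-to-one map phi = h(theta) = 1 / theta (the mean).
# Maximising the likelihood in the phi parameterisation gives the sample mean,
# which equals h(theta_hat): the two estimates are related by the same map.
phi_hat = x.mean()
assert np.isclose(phi_hat, 1.0 / theta_hat)
```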

Random variable

random variables, random variation, random
Permutation invariance: Where a set of data values can be represented by a statistical model that they are outcomes from independent and identically distributed random variables, it is reasonable to impose the requirement that any estimator of any property of the common distribution should be permutation-invariant: specifically that the estimator, considered as a function of the set of data-values, should not change if items of data are swapped within the dataset.

Multivariate random variable

random vector, vector, multivariate
The measurements x are modelled as a vector random variable having a probability density function f(x|\theta) which depends on a parameter vector \theta.

Probability density function

probability density, density function, density
The measurements x are modelled as a vector random variable having a probability density function f(x|\theta) which depends on a parameter vector \theta.

Statistical classification

classification, classifier, classifiers
In statistical classification, the rule which assigns a class to a new data-item can be considered to be a special type of estimator.

Prior knowledge for pattern recognition

prior knowledge
A number of invariance-type considerations can be brought to bear in formulating prior knowledge for pattern recognition.

Equivalence class

equivalence classes, quotient set, quotient
Datasets x_1 and x_2 in X are equivalent if x_1=g(x_2) for some g\in G. All the equivalent points form an equivalence class.
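A concrete hedged sketch, taking G to be the group of shifts of the data (an assumption made only for illustration): two datasets lie in the same equivalence class exactly when they differ by a common constant.

```python
import numpy as np

def shift_equivalent(x1, x2, tol=1e-12):
    """Return True if x1 = g(x2) for some g in the shift group
    G = {x -> x + c : c a real constant}, i.e. if the two datasets
    lie in the same equivalence class (orbit) under shifts."""
    diff = np.asarray(x1, dtype=float) - np.asarray(x2, dtype=float)
    return bool(np.all(np.abs(diff - diff[0]) < tol))

print(shift_equivalent([1.0, 2.0, 5.0], [3.5, 4.5, 7.5]))   # True: common shift of 2.5
print(shift_equivalent([1.0, 2.0, 5.0], [1.0, 2.0, 6.0]))   # False: no common shift
```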

Group action

action, orbit, acts
A group of transformations of X, to be denoted by G, is a set of (measurable) 1:1 and onto transformations of X into itself which satisfies the following conditions: if g_1\in G and g_2\in G then g_1 g_2\in G (closure under composition), and if g\in G then g^{-1}\in G, where g^{-1} is defined by g^{-1}(g(x))=x.

Multivariate normal distribution

multivariate normal, bivariate normal distribution, jointly normally distributed
For example, if the measurement errors follow a multivariate normal distribution with independent, unit-variance components, then the best invariant (Pitman) estimator of the location parameter is the sample mean.
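A numerical sketch of that statement (the data, grid, and grid width are illustrative assumptions): evaluating the standard Pitman location-estimator formula on a grid, with a unit-variance normal density, reproduces the sample mean.

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(loc=1.7, scale=1.0, size=6)     # unit-variance normal sample, illustrative

# Pitman (minimum-risk equivariant) estimator of a location parameter:
#   theta_hat = ∫ theta * prod_i f(x_i - theta) dtheta / ∫ prod_i f(x_i - theta) dtheta
# evaluated on a uniform grid with f the unit-variance normal density.
theta = np.linspace(x.mean() - 10.0, x.mean() + 10.0, 20001)
log_lik = -0.5 * ((x[:, None] - theta) ** 2).sum(axis=0)
w = np.exp(log_lik - log_lik.max())            # unnormalised likelihood on the grid
pitman = (theta * w).sum() / w.sum()           # the grid spacing cancels in the ratio

# For independent, unit-variance normal components the Pitman estimator
# coincides with the sample mean.
print(pitman, x.mean())                        # the two values agree closely
```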