Bayes estimator

Bayesian, Bayesian decision theory, Bayesian estimator, Bayesian estimation, Bayes risk, asymptotically efficient, Bayes, Bayes action, Bayes response, Bayes rule
In estimation theory and decision theory, a Bayes estimator or a Bayes action is an estimator or decision rule that minimizes the posterior expected value of a loss function (i.e., the posterior expected loss).
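In symbols (a standard formulation, with the notation here chosen for illustration): given data x, a prior π over θ, and a loss function L, the posterior expected loss of an action a and the corresponding Bayes estimator are

    \rho(\pi, a \mid x) = \mathrm{E}\bigl[L(\theta, a) \mid x\bigr] = \int L(\theta, a)\, p(\theta \mid x)\, d\theta,
    \qquad
    \hat{\theta}_{\mathrm{Bayes}}(x) = \arg\min_{a} \rho(\pi, a \mid x).
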
89 Related Articles

Estimation theory

parameter estimation, estimation, estimated
In estimation theory and decision theory, a Bayes estimator or a Bayes action is an estimator or decision rule that minimizes the posterior expected value of a loss function (i.e., the posterior expected loss).

Conjugate prior

conjugate, conjugate distribution, conjugate prior distribution
If there is no inherent reason to prefer one prior probability distribution over another, a conjugate prior is sometimes chosen for simplicity.
The concept, as well as the term "conjugate prior", were introduced by Howard Raiffa and Robert Schlaifer in their work on Bayesian decision theory.
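A standard illustration (not taken from this page's text): the Beta family is conjugate to the binomial likelihood, so the posterior remains in the Beta family,

    \theta \sim \mathrm{Beta}(\alpha, \beta), \quad x \mid \theta \sim \mathrm{Binomial}(n, \theta)
    \;\Longrightarrow\; \theta \mid x \sim \mathrm{Beta}(\alpha + x,\; \beta + n - x).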

Decision rule

rule for making a decision
In estimation theory and decision theory, a Bayes estimator or a Bayes action is an estimator or decision rule that minimizes the posterior expected value of a loss function (i.e., the posterior expected loss).

Maximum a posteriori estimation

maximum a posteriori, MAP, posterior mode
An alternative way of formulating an estimator within Bayesian statistics is maximum a posteriori estimation.
As c goes to 0, the Bayes estimator approaches the MAP estimator, provided that the distribution of θ is quasi-concave.
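As a sketch of what the limit statement refers to (standard definitions; c is the half-width of a windowed 0–1-type loss):

    L_c(\theta, a) = \begin{cases} 0, & |a - \theta| < c \\ 1, & \text{otherwise} \end{cases},
    \qquad
    \hat{\theta}_{\mathrm{MAP}}(x) = \arg\max_{\theta} p(\theta \mid x).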

Admissible decision rule

admissible, admissibility, inadmissible
A decision rule δ that minimizes the Bayes risk r(π, δ) is called a Bayes rule with respect to π.
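In the usual notation, the Bayes risk of a decision rule δ under a prior π is the prior-weighted average of its frequentist risk R(θ, δ), and a Bayes rule minimizes it:

    r(\pi, \delta) = \mathrm{E}_{\pi}\bigl[R(\theta, \delta)\bigr] = \int R(\theta, \delta)\, \pi(\theta)\, d\theta,
    \qquad
    \delta_{\pi} = \arg\min_{\delta} r(\pi, \delta).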

Maximum likelihood estimation

maximum likelihood, maximum likelihood estimator, maximum likelihood estimate
One can estimate the mean and variance of the marginal distribution of the observations using the maximum likelihood approach.
A maximum likelihood estimator coincides with the most probable Bayesian estimator given a uniform prior distribution on the parameters.
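A one-line justification of that coincidence (assuming the flat prior yields a proper posterior):

    p(\theta \mid x) \propto f(x \mid \theta)\, p(\theta) \propto f(x \mid \theta)
    \;\Rightarrow\;
    \arg\max_{\theta} p(\theta \mid x) = \arg\max_{\theta} f(x \mid \theta).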

Empirical Bayes method

empirical Bayes, empirical Bayes methods, empirical Bayesian
A Bayes estimator derived through the empirical Bayes method is called an empirical Bayes estimator.
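A minimal Python sketch of the idea, for a normal–normal model with known sampling variance (the model, function name, and numbers are illustrative assumptions, not this page's worked example): the prior hyperparameters are estimated from the marginal distribution of the data and then plugged into the posterior mean.

import numpy as np

def empirical_bayes_normal(x, sigma2):
    """Parametric empirical Bayes for a normal-normal model (illustrative sketch).

    Assumed model:
        theta_i ~ N(mu, tau2)                -- unknown prior hyperparameters
        x_i | theta_i ~ N(theta_i, sigma2)   -- known sampling variance
    The marginal is x_i ~ N(mu, tau2 + sigma2), which is used to estimate
    mu and tau2 before forming the posterior mean of each theta_i.
    """
    x = np.asarray(x, dtype=float)
    mu_hat = x.mean()                        # marginal mean estimates mu
    marg_var = x.var()                       # marginal variance estimates tau2 + sigma2
    tau2_hat = max(marg_var - sigma2, 0.0)   # method-of-moments estimate of tau2
    shrinkage = tau2_hat / (tau2_hat + sigma2)
    # Posterior mean with the estimated hyperparameters: shrink x_i toward mu_hat.
    return mu_hat + shrinkage * (x - mu_hat)

# Usage: shrink noisy per-group observations toward the overall mean.
rng = np.random.default_rng(0)
true_theta = rng.normal(5.0, 2.0, size=20)
obs = rng.normal(true_theta, 1.0)            # sigma2 = 1
print(empirical_bayes_normal(obs, sigma2=1.0))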

Decision theory

decision science, statistical decision theory, decision sciences
In estimation theory and decision theory, a Bayes estimator or a Bayes action is an estimator or decision rule that minimizes the posterior expected value of a loss function (i.e., the posterior expected loss).

Expected value

expectation, expected, mean
In estimation theory and decision theory, a Bayes estimator or a Bayes action is an estimator or decision rule that minimizes the posterior expected value of a loss function (i.e., the posterior expected loss).

Loss function

objective function, cost function, risk function
In estimation theory and decision theory, a Bayes estimator or a Bayes action is an estimator or decision rule that minimizes the posterior expected value of a loss function (i.e., the posterior expected loss).

Utility

utility function, utility theory, utilities
Equivalently, it maximizes the posterior expectation of a utility function.
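In symbols (a standard identity, assuming the utility is the negative of the loss, U = -L):

    \arg\max_{a} \mathrm{E}\bigl[U(\theta, a) \mid x\bigr] = \arg\min_{a} \mathrm{E}\bigl[L(\theta, a) \mid x\bigr].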

Bayesian statistics

Bayesian, Bayesian methods, Bayesian analysis
An alternative way of formulating an estimator within Bayesian statistics is maximum a posteriori estimation.

Posterior probability

posterior distribution, posterior, posterior probability distribution
In estimation theory and decision theory, a Bayes estimator or a Bayes action is an estimator or decision rule that minimizes the posterior expected value of a loss function (i.e., the posterior expected loss).

Parametric family

parameterized family, family, parametrized family
A conjugate prior is defined as a prior distribution belonging to some parametric family, for which the resulting posterior distribution also belongs to the same family.
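A small Python sketch of conjugate updating in the Beta–binomial case (the class name and example counts are illustrative assumptions): the posterior belongs to the same Beta family as the prior, with updated parameters.

from dataclasses import dataclass

@dataclass
class Beta:
    """Beta(alpha, beta) prior for a Bernoulli success probability."""
    alpha: float
    beta: float

    def update(self, successes: int, failures: int) -> "Beta":
        # Conjugacy: the posterior stays in the Beta family.
        return Beta(self.alpha + successes, self.beta + failures)

    def mean(self) -> float:
        return self.alpha / (self.alpha + self.beta)

# Usage: start from a uniform Beta(1, 1) prior and observe 7 successes, 3 failures.
posterior = Beta(1.0, 1.0).update(successes=7, failures=3)
print(posterior, posterior.mean())   # Beta(8, 4), posterior mean 2/3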

Independent and identically distributed random variables

independent and identically distributed, i.i.d., iid
Let θ be an unknown random variable, and suppose that x_1, x_2, … are iid samples with density f(x_i | θ).

Mean squared error

mean square error, squared error loss, MSE
The most common risk function used for Bayesian estimation is the mean square error (MSE), also called squared error risk.
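Under squared error loss, the posterior expected loss is minimized by the posterior mean, so the Bayes estimator is

    \hat{\theta}(x) = \mathrm{E}[\theta \mid x] = \int \theta\, p(\theta \mid x)\, d\theta.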

Robust statistics

robust, breakdown point, robustness
Other loss functions are used in statistics, particularly in robust statistics.
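For instance (standard results), the posterior median is the Bayes estimator under absolute-value loss, and asymmetric linear losses lead to posterior quantiles:

    L(\theta, a) = |\theta - a| \;\Rightarrow\; \hat{\theta}(x) = \operatorname{median}\bigl(p(\theta \mid x)\bigr).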

Prior probability

prior distribution, prior, prior probabilities
Suppose an unknown parameter θ is known to have a prior distribution π.
Such measures p(θ), which are not probability distributions, are referred to as improper priors.
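A common example of such a measure (a standard illustration): the flat "prior" on the whole real line,

    p(\theta) = 1 \text{ for all } \theta \in \mathbb{R}, \qquad \int_{-\infty}^{\infty} p(\theta)\, d\theta = \infty,

which is not a probability distribution because it does not integrate to one.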

Measure (mathematics)

measure, measure theory, measurable
Such measures p(θ), which are not probability distributions, are referred to as improper priors.

Bayes' theorem

Bayes' rule, Bayes theorem, Bayes's theorem
This is a definition, and not an application of Bayes' theorem, since Bayes' theorem can only be applied when all distributions are proper.
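Concretely, the posterior in that definition is taken to be (a sketch of the usual construction)

    p(\theta \mid x) = \frac{f(x \mid \theta)\, p(\theta)}{\int f(x \mid \theta)\, p(\theta)\, d\theta},

which can still be a proper distribution even when p(θ) is improper, provided the integral in the denominator is finite.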