# Fisher information

**Fisher information matrix, information matrix, information, singular statistical model, expected information matrix, Fisher's information, observed information per observation**

In mathematical statistics, the Fisher information (sometimes simply called information) is a way of measuring the amount of information that an observable random variable X carries about an unknown parameter θ of a distribution that models X.
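As a concrete sketch of this definition (a hypothetical Bernoulli example, not taken from the article): for a one-parameter model, the Fisher information is the variance of the score, the derivative of the log-likelihood with respect to the parameter.

```python
# Hypothetical illustration: Fisher information of a Bernoulli(p) model,
# computed exactly as the variance of the score d/dp log f(x; p).
def bernoulli_fisher_info(p: float) -> float:
    score0 = -1.0 / (1.0 - p)  # score at observation x = 0
    score1 = 1.0 / p           # score at observation x = 1
    mean = (1.0 - p) * score0 + p * score1  # E[score] = 0
    # Variance of the score over x ~ Bernoulli(p):
    return (1.0 - p) * (score0 - mean) ** 2 + p * (score1 - mean) ** 2

# Matches the closed form 1 / (p (1 - p)):
print(bernoulli_fisher_info(0.3))  # 1/(0.3 * 0.7) ≈ 4.7619
```

The variance computed this way agrees term by term with the closed form 1/(p(1−p)), since E[score²] = p·(1/p)² + (1−p)·(1/(1−p))².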

## 144 Related Articles

### Observed information

**observed information matrix, observed**

Formally, it is the variance of the score, or the expected value of the observed information.

It is a sample-based version of the Fisher information.
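Since the observed information is the negative second derivative of the log-likelihood evaluated on the data, it can be sketched numerically with a finite difference (a hypothetical Bernoulli example; the model and sample are illustrative assumptions):

```python
import math

def log_lik(p, data):
    # Bernoulli log-likelihood for an observed 0/1 sample (example model).
    k, n = sum(data), len(data)
    return k * math.log(p) + (n - k) * math.log(1 - p)

def observed_information(p, data, h=1e-5):
    # Observed information: negative second derivative of the log-likelihood,
    # estimated from the sample via a central finite difference.
    return -(log_lik(p + h, data) - 2 * log_lik(p, data) + log_lik(p - h, data)) / h**2

data = [1, 0, 1, 1, 0, 1, 0, 1]           # 5 successes out of 8
p_hat = sum(data) / len(data)             # MLE = 0.625
print(observed_information(p_hat, data))  # ≈ n / (p_hat (1 - p_hat)) ≈ 34.13
```

For this model the exact observed information at p̂ is k/p̂² + (n−k)/(1−p̂)², which equals n/(p̂(1−p̂)) at the MLE, so the finite-difference value can be checked against it.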

### Jeffreys prior

**following Jeffreys, Jeffreys' prior, uninformative**

The Fisher information is also used in the calculation of the Jeffreys prior, which is used in Bayesian statistics.

In Bayesian probability, the Jeffreys prior, named after Sir Harold Jeffreys, is a non-informative (objective) prior distribution for a parameter space; it is proportional to the square root of the determinant of the Fisher information matrix:

$$p(\vec{\theta}\,) \propto \sqrt{\det \mathcal{I}(\vec{\theta}\,)}$$
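For a scalar parameter the determinant is just the Fisher information itself, so the Jeffreys prior is its square root. A minimal sketch for a Bernoulli(p) model (a hypothetical example, where I(p) = 1/(p(1−p))):

```python
import math

def jeffreys_unnormalized(p: float) -> float:
    # Jeffreys prior: square root of the (scalar) Fisher information.
    # For Bernoulli(p), I(p) = 1 / (p (1 - p)).
    fisher_info = 1.0 / (p * (1.0 - p))
    return math.sqrt(fisher_info)

# This is the kernel of a Beta(1/2, 1/2) density:
print(jeffreys_unnormalized(0.5))  # 2.0
```

Up to normalization this is p^(−1/2)(1−p)^(−1/2), i.e. the Beta(1/2, 1/2) distribution, the standard Jeffreys prior for a Bernoulli proportion.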

### Score (statistics)

**score, score function, scoring**

Formally, it is the variance of the score, or the expected value of the observed information.

The latter is known as the Fisher information and is written $\mathcal{I}(\theta)$.

### Cramér–Rao bound

**Cramér–Rao inequality, Cramér–Rao lower bound**

The Cramér–Rao bound states that the inverse of the Fisher information is a lower bound on the variance of any unbiased estimator of θ. H.L. Van Trees (1968) and B. Roy Frieden (2004) provide the following method of deriving the Cramér–Rao bound, a result that describes one use of the Fisher information.

In its simplest form, the bound states that the variance of any unbiased estimator is at least as high as the inverse of the Fisher information.

### B. Roy Frieden

**Frieden, B. Roy**

H.L. Van Trees (1968) and B. Roy Frieden (2004) provide the following method of deriving the Cramér–Rao bound, a result that describes one use of the Fisher information.

Frieden is best known for his extensive work on Fisher information as a grounding principle for deriving and elaborating physical theory.

### Fisher information metric

**Fisher metric, Thermodynamic length**

The topic information geometry uses this to connect Fisher information to differential geometry, and in that context, this metric is known as the Fisher information metric.

Considered purely as a matrix, it is known as the Fisher information matrix.

### Ronald Fisher

**R. A. Fisher, Fisher**

The role of the Fisher information in the asymptotic theory of maximum-likelihood estimation was emphasized by the statistician Ronald Fisher (following some initial results by Francis Ysidro Edgeworth).

### Least squares

**least-squares, method of least squares, least squares method**

In this case the Fisher information matrix may be identified with the coefficient matrix of the normal equations of least squares estimation theory.

Also, by iteratively applying local quadratic approximation to the likelihood (through the Fisher information), the least-squares method may be used to fit a generalized linear model.
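The iteration described here is Fisher scoring, the Newton-like update behind iteratively reweighted least squares. A minimal sketch for the simplest possible GLM (a hypothetical intercept-only Poisson model with log link, y_i ~ Poisson(exp(β))):

```python
import math

def fisher_scoring_poisson(y, beta=0.0, iters=25):
    # Fisher scoring: beta <- beta + score / (Fisher information).
    # For y_i ~ Poisson(exp(beta)):
    #   log-likelihood  l(beta) = beta * sum(y) - n * exp(beta) + const
    #   score           l'(beta) = sum(y) - n * exp(beta)
    #   information     I(beta)  = n * exp(beta)
    n = len(y)
    for _ in range(iters):
        mu = math.exp(beta)
        score = sum(y) - n * mu
        info = n * mu
        beta += score / info  # local quadratic approximation step
    return beta

y = [3, 5, 2, 4, 6]
beta_hat = fisher_scoring_poisson(y)
print(math.exp(beta_hat))  # converges to the sample mean, 4.0
```

For this model the expected and observed information coincide, so Fisher scoring is exactly Newton's method; in general GLMs, replacing the observed Hessian with the Fisher information is what makes each step a weighted least-squares solve.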

### Support curve

Thus, the Fisher information may be seen as the curvature of the support curve (the graph of the log-likelihood).

The function being plotted is used in the computation of the score and Fisher information, and the graph has a direct interpretation in the context of maximum likelihood estimation and likelihood-ratio tests.

### Maximum likelihood estimation

**maximum likelihood, maximum likelihood estimator, maximum likelihood estimate**

The role of the Fisher information in the asymptotic theory of maximum-likelihood estimation was emphasized by the statistician Ronald Fisher (following some initial results by Francis Ysidro Edgeworth).

In the asymptotic distribution of the maximum likelihood estimator, $\sqrt{n}\,(\hat{\theta} - \theta_0) \xrightarrow{d} \mathcal{N}\!\left(0,\; \mathcal{I}(\theta_0)^{-1}\right)$, where $\mathcal{I}(\theta_0)$ is the Fisher information matrix.

### Optimal design

**optimal experimental design, optimal, optimal design of experiments**

Fisher information is widely used in optimal experimental design.

In the estimation theory for statistical models with one real parameter, the reciprocal of the variance of an ("efficient") estimator is called the "Fisher information" for that estimator.

### Multivariate normal distribution

**multivariate normal, bivariate normal distribution, jointly normally distributed**

The FIM for an $N$-variate multivariate normal distribution has a special form.

The Fisher information matrix for estimating the parameters of a multivariate normal distribution has a closed form expression.
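One common statement of that closed form, assuming both the mean $\mu(\theta)$ and the covariance $\Sigma(\theta)$ depend on the parameter vector $\theta$, is:

```latex
% Entry (m, n) of the Fisher information matrix for X ~ N(mu(theta), Sigma(theta)):
\mathcal{I}_{m,n}
  = \frac{\partial \mu^{\mathsf{T}}}{\partial \theta_m}
    \,\Sigma^{-1}\,
    \frac{\partial \mu}{\partial \theta_n}
  + \frac{1}{2}\operatorname{tr}\!\left(
      \Sigma^{-1} \frac{\partial \Sigma}{\partial \theta_m}
      \,\Sigma^{-1} \frac{\partial \Sigma}{\partial \theta_n}
    \right)
```

When $\Sigma$ is known and only $\mu$ depends on $\theta$, the trace term vanishes and the FIM reduces to the familiar weighted least-squares form.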

### Efficiency (statistics)

**efficient, efficiency, inefficient**

The efficiency of an unbiased estimator $T$ of $\theta$ is $e(T) = \dfrac{1/\mathcal{I}(\theta)}{\operatorname{var}(T)}$, where $\mathcal{I}(\theta)$ is the Fisher information of the sample.
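A classic Monte Carlo sketch of efficiency (a hypothetical example): for a normal location parameter, the sample median is unbiased but does not use all of the Fisher information, and its asymptotic efficiency relative to the sample mean is 2/π ≈ 0.64.

```python
import random
import statistics

# Compare the sample mean (efficient) with the sample median for estimating
# the location of N(0, 1).  Asymptotic relative efficiency: 2/pi ≈ 0.64.
random.seed(1)
n, reps = 101, 4000

means, medians = [], []
for _ in range(reps):
    sample = [random.gauss(0.0, 1.0) for _ in range(n)]
    means.append(sum(sample) / n)
    medians.append(sorted(sample)[n // 2])  # middle order statistic (n odd)

efficiency = statistics.variance(means) / statistics.variance(medians)
print(efficiency)  # close to 2/pi ≈ 0.64
```

The ratio of variances estimates the efficiency of the median; values near 0.64 show it extracts only about two-thirds of the information the sample carries about the location.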

### Formation matrix

In statistics and information theory, the expected formation matrix of a likelihood function $L(\theta)$ is the matrix inverse of the Fisher information matrix of $L(\theta)$, while the observed formation matrix of $L(\theta)$ is the inverse of the observed information matrix of $L(\theta)$.

### Entropy (information theory)

**entropy, information entropy, Shannon entropy**

Similar to the entropy or mutual information, the Fisher information also possesses a chain rule decomposition.
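The standard statement of that chain rule, for jointly distributed random variables $X$ and $Y$, is:

```latex
% Chain rule for Fisher information, mirroring H(X, Y) = H(X) + H(Y | X):
\mathcal{I}_{X,Y}(\theta) = \mathcal{I}_{X}(\theta) + \mathcal{I}_{Y \mid X}(\theta)
```

Here $\mathcal{I}_{Y \mid X}(\theta)$ is the conditional Fisher information, the expectation over $X$ of the information $Y$ carries about $\theta$ given $X$; the decomposition parallels the chain rule for Shannon entropy.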

### Statistic

**sample statistic, empirical, measure**

More generally, if $T = t(X)$ is a statistic, then $\mathcal{I}_T(\theta) \le \mathcal{I}_X(\theta)$, with equality if and only if $T$ is a sufficient statistic.

The most common is the Fisher information, which is defined on the statistical model induced by the statistic.

### Information theory

**information-theoretic, information theorist, information**

### Mathematical statistics

**mathematical statistician, statistics, mathematical statistics**

In mathematical statistics, the Fisher information (sometimes simply called information) is a way of measuring the amount of information that an observable random variable X carries about an unknown parameter θ of a distribution that models X.

### Information

**informative, input, inputs**

In mathematical statistics, the Fisher information (sometimes simply called information) is a way of measuring the amount of information that an observable random variable X carries about an unknown parameter θ of a distribution that models X.

### Random variable

**random variables, random variation, random**

In mathematical statistics, the Fisher information (sometimes simply called information) is a way of measuring the amount of information that an observable random variable X carries about an unknown parameter θ of a distribution that models X.

### Variance

**sample variance, population variance, variability**

Formally, it is the variance of the score, or the expected value of the observed information.

### Expected value

**expectation, expected, mean**

Formally, it is the variance of the score, or the expected value of the observed information.

### Bayesian statistics

**Bayesian, Bayesian methods, Bayesian analysis**

In Bayesian statistics, the asymptotic distribution of the posterior mode depends on the Fisher information and not on the prior (according to the Bernstein–von Mises theorem, which was anticipated by Laplace for exponential families).

### Asymptotic distribution

**asymptotically normal, asymptotic normality, limiting distribution**

In Bayesian statistics, the asymptotic distribution of the posterior mode depends on the Fisher information and not on the prior (according to the Bernstein–von Mises theorem, which was anticipated by Laplace for exponential families).

### Posterior probability

**posterior distribution, posterior, posterior probability distribution**

In Bayesian statistics, the asymptotic distribution of the posterior mode depends on the Fisher information and not on the prior (according to the Bernstein–von Mises theorem, which was anticipated by Laplace for exponential families).