Wald test

Wald · Wald estimator · Wald statistic · Wald statistics
In statistics, the Wald test (named after Abraham Wald) assesses constraints on statistical parameters based on the weighted distance between the unrestricted estimate and its hypothesized value under the null hypothesis, where the weight is the precision of the estimate.
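For a single parameter this weighted distance takes a simple form; writing θ̂ for the unrestricted estimate and θ₀ for the hypothesized value, one standard way to express the statistic (and its extension to a vector of restrictions r(θ) = 0) is

\[
W \;=\; \frac{(\hat{\theta} - \theta_0)^2}{\widehat{\operatorname{var}}(\hat{\theta})},
\qquad
W \;=\; r(\hat{\theta})^\top \big[\widehat{\operatorname{var}}\big(r(\hat{\theta})\big)\big]^{-1} r(\hat{\theta}).
\]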
Related Articles

Likelihood-ratio test

likelihood ratio test · likelihood ratio · likelihood-ratio
Together with the Lagrange multiplier test and the likelihood-ratio test, the Wald test is one of three classical approaches to hypothesis testing.
The likelihood-ratio test is the oldest of the three classical approaches to hypothesis testing, together with the Lagrange multiplier test and the Wald test.

Abraham Wald

Wald · Wald, Abraham
In statistics, the Wald test (named after Abraham Wald) assesses constraints on statistical parameters based on the weighted distance between the unrestricted estimate and its hypothesized value under the null hypothesis, where the weight is the precision of the estimate.

Score test

Lagrange multiplier test · Lagrange multiplier · Lagrange multiplier (LM) test
Together with the Lagrange multiplier test and the likelihood-ratio test, the Wald test is one of three classical approaches to hypothesis testing.
The main advantage of the score (LM) test over the Wald and likelihood-ratio tests is that it only requires the computation of the restricted estimator.

Structural break

Sup-LR test · Structural break test · Sup-LM test
For cases 1 and 2, the sup-Wald (i.e., the supremum of a set of Wald statistics), sup-LM (i.e., the supremum of a set of Lagrange multiplier statistics), and sup-LR (i.e., the supremum of a set of likelihood ratio statistics) tests developed by Andrews (1993, 2003) may be used to test for parameter instability when the number and location of structural breaks are unknown.
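To make the sup-Wald construction concrete, here is a minimal Python sketch for a single unknown break in a homoskedastic linear regression with 15% trimming; the function name sup_wald and the implementation details are illustrative assumptions, not Andrews' exact procedure, and the resulting supremum must be compared with Andrews' tabulated critical values rather than the usual χ² ones.

import numpy as np

def sup_wald(y, X, trim=0.15):
    # Sup-Wald statistic for one unknown break date in y = X @ beta + e.
    # For each candidate break tau, fit the two sub-samples separately and
    # form the Wald statistic for H0: beta_pre = beta_post (homoskedastic case).
    n, k = X.shape
    cut = max(int(trim * n), k + 1)          # trimming keeps both sub-samples estimable
    stats = []
    for tau in range(cut, n - cut):
        X1, y1, X2, y2 = X[:tau], y[:tau], X[tau:], y[tau:]
        b1 = np.linalg.lstsq(X1, y1, rcond=None)[0]
        b2 = np.linalg.lstsq(X2, y2, rcond=None)[0]
        rss = np.sum((y1 - X1 @ b1) ** 2) + np.sum((y2 - X2 @ b2) ** 2)
        s2 = rss / (n - 2 * k)               # pooled error-variance estimate
        V = s2 * (np.linalg.inv(X1.T @ X1) + np.linalg.inv(X2.T @ X2))
        d = b1 - b2
        stats.append(float(d @ np.linalg.solve(V, d)))
    return max(stats), cut + int(np.argmax(stats))

Each pointwise statistic is asymptotically χ² with k degrees of freedom under the null of no break, but taking the supremum over candidate dates changes the null distribution, which is why Andrews' special critical values are needed.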

Statistics

statistical · statistical analysis · statistician
In statistics, the Wald test (named after Abraham Wald) assesses constraints on statistical parameters based on the weighted distance between the unrestricted estimate and its hypothesized value under the null hypothesis, where the weight is the precision of the estimate.

Constraint (mathematics)

constraint · constraints · constrained
In statistics, the Wald test (named after Abraham Wald) assesses constraints on statistical parameters based on the weighted distance between the unrestricted estimate and its hypothesized value under the null hypothesis, where the weight is the precision of the estimate.

Statistical parameter

parameters · parameter · parametrization
In statistics, the Wald test (named after Abraham Wald) assesses constraints on statistical parameters based on the weighted distance between the unrestricted estimate and its hypothesized value under the null hypothesis, where the weight is the precision of the estimate.

Estimator

estimators · estimate · estimates
In statistics, the Wald test (named after Abraham Wald) assesses constraints on statistical parameters based on the weighted distance between the unrestricted estimate and its hypothesized value under the null hypothesis, where the weight is the precision of the estimate.

Null hypothesis

null · null hypotheses · hypothesis
In statistics, the Wald test (named after Abraham Wald) assesses constraints on statistical parameters based on the weighted distance between the unrestricted estimate and its hypothesized value under the null hypothesis, where the weight is the precision of the estimate.

Precision (statistics)

precision · precision matrix · concentration
In statistics, the Wald test (named after Abraham Wald) assesses constraints on statistical parameters based on the weighted distance between the unrestricted estimate and its hypothesized value under the null hypothesis, where the weight is the precision of the estimate.

Sampling distribution

finite sample distribution · distribution · sampling
While the finite-sample distribution of the Wald test statistic is generally unknown, it has an asymptotic χ²-distribution under the null hypothesis, a fact that can be used to determine statistical significance.

Chi-squared distribution

chi-squared · chi-square distribution · chi square distribution
While the finite-sample distribution of the Wald test statistic is generally unknown, it has an asymptotic χ²-distribution under the null hypothesis, a fact that can be used to determine statistical significance.

Statistical significance

statistically significant · significant · significance level
While the finite-sample distribution of the Wald test statistic is generally unknown, it has an asymptotic χ²-distribution under the null hypothesis, a fact that can be used to determine statistical significance.
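In practice, the asymptotic χ² result is what converts a computed Wald statistic into a p-value; a minimal Python sketch, where W = 7.3 and Q = 2 are made-up numbers standing in for the computed statistic and the number of restrictions tested:

from scipy.stats import chi2

W, Q = 7.3, 2                 # hypothetical Wald statistic and number of restrictions
p_value = chi2.sf(W, df=Q)    # upper-tail probability of the chi-squared distribution
print(p_value)                # reject the null at the 5% level if p_value < 0.05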

Statistical hypothesis testing

hypothesis testing · statistical test · statistical tests
Together with the Lagrange multiplier test and the likelihood-ratio test, the Wald test is one of three classical approaches to hypothesis testing.

Computational complexity

complexity · computational burden · amount of computation
An advantage of the Wald test over the other two is that it only requires the estimation of the unrestricted model, which lowers the computational burden as compared to the likelihood-ratio test.

Expression (mathematics)

expression · mathematical expression · expressions
However, a major disadvantage is that (in finite samples) it is not invariant to changes in the representation of the null hypothesis; in other words, algebraically equivalent expressions of a non-linear parameter restriction can lead to different values of the test statistic.

Taylor series

Taylor expansion · Maclaurin series · Taylor polynomial
That is because the Wald statistic is derived from a Taylor expansion, and different ways of writing equivalent nonlinear expressions lead to nontrivial differences in the corresponding Taylor coefficients.
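A standard illustration: the hypothesis θ = 1 can be written either as r(θ) = θ − 1 = 0 or as r(θ) = 1 − 1/θ = 0. Applying the delta method to the second form, the variance of r(θ̂) is approximately var(θ̂)/θ̂⁴, so the two algebraically equivalent versions of the same restriction yield

\[
W_1 = \frac{(\hat{\theta} - 1)^2}{\widehat{\operatorname{var}}(\hat{\theta})},
\qquad
W_2 = \frac{(1 - 1/\hat{\theta})^2}{\widehat{\operatorname{var}}(\hat{\theta})/\hat{\theta}^{4}}
    = \hat{\theta}^{2}\, W_1,
\]

so the two statistics agree only when θ̂ happens to equal 1, even though the null hypotheses are identical.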

Binomial regression

binary response model · binomial models
Another aberration, known as the Hauck–Donner effect, can occur in binomial models when the estimated (unconstrained) parameter is close to the boundary of the parameter space (for instance, a fitted probability extremely close to zero or one), which results in the Wald statistic no longer increasing monotonically in the distance between the unconstrained and the constrained parameter.
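A small numerical sketch of the effect, assuming two groups of 50 Bernoulli trials with made-up counts (the first group fixed at 25 successes): as the second group's success count approaches 50, the estimated log odds ratio keeps growing, but its Wald z-statistic rises and then falls.

import numpy as np

n, k0 = 50, 25
for k1 in range(30, 50):
    a, b = k0, n - k0                      # successes / failures in group 0
    c, d = k1, n - k1                      # successes / failures in group 1
    beta_hat = np.log((c / d) / (a / b))   # MLE of the log odds ratio
    se = np.sqrt(1/a + 1/b + 1/c + 1/d)    # Wald standard error of the log odds ratio
    print(k1, round(beta_hat, 2), round(beta_hat / se, 2))

The printed z-values peak and then decline even though beta_hat is strictly increasing, which is exactly the loss of monotonicity described above.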

Boundary (topology)

boundary · boundaries · boundary point
Another aberration, known as the Hauck–Donner effect, can occur in binomial models when the estimated (unconstrained) parameter is close to the boundary of the parameter space (for instance, a fitted probability extremely close to zero or one), which results in the Wald statistic no longer increasing monotonically in the distance between the unconstrained and the constrained parameter.

Parameter space

weight space
Another aberration, known as the Hauck–Donner effect, can occur in binomial models when the estimated (unconstrained) parameter is close to the boundary of the parameter space (for instance, a fitted probability extremely close to zero or one), which results in the Wald statistic no longer increasing monotonically in the distance between the unconstrained and the constrained parameter.

Monotonic function

monotonicity · monotone · monotonic
Another aberration, known as the Hauck–Donner effect, can occur in binomial models when the estimated (unconstrained) parameter is close to the boundary of the parameter space (for instance, a fitted probability extremely close to zero or one), which results in the Wald statistic no longer increasing monotonically in the distance between the unconstrained and the constrained parameter.

Maximum likelihood estimation

maximum likelihood · maximum likelihood estimator · maximum likelihood estimate
Under the Wald test, the estimate θ̂ that was found as the maximizing argument of the unconstrained likelihood function is compared with a hypothesized value θ₀.

Likelihood function

likelihood · likelihood ratio · log-likelihood
Under the Wald test, the estimate θ̂ that was found as the maximizing argument of the unconstrained likelihood function is compared with a hypothesized value θ₀.

T-statistic

Student's t-statistic · t-statistic
The square root of the single-restriction Wald statistic can be understood as a (pseudo) t-ratio that is, however, not actually t-distributed except in the special case of linear regression with normally distributed errors.
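Written out, the single-restriction statistic is just the square of the familiar ratio,

\[
W = \left( \frac{\hat{\theta} - \theta_0}{\widehat{\operatorname{se}}(\hat{\theta})} \right)^{2},
\]

and this ratio is asymptotically standard normal under the null hypothesis in general, with an exact t-distribution arising only in the normal linear regression case.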