Lack-of-fit sum of squares

error sum of squares, sum of squared errors
In statistics, a sum of squares due to lack of fit, or more tersely a lack-of-fit sum of squares, is one of the components of a partition of the sum of squares of residuals in an analysis of variance, used in the numerator in an F-test of the null hypothesis that says that a proposed model fits well.
Related Articles

Partition of sums of squares

sum of squares, sums of squares, Sum of squares (statistics)
Note that the residual sum of squares can be further partitioned as the lack-of-fit sum of squares plus the sum of squares due to pure error.
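This partition can be checked numerically. Below is a minimal sketch with made-up data and an ordinary least-squares line fit: the residual sum of squares splits exactly into a pure-error part, computed from the replicate groups, and a lack-of-fit remainder.

```python
# Minimal sketch (made-up data): partition the residual sum of squares
# from a least-squares line fit into pure-error and lack-of-fit parts.
x = [1, 1, 2, 2, 3, 3]                 # replicated predictor values
y = [1.1, 0.9, 2.2, 1.8, 2.9, 3.4]

n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
# Ordinary least-squares slope and intercept for y = a + b*x.
b = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
     / sum((xi - xbar) ** 2 for xi in x))
a = ybar - b * xbar

# Residual sum of squares about the fitted line.
sse = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))

# Pure-error SS: squared deviations of each y from its replicate-group mean.
groups = {}
for xi, yi in zip(x, y):
    groups.setdefault(xi, []).append(yi)
sspe = sum((yi - sum(ys) / len(ys)) ** 2
           for ys in groups.values() for yi in ys)

# Lack-of-fit SS is the remainder of the partition: SSE = SSPE + SSLF.
sslf = sse - sspe
print(sse, sspe, sslf)
```

In this made-up example the lack-of-fit component is small relative to the pure-error component, consistent with a straight line that tracks the replicate-group means closely.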

F-test

F-test, F test, F-statistic

Analysis of variance

ANOVA, analysis of variance (ANOVA), corrected the means

Residual sum of squares

sum of squared residuals, sum of squares of residuals, residual sum-of-squares
In order for the lack-of-fit sum of squares to differ from the sum of squares of residuals, there must be more than one value of the response variable for at least one of the values of the set of predictor variables.

Errors and residuals

residuals, error term, residual
The differences between the observed values and the values predicted by the fitted model are the residuals, which are observable estimates of the unobservable error terms ε_ij. Suppose the error terms ε_ij are independent and normally distributed with expected value 0 and variance σ².

Goodness of fit

goodness-of-fit, fit, goodness-of-fit test
The assumptions of normal distribution of errors and independence can be shown to entail that this lack-of-fit test is the likelihood-ratio test of this null hypothesis.
In the analysis of variance, one of the components into which the variance is partitioned may be a lack-of-fit sum of squares.

Statistics

statistical, statistical analysis, statistician

Fraction (mathematics)

denominator, fractions, fraction

Null hypothesis

null, null hypotheses, hypothesis

Replication (statistics)

replication, replicate, replicates

Dependent and independent variables

dependent variable, independent variable, explanatory variable
The pure-error sum of squares is the sum of squared deviations of each value of the dependent variable from the average value over all observations sharing its independent variable value(s).
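The role of replication can be seen directly. The sketch below uses a hypothetical helper `pure_error_ss` (not from any library): when no predictor value is repeated, every "group" holds a single observation, so the pure-error sum of squares is identically zero and the lack-of-fit sum of squares would coincide with the residual sum of squares.

```python
# Hypothetical helper: pure-error SS as squared deviations of each y
# from the mean of the observations sharing its x value.
def pure_error_ss(x, y):
    groups = {}
    for xi, yi in zip(x, y):
        groups.setdefault(xi, []).append(yi)
    return sum((yi - sum(ys) / len(ys)) ** 2
               for ys in groups.values() for yi in ys)

# No replicated x values: every group mean equals its single observation.
no_reps = pure_error_ss([1, 2, 3, 4], [1.0, 2.1, 2.9, 4.2])
# Replicated x values: within-group scatter gives a positive pure-error SS.
with_reps = pure_error_ss([1, 1, 2, 2], [1.0, 1.2, 2.0, 2.4])
print(no_reps, with_reps)
```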

Least squares

least-squares, method of least squares, least squares method
The parameters of the proposed model are estimated by the method of least squares.

Degrees of freedom (statistics)

degrees of freedom, degree of freedom, Effective degrees of freedom
The residual vector is thus constrained to lie in an (N − 2)-dimensional subspace of R^N, i.e. there are N − 2 "degrees of freedom for error".
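One way to see the two lost degrees of freedom: least-squares residuals from a straight-line fit are orthogonal to both the constant vector and the predictor, two linear constraints that confine them to an (N − 2)-dimensional subspace. A small sketch with made-up data:

```python
# Made-up data; fit y = a + b*x by ordinary least squares.
x = [0.0, 1.0, 2.0, 3.0]
y = [0.1, 0.9, 2.2, 2.8]

n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
b = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
     / sum((xi - xbar) ** 2 for xi in x))
a = ybar - b * xbar
r = [yi - (a + b * xi) for xi, yi in zip(x, y)]

# Two linear constraints on the residual vector:
s0 = sum(r)                                   # orthogonal to the constant
s1 = sum(xi * ri for xi, ri in zip(x, r))     # orthogonal to the predictor
print(s0, s1)
```

Both sums vanish (up to floating-point rounding), so the four residuals carry only 4 − 2 = 2 free dimensions.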

Normal distribution

normally distributed, Gaussian distribution, normal

Expected value

expectation, expected, mean

Variance

sample variance, population variance, variability

Chi-squared distribution

chi-squared, chi-square distribution, chi square distribution
The sum of squares of residuals, divided by σ², has a chi-squared distribution with N − 2 degrees of freedom.

Noncentral chi-squared distribution

noncentral chi-squared, non-central chi-squared distribution, non-central Chi-squared
But the numerator then has a noncentral chi-squared distribution, and consequently the quotient as a whole has a non-central F-distribution.

Noncentral F-distribution

noncentral F-distribution, non-central F-distribution, noncentral F-distributed

Stochastic ordering

stochastically larger, stochastic order, stochastically dominates
Since the non-central F-distribution is stochastically larger than the (central) F-distribution, one rejects the null hypothesis if the F-statistic is larger than the critical F value.

Cumulative distribution function

distribution function, CDF, cumulative probability distribution function
The critical value corresponds to the cumulative distribution function of the F-distribution with x equal to the desired confidence level, and degrees of freedom d1 = n − p and d2 = N − n.

F-distribution

F distribution, F-distribution
The resulting test statistic has an F-distribution with the corresponding number of degrees of freedom in the numerator and the denominator, provided that the model is correct.
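As a sketch, the statistic and its two degrees-of-freedom parameters can be formed from the sums of squares and the counts N (observations), n (distinct predictor values), and p (model parameters). The numbers below are illustrative and made up, not taken from the text:

```python
# Illustrative (made-up) sums of squares for a lack-of-fit F test.
sslf, sspe = 0.0075, 0.225   # lack-of-fit SS and pure-error SS
N, n, p = 6, 3, 2            # observations, distinct x values, line parameters

d1, d2 = n - p, N - n        # numerator and denominator degrees of freedom
f_stat = (sslf / d1) / (sspe / d2)   # ratio of the two mean squares
print(d1, d2, f_stat)
```

The null hypothesis is rejected when `f_stat` exceeds the critical value of the F-distribution with d1 and d2 degrees of freedom at the chosen confidence level (looked up from a table or a statistics library).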

Confidence interval

confidence intervals, confidence level, confidence