Least squares

The method of least squares is a standard approach in regression analysis to approximate the solution of overdetermined systems (sets of equations in which there are more equations than unknowns) by minimizing the sum of the squared residuals, the differences between the observed values and those the model predicts in each equation.
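A minimal sketch of the idea (assuming NumPy, with made-up data): the classic four-point line fit has more equations than unknowns, and np.linalg.lstsq returns the solution that minimizes the squared residuals.

    import numpy as np

    # Four equations, two unknowns: fit y = x0 + x1*t through four points.
    A = np.array([[1.0, 1.0],
                  [1.0, 2.0],
                  [1.0, 3.0],
                  [1.0, 4.0]])
    b = np.array([6.0, 5.0, 7.0, 10.0])

    # The least-squares solution minimizes ||A x - b||^2; here x is (3.5, 1.4).
    x, rss, rank, sv = np.linalg.lstsq(A, b, rcond=None)
    print(x, rss)   # fitted unknowns and the sum of squared residuals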
Related Articles

Non-linear least squares

Least-squares problems fall into two categories: linear or ordinary least squares and nonlinear least squares, depending on whether or not the residuals are linear in all unknowns.
Non-linear least squares is the form of least squares analysis used to fit a set of m observations with a model that is non-linear in n unknown parameters (m ≥ n).
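A brief sketch of such a fit, assuming SciPy is available and using a hypothetical exponential model (m = 20 observations, n = 2 parameters) purely for illustration:

    import numpy as np
    from scipy.optimize import least_squares

    # Model non-linear in its parameters: y = p0 * exp(p1 * t).
    t = np.linspace(0.0, 1.0, 20)
    rng = np.random.default_rng(0)
    y = 2.0 * np.exp(1.3 * t) + 0.05 * rng.normal(size=t.size)

    def residuals(p):
        return p[0] * np.exp(p[1] * t) - y   # one residual per observation

    fit = least_squares(residuals, x0=[1.0, 1.0])
    print(fit.x)   # estimates close to (2.0, 1.3)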

Regression analysis

The earliest form of regression was the method of least squares, which was published by Legendre in 1805, and by Gauss in 1809.

Ordinary least squares

OLS chooses the parameters of a linear function of a set of explanatory variables by the principle of least squares: minimizing the sum of the squares of the differences between the observed values of the dependent variable in the given dataset and those predicted by the linear function.
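A minimal OLS sketch in NumPy on synthetic data, showing the design matrix, the fitted parameters, and the residual sum of squares that OLS minimizes:

    import numpy as np

    rng = np.random.default_rng(1)
    x1 = rng.normal(size=50)
    x2 = rng.normal(size=50)
    y = 1.0 + 2.0 * x1 - 0.5 * x2 + 0.1 * rng.normal(size=50)

    # Design matrix with an intercept column.
    X = np.column_stack([np.ones_like(x1), x1, x2])

    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    rss = np.sum((y - X @ beta) ** 2)   # the quantity OLS minimizes
    print(beta, rss)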

Polynomial least squares

Polynomial least squares describes the variance in a prediction of the dependent variable as a function of the independent variable and the deviations from the fitted curve.
These methods include polynomial regression, curve fitting, linear regression, least squares, ordinary least squares, simple linear regression, linear least squares, approximation theory and method of moments.
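As a sketch, a polynomial fit of this kind is still linear least squares in the coefficients; NumPy's polyfit solves it directly (made-up data):

    import numpy as np

    x = np.linspace(-1.0, 1.0, 30)
    rng = np.random.default_rng(2)
    y = 0.5 - x + 2.0 * x**2 + 0.1 * rng.normal(size=x.size)

    # Degree-2 fit: least squares in the coefficients, highest power first.
    coeffs = np.polyfit(x, y, deg=2)
    y_hat = np.polyval(coeffs, x)   # deviations from the fitted curve are y - y_hat
    print(coeffs)                   # close to (2.0, -1.0, 0.5)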

Curve fitting

The most important application is in data fitting.
The least squares method is one way to weigh those deviations: the curve of best fit is the one that minimizes the sum of their squares.
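A short curve-fitting sketch, assuming SciPy and a hypothetical sinusoidal model chosen only for illustration; curve_fit finds the parameters of best fit in the least-squares sense:

    import numpy as np
    from scipy.optimize import curve_fit

    def model(x, a, b):
        return a * np.sin(b * x)   # hypothetical model to be fitted

    x = np.linspace(0.0, 2.0 * np.pi, 40)
    rng = np.random.default_rng(3)
    y = 1.5 * np.sin(0.8 * x) + 0.1 * rng.normal(size=x.size)

    popt, pcov = curve_fit(model, x, y, p0=[1.0, 1.0])
    print(popt)   # least-squares estimates of a and b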

Generalized linear model

Also, by iteratively applying local quadratic approximation to the likelihood (through the Fisher information), the least-squares method may be used to fit a generalized linear model.
Other approaches, including Bayesian approaches and least squares fits to variance stabilized responses, have been developed.
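A sketch of that idea for logistic regression: each iteratively reweighted least-squares step solves normal equations whose coefficient matrix is the Fisher information X^T W X. The helper irls_logistic is hypothetical, written for illustration on synthetic data.

    import numpy as np

    def irls_logistic(X, y, n_iter=25):
        """Fit a logistic GLM by iteratively reweighted least squares."""
        beta = np.zeros(X.shape[1])
        for _ in range(n_iter):
            p = 1.0 / (1.0 + np.exp(-X @ beta))   # mean via the logit link
            W = p * (1.0 - p)                     # variance weights
            z = X @ beta + (y - p) / W            # working response
            # Normal equations with coefficient matrix X^T W X.
            beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
        return beta

    rng = np.random.default_rng(4)
    X = np.column_stack([np.ones(200), rng.normal(size=200)])
    p_true = 1.0 / (1.0 + np.exp(-(0.5 + 1.2 * X[:, 1])))
    y = (rng.random(200) < p_true).astype(float)
    print(irls_logistic(X, y))   # estimates close to (0.5, 1.2)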

Normal distribution

However, to Gauss's credit, he went beyond Legendre and succeeded in connecting the method of least squares with the principles of probability and with the normal distribution.
Moreover, many results and methods (such as propagation of uncertainty and least squares parameter fitting) can be derived analytically in explicit form when the relevant variables are normally distributed.
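The connection can be made explicit. Assuming a model with independent N(0, \sigma^2) errors, the negative log-likelihood is

    -\log L(\beta) = \frac{n}{2}\log(2\pi\sigma^2) + \frac{1}{2\sigma^2}\sum_{i=1}^{n}\bigl(y_i - f(x_i, \beta)\bigr)^2,

so the \beta that maximizes the likelihood is exactly the \beta that minimizes the sum of squared residuals.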

Adrien-Marie Legendre

The least-squares method is usually credited to Carl Friedrich Gauss (1795), but it was first published by Adrien-Marie Legendre (1805).
He developed the method of least squares, which has broad application in linear regression, signal processing, statistics, and curve fitting, and he communicated it to his contemporaries before Gauss published; it appeared in 1805 as an appendix to his book on the paths of comets.

Linear least squares

See linear least squares for a fully worked out example of this model.
Linear least squares (LLS) is the least squares approximation of linear functions to data.
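A minimal sketch of the normal equations X^T X \beta = X^T y in NumPy (synthetic data). In practice, QR- or SVD-based routines such as np.linalg.lstsq are preferred numerically, since forming X^T X squares the condition number.

    import numpy as np

    rng = np.random.default_rng(5)
    X = rng.normal(size=(30, 3))
    y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=30)

    # Solve the normal equations X^T X beta = X^T y directly.
    beta = np.linalg.solve(X.T @ X, X.T @ y)
    print(beta)   # close to (1.0, -2.0, 0.5)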

Robert Adrain

The idea of least-squares analysis was also independently formulated by the American Robert Adrain in 1808.
He is chiefly remembered for his formulation of the method of least squares.

Total least squares

This regression formulation considers only observational errors in the dependent variable (but the alternative total least squares regression can account for errors in both variables).
In applied statistics, total least squares is a type of errors-in-variables regression, a least squares data modeling technique in which observational errors on both dependent and independent variables are taken into account.
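A sketch of the classical SVD construction for total least squares; the helper tls is hypothetical, written for illustration on synthetic data with noise in both the regressors and the response.

    import numpy as np

    def tls(X, y):
        """Total least squares for y ~ X b: the solution comes from the right
        singular vector of [X y] with the smallest singular value."""
        n = X.shape[1]
        Vt = np.linalg.svd(np.column_stack([X, y]))[2]
        v = Vt[-1]              # singular vector for the smallest singular value
        return -v[:n] / v[n]

    rng = np.random.default_rng(6)
    X_true = rng.normal(size=(100, 2))
    y_true = X_true @ np.array([1.0, 2.0])
    X_noisy = X_true + 0.05 * rng.normal(size=X_true.shape)
    y_noisy = y_true + 0.05 * rng.normal(size=100)
    print(tls(X_noisy, y_noisy))   # close to (1.0, 2.0)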

Carl Friedrich Gauss

It introduced the Gaussian gravitational constant, and contained an influential treatment of the method of least squares, a procedure used in all sciences to this day to minimize the impact of measurement error.

Probability

Adrien-Marie Legendre (1805) developed the method of least squares, and introduced it in his Nouvelles méthodes pour la détermination des orbites des comètes (New Methods for Determining the Orbits of Comets).

Fisher information

In this case the Fisher information matrix may be identified with the coefficient matrix of the normal equations of least squares estimation theory.

Overdetermined system

An overdetermined system has more equations than unknowns and in general no exact solution; the method of least squares yields the approximate solution that minimizes the sum of the squared residuals across all of the equations.

Pierre-Simon Laplace

The fourth chapter of this treatise includes an exposition of the method of least squares, a remarkable testimony to Laplace's command over the processes of analysis.

Roger Cotes

Cotes discovered an important theorem on the n-th roots of unity, foresaw the method of least squares, and discovered a method for integrating rational fractions with binomial denominators.

Gauss's method

An early demonstration of the strength of Gauss's method came when it was used to predict the future location of the newly discovered asteroid Ceres.
Several techniques are available for such problems, but Gauss's own approach, the method of least squares, remains in widespread use today.

Tikhonov regularization

Tikhonov regularization (or ridge regression) adds a constraint that \|\beta\|_2^2, the squared L^2-norm of the parameter vector, is not greater than a given value. The approach can be conceptualized as a constrained least squares problem: minimize \|y - X\beta\|^2 subject to \|\beta\|^2 \le t for a given bound t.
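In the equivalent penalized (Lagrangian) form, minimizing \|y - X\beta\|^2 + \lambda\|\beta\|^2 has the closed-form solution \beta = (X^T X + \lambda I)^{-1} X^T y. A minimal NumPy sketch on synthetic data (the helper ridge is hypothetical):

    import numpy as np

    def ridge(X, y, lam):
        """Closed-form Tikhonov/ridge solution of min ||y - X b||^2 + lam ||b||^2."""
        return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

    rng = np.random.default_rng(7)
    X = rng.normal(size=(40, 5))
    y = X @ np.array([1.0, 0.0, -1.0, 0.0, 0.5]) + 0.1 * rng.normal(size=40)
    print(ridge(X, y, lam=0.1))   # coefficients shrunk slightly toward zero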

Least squares adjustment

Least squares adjustment is a model for the solution of an overdetermined system of equations based on the principle of least squares of observation residuals.

Degrees of freedom (statistics)

The denominator, n − m, is the statistical degrees of freedom; see effective degrees of freedom for generalizations.
An example which is only slightly less simple is that of least squares estimation of a and b in the model y = a + bx + \varepsilon.
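A short NumPy sketch of that example: fitting a and b consumes two degrees of freedom, so the residual variance is estimated with the denominator n - 2.

    import numpy as np

    rng = np.random.default_rng(8)
    x = np.linspace(0.0, 10.0, 25)
    y = 2.0 + 0.7 * x + 0.3 * rng.normal(size=x.size)

    X = np.column_stack([np.ones_like(x), x])    # m = 2 parameters: a and b
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta

    n, m = len(y), X.shape[1]
    sigma2_hat = np.sum(resid**2) / (n - m)      # divide by n - m degrees of freedom
    print(beta, sigma2_hat)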

Compressed sensing

For this reason, the Lasso and its variants are fundamental to the field of compressed sensing.
In statistics, the least squares method was complemented by the L^1-norm, which was introduced by Laplace.
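A sketch of an L^1-penalized least squares fit via iterative soft-thresholding (ISTA), a simple proximal-gradient scheme; the helper ista_lasso is hypothetical, written for illustration on synthetic sparse data:

    import numpy as np

    def ista_lasso(X, y, lam, n_iter=500):
        """Iterative soft-thresholding for min 0.5 ||y - X b||^2 + lam ||b||_1."""
        L = np.linalg.norm(X, 2) ** 2      # Lipschitz constant of the gradient
        b = np.zeros(X.shape[1])
        for _ in range(n_iter):
            z = b - X.T @ (X @ b - y) / L  # gradient step on the smooth part
            b = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold
        return b

    rng = np.random.default_rng(9)
    X = rng.normal(size=(50, 20))
    b_true = np.zeros(20)
    b_true[[2, 7]] = [1.5, -2.0]           # sparse ground truth
    y = X @ b_true + 0.05 * rng.normal(size=50)
    print(np.round(ista_lasso(X, y, lam=0.1), 2))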

Least absolute deviations

Similar to the least squares technique, it attempts to find a function which closely approximates a set of data.
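One simple way to approximate such a fit reuses least squares itself: iteratively reweighted least squares with weights 1/|r_i| turns the squared loss into an absolute one. The helper lad_irls below is a hypothetical sketch on synthetic heavy-tailed data.

    import numpy as np

    def lad_irls(X, y, n_iter=50, eps=1e-8):
        """Approximate least absolute deviations via reweighted least squares."""
        b = np.linalg.lstsq(X, y, rcond=None)[0]     # start from the L^2 fit
        for _ in range(n_iter):
            w = 1.0 / np.maximum(np.abs(y - X @ b), eps)
            b = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
        return b

    rng = np.random.default_rng(10)
    X = np.column_stack([np.ones(60), rng.normal(size=60)])
    y = X @ np.array([1.0, 2.0]) + 0.1 * rng.standard_cauchy(60)   # heavy tails
    print(lad_irls(X, y))   # close to (1.0, 2.0) despite outliers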

Regularization (mathematics)

In some contexts a regularized version of the least squares solution may be preferable.
The learning problem with the least squares loss function and Tikhonov regularization can be solved analytically.