# Non-linear least squares

**Also known as:** nonlinear least squares (NLLS), non-linear least-squares estimation, nonlinear least-squares fitting, numerical methods for non-linear least squares.

Non-linear least squares is the form of least squares analysis used to fit a set of m observations with a model that is non-linear in n unknown parameters (m ≥ n).
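As a concrete illustration, the sketch below fits a model non-linear in its parameters to synthetic data using SciPy's general-purpose `least_squares` solver. The exponential model, the data, and the starting point are illustrative choices, not taken from the article.

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic data: m = 50 observations of y = a * exp(-b * x), a model that
# is non-linear in its n = 2 unknown parameters (a, b), plus small noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 4.0, 50)
y = 2.5 * np.exp(-1.3 * x) + 0.02 * rng.standard_normal(x.size)

def residuals(beta):
    """Residuals r_i = y_i - f(x_i, beta) whose sum of squares is minimized."""
    a, b = beta
    return y - a * np.exp(-b * x)

fit = least_squares(residuals, x0=[1.0, 1.0])
print(fit.x)  # estimates close to the true parameters (2.5, 1.3)
```

Unlike linear least squares, the solver needs a starting guess (`x0`) and iterates from there.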

## 75 Related Articles

### Least squares


Non-linear least squares is the form of least squares analysis used to fit a set of m observations with a model that is non-linear in n unknown parameters (m ≥ n). There are many similarities to linear least squares, but also some significant differences.

Least-squares problems fall into two categories: linear or ordinary least squares and nonlinear least squares, depending on whether or not the residuals are linear in all unknowns.

### Gauss–Newton algorithm


These equations form the basis for the Gauss–Newton algorithm for a non-linear least squares problem.

The Gauss–Newton algorithm is used to solve non-linear least squares problems.
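A bare-bones Gauss–Newton iteration can be sketched as follows: at each step the linearized normal equations (JᵀJ)Δβ = Jᵀr are solved for the parameter update. The model, data, starting point, and tolerance here are illustrative.

```python
import numpy as np

# Illustrative model y = a * exp(-b * x), non-linear in (a, b).
def model(beta, x):
    a, b = beta
    return a * np.exp(-b * x)

def jacobian(beta, x):
    """Analytic Jacobian J_ij = d f(x_i, beta) / d beta_j."""
    a, b = beta
    e = np.exp(-b * x)
    return np.column_stack([e, -a * x * e])

def gauss_newton(x, y, beta, iters=50, tol=1e-12):
    for _ in range(iters):
        r = y - model(beta, x)            # residuals at the current estimate
        J = jacobian(beta, x)             # linearized design matrix
        # Normal equations of the linearized problem: (J^T J) delta = J^T r.
        delta = np.linalg.solve(J.T @ J, J.T @ r)
        beta = beta + delta
        if np.linalg.norm(delta) < tol:   # update negligible: converged
            break
    return beta

x = np.linspace(0.0, 4.0, 50)
y = model(np.array([2.5, 1.3]), x)        # noise-free synthetic data
beta = gauss_newton(x, y, np.array([2.0, 1.0]))
```

Because the method repeatedly re-linearizes the model, a reasonable starting estimate is needed; far from the solution the iteration can diverge, which motivates the damped variants discussed below.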

### Linear least squares


There are many similarities to linear least squares, but also some significant differences.

In contrast, non-linear least squares problems generally must be solved by an iterative procedure, and the problems can be non-convex with multiple optima for the objective function.

### Nonlinear regression


It is used in some forms of nonlinear regression.

For details concerning nonlinear data modeling see least squares and non-linear least squares.

### Levenberg–Marquardt algorithm


This can be achieved by using the Marquardt parameter.

In mathematics and computing, the Levenberg–Marquardt algorithm (LMA or just LM), also known as the damped least-squares (DLS) method, is used to solve non-linear least squares problems.
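The Marquardt parameter λ damps the Gauss–Newton normal equations, interpolating between a Gauss–Newton step (λ → 0) and a short gradient-descent-like step (large λ). The sketch below uses the common diagonal scaling (JᵀJ + λ diag(JᵀJ))Δβ = Jᵀr with a simple accept/reject rule for adapting λ; the update schedule and the test problem are illustrative, not a reference implementation.

```python
import numpy as np

def lm_step(J, r, lam):
    """Solve the damped normal equations
    (J^T J + lam * diag(J^T J)) delta = J^T r."""
    A = J.T @ J
    return np.linalg.solve(A + lam * np.diag(np.diag(A)), J.T @ r)

def levenberg_marquardt(resid, jac, beta, lam=1e-3, iters=100):
    cost = np.sum(resid(beta) ** 2)
    for _ in range(iters):
        delta = lm_step(jac(beta), resid(beta), lam)
        trial = beta + delta
        trial_cost = np.sum(resid(trial) ** 2)
        if trial_cost < cost:   # S decreased: accept, trust the linearization more
            beta, cost, lam = trial, trial_cost, lam * 0.5
        else:                   # S increased: reject the step, damp harder
            lam *= 2.0
    return beta

# Illustrative use on y = a * exp(-b * x) with true parameters (2.5, 1.3).
x = np.linspace(0.0, 4.0, 50)
y = 2.5 * np.exp(-1.3 * x)
resid = lambda b: y - b[0] * np.exp(-b[1] * x)
jac = lambda b: np.column_stack([np.exp(-b[1] * x),
                                 -b[0] * x * np.exp(-b[1] * x)])
beta = levenberg_marquardt(resid, jac, np.array([1.0, 1.0]))
```

Increasing λ after a failed step shrinks and rotates the update toward steepest descent, which is what makes the method robust far from the minimum.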

### Jacobian matrix and determinant


The Jacobian, J, is a function of constants, the independent variable, and the parameters, so it changes from one iteration to the next.

The Jacobian serves as a linearized design matrix in statistical regression and curve fitting; see non-linear least squares.
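When analytic derivatives are inconvenient, the Jacobian is often approximated by finite differences and recomputed at each iteration. A minimal forward-difference sketch (the helper name and step size are illustrative):

```python
import numpy as np

def numerical_jacobian(f, beta, step=1e-7):
    """Forward-difference approximation of J_ij = d f_i / d beta_j,
    recomputed at every iteration because J depends on the parameters."""
    f0 = f(beta)
    J = np.empty((f0.size, beta.size))
    for j in range(beta.size):
        b = beta.copy()
        b[j] += step               # perturb one parameter at a time
        J[:, j] = (f(b) - f0) / step
    return J

# Check against the analytic Jacobian of f = a * exp(-b * x).
x = np.linspace(0.0, 4.0, 20)
f = lambda beta: beta[0] * np.exp(-beta[1] * x)   # model values, not residuals
J = numerical_jacobian(f, np.array([2.5, 1.3]))
```

Each column costs one extra evaluation of the model over all data points, so n + 1 model evaluations per iteration in total.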

### Grey box model


Once a selection of non-zero values is made, the remaining coefficients in A can be determined by minimizing m(f,p,Ac) over the data with respect to the nonzero values in A, typically by non-linear least squares.

### Nonlinear programming


### Errors and residuals


The sum of squared residuals, S = Σᵢ rᵢ², is minimized, where the residuals (in-sample prediction errors) rᵢ are given by rᵢ = yᵢ − f(xᵢ, β).

### Maxima and minima


The minimum value of S occurs when the gradient is zero.

### Gradient


The minimum value of S occurs when the gradient is zero.
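With J the Jacobian of the model and r = y − f the residuals, the gradient of S = Σᵢ rᵢ² is ∇S = −2 Jᵀ r, so the stationarity condition at the minimum is Jᵀ r = 0. A quick numerical check of this condition (the model, data, and starting point are illustrative, using SciPy's general-purpose solver):

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(3)
x = np.linspace(0.0, 4.0, 50)
y = 2.5 * np.exp(-1.3 * x) + 0.02 * rng.standard_normal(x.size)

resid = lambda beta: y - beta[0] * np.exp(-beta[1] * x)
fit = least_squares(resid, x0=[1.0, 1.0])

# With J_ij = d f(x_i, beta)/d beta_j and r = y - f, the gradient of
# S = sum_i r_i^2 is grad S = -2 J^T r, which must vanish at the minimum.
a, b = fit.x
e = np.exp(-b * x)
J = np.column_stack([e, -a * x * e])
grad = -2.0 * J.T @ resid(fit.x)
print(np.linalg.norm(grad))  # near zero at the fitted minimum
```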

### Taylor series


At each iteration the model is linearized by approximation to a first-order Taylor polynomial expansion about the current parameter estimate.
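Written out, with β^k denoting the current parameter estimate, the first-order expansion is:

```latex
f(x_i, \boldsymbol\beta) \approx f(x_i, \boldsymbol\beta^k)
  + \sum_{j=1}^{n} J_{ij}\,\Delta\beta_j,
\qquad
J_{ij} = \left.\frac{\partial f(x_i, \boldsymbol\beta)}{\partial \beta_j}\right|_{\boldsymbol\beta^k},
\qquad
\Delta\beta_j = \beta_j - \beta_j^k .
```

Substituting this into the residuals makes S quadratic in Δβ, which is what reduces each iteration to a linear least-squares problem.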

### Diagonal matrix


Each element of the diagonal weight matrix W should, ideally, be equal to the reciprocal of the error variance of the measurement.

### Variance


Each element of the diagonal weight matrix W should, ideally, be equal to the reciprocal of the error variance of the measurement.
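In practice, minimizing the weighted sum rᵀWr with W_ii = 1/σᵢ² is equivalent to ordinary least squares on the scaled residuals rᵢ/σᵢ, so a weighted fit can be expressed by dividing each residual by its measurement error. An illustrative sketch (model, data, and error model are assumptions for the example):

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)
x = np.linspace(0.0, 4.0, 50)
sigma = 0.01 + 0.05 * x                       # known, non-constant error per point
y = 2.5 * np.exp(-1.3 * x) + sigma * rng.standard_normal(x.size)

# W is diagonal with W_ii = 1 / sigma_i^2; minimizing r^T W r is the same
# as unweighted least squares on the scaled residuals r_i / sigma_i.
def weighted_residuals(beta):
    return (y - beta[0] * np.exp(-beta[1] * x)) / sigma

fit = least_squares(weighted_residuals, x0=[1.0, 1.0])
```

Points with small σᵢ thus pull harder on the fit, exactly as the reciprocal-variance weighting prescribes.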

### Mathematical optimization


In linear least squares the objective function, S, is a quadratic function of the parameters.

### Quadratic function


In linear least squares the objective function, S, is a quadratic function of the parameters.

### Parabola


When there is only one parameter the graph of S with respect to that parameter will be a parabola.

### Ellipse


With two or more parameters the contours of S with respect to any pair of parameters will be concentric ellipses (assuming that the normal equations matrix is positive definite).

### Definiteness of a matrix


With two or more parameters the contours of S with respect to any pair of parameters will be concentric ellipses (assuming that the normal equations matrix is positive definite).

### Computer simulation


A good way to do this is by computer simulation.

### Round-off error


The increment size should be chosen so that the numerical derivative is not subject to approximation error by being too large, or to round-off error by being too small.
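A common rule of thumb balances the two error sources at roughly the square root of machine epsilon, scaled by the parameter's magnitude. The helper name and the test function below are illustrative:

```python
import numpy as np

def fd_step(beta_j):
    """Forward-difference increment balancing truncation error (step too
    large) against round-off error (step too small): ~sqrt(machine epsilon),
    scaled by the parameter's magnitude."""
    eps = np.finfo(float).eps              # ~2.2e-16 for double precision
    return np.sqrt(eps) * max(1.0, abs(beta_j))

# Error of a forward difference of f(b) = b^2 at b = 1 (true derivative: 2)
# for a too-large, a balanced, and a too-small increment.
f = lambda b: b * b
for h in (1e-2, fd_step(1.0), 1e-13):
    approx = (f(1.0 + h) - f(1.0)) / h
    print(h, abs(approx - 2.0))
```

Running this shows the balanced step beating both extremes: the large step suffers truncation error of order h, the tiny step amplifies round-off by 1/h.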

### Cauchy distribution


### Semi-log plot


Graphically this corresponds to working on a semi-log plot.

### Log-normal distribution


This procedure should be avoided unless the errors are multiplicative and log-normally distributed, because it can give misleading results.
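The pitfall can be seen by comparing a direct non-linear fit with a fit to log-transformed data when the errors are additive: taking logs distorts the error distribution, inflating the scatter where the signal is small. A sketch under these illustrative assumptions:

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(2)
x = np.linspace(0.0, 4.0, 200)
y = 2.5 * np.exp(-1.3 * x) + 0.05 * rng.standard_normal(x.size)  # additive errors

# Direct non-linear fit of y = a * exp(-b * x).
direct = least_squares(lambda b: y - b[0] * np.exp(-b[1] * x),
                       x0=[1.0, 1.0]).x

# Linearizing transform: log y = log a - b * x, fitted by linear least
# squares. With additive errors the transform inflates the scatter (and
# discards points) where y is small, biasing the estimates.
mask = y > 0                               # log requires positive values
slope, intercept = np.polyfit(x[mask], np.log(y[mask]), 1)
transformed = np.array([np.exp(intercept), -slope])
```

The direct fit recovers the parameters; the transformed fit is distorted by the log-magnified (and censored) tail, illustrating why the shortcut is only safe for multiplicative, log-normal errors.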

### Lineweaver–Burk plot


The Lineweaver–Burk plot of enzyme kinetics is a well-known example of such a linearizing transformation.