Simple linear regression

simple regression, regression line, linear least squares regression with an intercept term and a single explanator, standard error of the slope coefficient
In statistics, simple linear regression is a linear regression model with a single explanatory variable.
Related Articles

Linear regression

regression coefficient, multiple linear regression, regression
The case of one explanatory variable is called simple linear regression.

Ordinary least squares

OLS, least squares, ordinary least squares regression
It is common to make the additional stipulation that the ordinary least squares (OLS) method should be used: the accuracy of each predicted value is measured by its squared residual (vertical distance between the point of the data set and the fitted line), and the goal is to make the sum of these squared deviations as small as possible.
The resulting estimator can be expressed by a simple formula, especially in the case of a simple linear regression, in which there is a single regressor on the right side of the regression equation.
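
As a concrete illustration of that simple formula (a minimal sketch; the data and the helper name ols_fit are this example's own, not the article's):

    # Closed-form OLS for simple linear regression:
    # slope = S_xy / S_xx; the intercept makes the line pass through the means.
    def ols_fit(xs, ys):
        n = len(xs)
        x_bar = sum(xs) / n
        y_bar = sum(ys) / n
        s_xy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
        s_xx = sum((x - x_bar) ** 2 for x in xs)
        beta = s_xy / s_xx            # slope
        alpha = y_bar - beta * x_bar  # intercept
        return alpha, beta

    alpha, beta = ols_fit([1, 2, 3, 4], [2.1, 3.9, 6.2, 7.8])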

Theil–Sen estimator

Kendall slope, median slope, robust simple linear regression
Other regression methods that can be used in place of ordinary least squares include least absolute deviations (minimizing the sum of absolute values of residuals) and the Theil–Sen estimator (which chooses a line whose slope is the median of the slopes determined by pairs of sample points).
In non-parametric statistics, the Theil–Sen estimator is a method for robustly fitting a line to sample points in the plane (simple linear regression) by choosing the median of the slopes of all lines through pairs of points.
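
A minimal sketch of that procedure (illustrative code, not taken from the article):

    # Theil-Sen slope: the median of the slopes of all lines through pairs of points.
    from itertools import combinations
    from statistics import median

    def theil_sen_slope(xs, ys):
        slopes = [(y2 - y1) / (x2 - x1)
                  for (x1, y1), (x2, y2) in combinations(zip(xs, ys), 2)
                  if x1 != x2]  # skip pairs that would give a vertical line
        return median(slopes)

    slope = theil_sen_slope([1, 2, 3, 4, 5], [2.0, 4.1, 5.9, 8.4, 9.8])

The intercept is then commonly taken as the median of the values y_i - slope * x_i.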

Deming regression

Deming, line of best orthogonal fit, orthogonal distance regression
Deming regression (total least squares) also finds a line that fits a set of two-dimensional sample points, but (unlike ordinary least squares, least absolute deviations, and median slope regression) it is not really an instance of simple linear regression, because it does not separate the coordinates into one dependent and one independent variable and could potentially return a vertical line as its fit.
It differs from simple linear regression in that it accounts for errors in observations on both the x- and the y-axis.
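
For reference, a commonly stated closed form for the Deming slope (a sketch under the assumption that the ratio \delta of the y-error variance to the x-error variance is known; the notation is this example's, not the article's) is

    \hat\beta = \frac{s_{yy} - \delta s_{xx} + \sqrt{(s_{yy} - \delta s_{xx})^2 + 4\delta s_{xy}^2}}{2 s_{xy}}, \qquad \hat\alpha = \bar y - \hat\beta \bar x,

where s_{xx}, s_{yy}, and s_{xy} are the sample variances and covariance; \delta = 1 gives orthogonal regression.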

Coefficient of determination

R-squared, R^2
The coefficient of determination ("R squared") is equal to r_{xy}^2 when the model is linear with a single independent variable.
One class of such cases includes that of simple linear regression, where r^2 is used instead of R^2.
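
A quick numerical check of this identity (a sketch with made-up data; the identity itself is exact for OLS with an intercept):

    from math import sqrt

    xs = [1.0, 2.0, 3.0, 4.0, 5.0]
    ys = [1.9, 4.1, 5.8, 8.3, 9.9]

    n = len(xs)
    x_bar, y_bar = sum(xs) / n, sum(ys) / n
    s_xx = sum((x - x_bar) ** 2 for x in xs)
    s_yy = sum((y - y_bar) ** 2 for y in ys)
    s_xy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))

    r = s_xy / sqrt(s_xx * s_yy)   # Pearson correlation r_xy
    beta = s_xy / s_xx             # OLS slope
    alpha = y_bar - beta * x_bar   # OLS intercept
    ss_res = sum((y - (alpha + beta * x)) ** 2 for x, y in zip(xs, ys))
    r_squared = 1 - ss_res / s_yy  # coefficient of determination R^2
    assert abs(r_squared - r ** 2) < 1e-9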

Homoscedasticity

homoscedastic, homogeneity of variance, homoskedastic
As used in describing simple linear regression analysis, one assumption of the fitted model is that the standard deviations of the error terms are constant and do not depend on the x-value; by the Gauss–Markov theorem, this ensures that the least-squares estimators are each a best linear unbiased estimator of the respective population parameters. It is also possible to evaluate the properties of the estimators under other assumptions, such as inhomogeneity of variance, but this is discussed elsewhere.
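
Stated compactly (standard notation, not quoted from the article), the homoscedasticity assumption for the model reads

    y_i = \alpha + \beta x_i + \varepsilon_i, \qquad E[\varepsilon_i] = 0, \qquad \operatorname{Var}(\varepsilon_i) = \sigma^2 \text{ for all } i,

with the error variance \sigma^2 constant across observations, i.e. independent of x_i.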

Linear trend estimation

trend, trend estimation, trends
Fitting a straight-line trend to a set of data points can always be done in closed form, since this is a case of simple linear regression.

Pearson correlation coefficient

correlation coefficient, Pearson product-moment correlation coefficient, Pearson correlation
In this case, the slope of the fitted line is equal to the correlation between y and x corrected by the ratio of standard deviations of these variables.
In this case, it estimates the fraction of the variance in Y that is explained by X in a simple linear regression.
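
In symbols (standard notation, consistent with the sentences above): if r_{xy} is the Pearson correlation and s_x, s_y are the sample standard deviations, the fitted OLS slope is

    \hat\beta = r_{xy} \frac{s_y}{s_x},

and r_{xy}^2 is the fraction of the variance in y explained by x.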

Segmented regression

linear segmented regression, piecewise regression, segmented regression analysis

Statistics

statistical, statistical analysis, statistician

Dependent and independent variables

dependent variable, independent variable, explanatory variable
That is, it concerns two-dimensional sample points with one independent variable and one dependent variable (conventionally, the x and y coordinates in a Cartesian coordinate system) and finds a linear function (a non-vertical straight line) that, as accurately as possible, predicts the dependent variable values as a function of the independent variables.

Cartesian coordinate system

Cartesian coordinates, Cartesian coordinate, Cartesian

Line (geometry)

line, straight line, lines

Errors and residuals

residuals, error term, residual

Least absolute deviations

least absolute deviation, least absolute errors, LAD

Slope

gradient, slopes, gradients

Median

average, sample median, median-unbiased estimator

Mathematical model

mathematical modeling, model, mathematical models
Consider the model function y = α + βx, which describes a line with slope β and y-intercept α.

Standard score

normalized, normalised, z-score
The correlation coefficient r_{xy} is the slope of the regression line of the standardized data points (and this line passes through the origin).
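
A small check of this fact (illustrative data; the relationship holds for any data set with nonzero variances):

    # After converting x and y to z-scores, the OLS slope equals r_xy
    # and the fitted line passes through the origin.
    from statistics import mean, pstdev

    xs = [1.0, 2.0, 3.0, 4.0, 5.0]
    ys = [1.9, 4.1, 5.8, 8.3, 9.9]

    zx = [(x - mean(xs)) / pstdev(xs) for x in xs]
    zy = [(y - mean(ys)) / pstdev(ys) for y in ys]
    slope = sum(a * b for a, b in zip(zx, zy)) / sum(a * a for a in zx)
    intercept = mean(zy) - slope * mean(zx)  # equals 0 up to rounding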

Correlation and dependence

correlation, correlated, correlations
See sample correlation coefficient for additional details.

Statistical model

model, probabilistic model, statistical modeling
Describing the statistical properties of the simple linear regression estimators requires the use of a statistical model.

Confidence interval

confidence intervals, confidence level, confidence
Confidence intervals were devised to give a plausible set of values for an estimate, reflecting what one might obtain if the experiment were repeated a very large number of times.
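
As a sketch of how such an interval is computed for the slope (standard formulas, assuming independent normal errors; the data are made up, and SciPy is used only for the t quantile):

    from math import sqrt
    from scipy.stats import t

    xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
    ys = [2.2, 3.9, 6.1, 7.8, 10.2, 11.9]

    n = len(xs)
    x_bar, y_bar = sum(xs) / n, sum(ys) / n
    s_xx = sum((x - x_bar) ** 2 for x in xs)
    beta = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / s_xx
    alpha = y_bar - beta * x_bar
    # Residual variance estimate with n - 2 degrees of freedom.
    s2 = sum((y - (alpha + beta * x)) ** 2 for x, y in zip(xs, ys)) / (n - 2)
    se_beta = sqrt(s2 / s_xx)
    t_crit = t.ppf(0.975, df=n - 2)  # two-sided 95% critical value
    ci = (beta - t_crit * se_beta, beta + t_crit * se_beta)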

Central limit theorem

Lyapunov's central limit theorem, limit theorems, central limit
The latter case (a sufficiently large number of observations, rather than normally distributed errors) is justified by the central limit theorem.