# Autocorrelation

Autocorrelation, also known as serial correlation, is the correlation of a signal with a delayed copy of itself as a function of delay.

## Related Articles

### Missing fundamental

The analysis of autocorrelation is a mathematical tool for finding repeating patterns, such as the presence of a periodic signal obscured by noise, or identifying the missing fundamental frequency in a signal implied by its harmonic frequencies.

The precise way in which it does so is still a matter of debate, but the processing seems to be based on an autocorrelation involving the timing of neural impulses in the auditory nerve.
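As an illustrative sketch (not from the source), the missing-fundamental idea can be reproduced numerically: a signal containing only the 400, 600, and 800 Hz harmonics of an absent 200 Hz fundamental still shows its strongest autocorrelation peak at the 200 Hz period. The sample rate, duration, and frequencies below are arbitrary choices; assumes numpy.

```python
import numpy as np

fs = 8000  # sample rate in Hz (illustrative choice)
t = np.arange(0, 0.5, 1 / fs)
# Harmonics of a 200 Hz fundamental -- the 200 Hz component itself is absent.
signal = sum(np.sin(2 * np.pi * f * t) for f in (400, 600, 800))

# Sample autocorrelation for non-negative lags.
acf = np.correlate(signal, signal, mode="full")[len(signal) - 1:]

# The first strong peak after lag 0 sits at the period of the implied fundamental.
lag = np.argmax(acf[1:]) + 1
estimated_f0 = fs / lag
```

The peak lands at 40 samples (1/200 s at 8 kHz), so the estimate recovers the fundamental even though no energy is present at 200 Hz.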

### Autocovariance

In some fields, the term is used interchangeably with autocovariance.

Autocovariance is closely related to the autocorrelation of the process in question.

### Wiener–Khinchin theorem

The Wiener–Khinchin theorem relates the autocorrelation function R_{XX} to the power spectral density S_{XX} via the Fourier transform: S_{XX}(f) = \int_{-\infty}^{\infty} R_{XX}(\tau) e^{-i 2\pi f \tau} \, d\tau.

In applied mathematics, the Wiener–Khinchin theorem, also known as the Wiener–Khintchine theorem and sometimes as the Wiener–Khinchin–Einstein theorem or the Khinchin–Kolmogorov theorem, states that the autocorrelation function of a wide-sense-stationary random process has a spectral decomposition given by the power spectrum of that process.
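In discrete time the relation can be checked numerically: the circular autocorrelation of a finite sequence equals the inverse FFT of its power spectrum. A minimal sketch, assuming numpy (the sequence itself is arbitrary test data):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(256)

# Power spectrum of the sequence.
power = np.abs(np.fft.fft(x)) ** 2

# Circular autocorrelation computed directly...
n = len(x)
acf_direct = np.array([np.dot(x, np.roll(x, -k)) for k in range(n)])

# ...and via the Wiener-Khinchin relation: ACF = inverse FFT of the power spectrum.
acf_wk = np.fft.ifft(power).real
```

The two arrays agree to floating-point precision, which is also why FFT-based autocorrelation is the standard fast implementation.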

### White noise

The autocorrelation of a continuous-time white noise signal will have a strong peak (represented by a Dirac delta function) at \tau=0 and will be exactly 0 for all other \tau.

In discrete time, white noise is a discrete signal whose samples are regarded as a sequence of serially uncorrelated random variables with zero mean and finite variance; a single realization of white noise is a random shock.
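This is easy to see in a quick numerical sketch (assuming numpy): the normalized sample autocorrelation of white noise is 1 at lag zero and fluctuates near zero, at a scale of roughly 1/\sqrt{n}, at every other lag.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
noise = rng.standard_normal(n)  # zero-mean, unit-variance white noise

xc = noise - noise.mean()
# Normalized sample autocorrelation for non-negative lags.
acf = np.correlate(xc, xc, mode="full")[n - 1:] / np.dot(xc, xc)
# acf[0] is 1; acf[1:] hover around 0 at a scale of about 1/sqrt(n).
```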

### Cross-correlation

In an autocorrelation, which is the cross-correlation of a signal with itself, there will always be a peak at a lag of zero, and its size will be the signal energy.
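The zero-lag property can be verified directly (a sketch assuming numpy): the full correlation of a signal with itself peaks at the middle index, which corresponds to lag zero, and that peak equals the sum of squared samples, i.e. the signal energy.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(500)

# Full autocorrelation; index len(x) - 1 corresponds to lag 0.
acf = np.correlate(x, x, mode="full")
zero_lag = acf[len(x) - 1]
energy = np.sum(x ** 2)
```

By the Cauchy–Schwarz inequality no other lag can exceed the zero-lag value.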

### Fourier transform

The Wiener–Khinchin theorem relates the autocorrelation function to the power spectral density S_{XX} via the Fourier transform.

As a special case, the autocorrelation of a function is the inverse Fourier transform of the squared magnitude of its Fourier transform.

### Stationary process

If X_t is a wide-sense stationary process, then the mean \mu and the variance \sigma^2 are time-independent, and the autocovariance function depends only on the lag between t_1 and t_2: the autocovariance depends on the time distance between the pair of values but not on their position in time.

This also implies that the autocorrelation depends only on the lag \tau = t_2 - t_1.

### Spectral density

The Wiener–Khinchin theorem relates the autocorrelation function to the power spectral density S_{XX} via the Fourier transform.

In the latter form (for a stationary random process), one can make the change of variables \tau = t_1 - t_2, and with the limits of integration approaching infinity (rather than [0,T]), the resulting power spectral density and the autocorrelation function of this signal are seen to be a Fourier-transform pair (Wiener–Khinchin theorem).

### Ordinary least squares

In ordinary least squares (OLS), the adequacy of a model specification can be checked in part by establishing whether there is autocorrelation of the regression residuals.

The OLS estimator is consistent when the regressors are exogenous, and optimal in the class of linear unbiased estimators when the errors are homoscedastic and serially uncorrelated.

### Durbin–Watson statistic

The traditional test for the presence of first-order autocorrelation is the Durbin–Watson statistic or, if the explanatory variables include a lagged dependent variable, Durbin's h statistic.

In statistics, the Durbin–Watson statistic is a test statistic used to detect the presence of autocorrelation at lag 1 in the residuals (prediction errors) from a regression analysis.
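The statistic is simple enough to compute directly: it is the ratio of the sum of squared successive residual differences to the sum of squared residuals. A sketch assuming numpy, with simulated residuals (the AR(1) coefficient 0.9 is an illustrative choice); values near 2 indicate no first-order autocorrelation, values near 0 strong positive autocorrelation, values near 4 strong negative autocorrelation.

```python
import numpy as np

def durbin_watson(residuals):
    """DW = sum of squared first differences / sum of squares; roughly 2(1 - r1)."""
    e = np.asarray(residuals, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

rng = np.random.default_rng(3)
n = 5000
dw_white = durbin_watson(rng.standard_normal(n))  # serially uncorrelated: near 2

# Residuals following an AR(1) process with coefficient 0.9.
eps = rng.standard_normal(n)
ar = np.empty(n)
ar[0] = eps[0]
for t in range(1, n):
    ar[t] = 0.9 * ar[t - 1] + eps[t]
dw_ar = durbin_watson(ar)  # strong positive autocorrelation: well below 2
```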

### Optical autocorrelation

In optics, various autocorrelation functions can be experimentally realized.

### Vector autoregression

With multiple interrelated data series, vector autoregression (VAR) or its extensions are used.

### Autoregressive model

Unit root processes, trend stationary processes, autoregressive processes, and moving average processes are specific forms of processes with autocorrelation.

The autocorrelation function of an AR(p) process is a sum of decaying exponentials; in the simplest case, an AR(1) process with coefficient \varphi has autocorrelation \rho(\tau) = \varphi^{|\tau|}.
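For the AR(1) case the autocorrelation at lag \tau is \varphi^{|\tau|}; a quick numerical check on a simulated series (assuming numpy; \varphi = 0.7 is an arbitrary choice):

```python
import numpy as np

phi = 0.7
rng = np.random.default_rng(4)
n = 100_000
eps = rng.standard_normal(n)

# Simulate an AR(1) process: x[t] = phi * x[t-1] + eps[t].
x = np.empty(n)
x[0] = eps[0]
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t]

# Sample autocorrelation at lags 1..5 vs. the theoretical phi**lag.
xc = x - x.mean()
denom = np.dot(xc, xc)
sample_acf = np.array([np.dot(xc[:-k], xc[k:]) / denom for k in range(1, 6)])
theory = phi ** np.arange(1, 6)
```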

### Breusch–Godfrey test

A more flexible test, covering autocorrelation of higher orders and applicable whether or not the regressors include lags of the dependent variable, is the Breusch–Godfrey test.

In particular, it tests for serial correlation that has not been included in a proposed model structure; if such correlation is present and not taken into account, incorrect conclusions may be drawn from other tests, or sub-optimal estimates of model parameters may be obtained.

### Multivariate random variable

The auto-correlation matrix (also called second moment) of a random vector is an n \times n matrix containing as elements the autocorrelations of all pairs of elements of the random vector \mathbf{X}.

The correlation matrix (also called second moment) of an n \times 1 random vector is an n \times n matrix whose (i,j) th element is the correlation between the i th and the j th random variables.
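As a sketch (assuming numpy), the second-moment matrix E[\mathbf{X}\mathbf{X}^T] can be estimated from draws of a random vector; the mixing matrix A below is an arbitrary example, chosen so the true second-moment matrix is A A^T.

```python
import numpy as np

rng = np.random.default_rng(5)
# 100_000 draws of a 3-dimensional random vector X = A Z with Z standard normal,
# so E[X X^T] = A A^T.
z = rng.standard_normal((100_000, 3))
A = np.array([[1.0, 0.0, 0.0],
              [0.5, 1.0, 0.0],
              [0.2, 0.3, 1.0]])
samples = z @ A.T

# Sample estimate of the autocorrelation (second-moment) matrix.
R = samples.T @ samples / len(samples)
```

The estimate is an n-by-n symmetric matrix whose (i, j) entry approximates E[X_i X_j].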

### Dynamic light scattering

The measured quantity g_2(q;\tau) is the intensity autocorrelation function at a particular wave vector q and delay time \tau.

### Newey–West estimator

Responses to nonzero autocorrelation include generalized least squares and the Newey–West HAC estimator (Heteroskedasticity and Autocorrelation Consistent).

The estimator is used to try to overcome autocorrelation (also called serial correlation), and heteroskedasticity in the error terms in the models, often for regressions applied to time series data.
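The core of the estimator is a Bartlett-weighted sum of sample autocovariances. The sketch below (assuming numpy) applies it to the long-run variance of a simulated AR(1) series, where the true value is \sigma^2/(1-\varphi)^2; the coefficient 0.5 and the lag truncation are arbitrary illustrative choices.

```python
import numpy as np

def newey_west_lrv(x, max_lags):
    """Newey-West (Bartlett-kernel) estimate of the long-run variance:
    gamma(0) + 2 * sum_k (1 - k/(L+1)) * gamma(k)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    lrv = np.dot(xc, xc) / n
    for k in range(1, max_lags + 1):
        gamma_k = np.dot(xc[:-k], xc[k:]) / n
        lrv += 2.0 * (1.0 - k / (max_lags + 1)) * gamma_k
    return lrv

rng = np.random.default_rng(7)
n = 100_000
eps = rng.standard_normal(n)
u = np.empty(n)
u[0] = eps[0]
for t in range(1, n):
    u[t] = 0.5 * u[t - 1] + eps[t]  # AR(1) errors, true long-run variance = 4

lrv = newey_west_lrv(u, max_lags=50)  # accounts for serial correlation
naive = np.var(u)                     # ignores it; substantially smaller here
```

The gap between the two numbers is exactly why HAC corrections matter for serially correlated errors.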

### Fluorescence correlation spectroscopy

The resulting electronic signal can be stored either directly as an intensity versus time trace to be analyzed at a later point, or computed to generate the autocorrelation directly (which requires special acquisition cards).

### Galton's problem

Galton's problem, named after Sir Francis Galton, is the problem of drawing inferences from cross-cultural data, due to the statistical phenomenon now called autocorrelation.

### Time series

It is common practice in some disciplines (e.g. statistics and time series analysis) to normalize the autocovariance function to get a time-dependent Pearson correlation coefficient.
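A sketch of that normalization (assuming numpy): dividing the sample autocovariance at each lag by its lag-0 value (the variance) yields correlation coefficients bounded by 1 in magnitude.

```python
import numpy as np

def sample_autocorr(x, max_lag):
    """Sample autocovariances normalized by the lag-0 value (the variance),
    giving autocorrelation coefficients in [-1, 1]."""
    x = np.asarray(x, dtype=float)
    xc = x - x.mean()
    n = len(x)
    c0 = np.dot(xc, xc) / n
    return np.array([np.dot(xc[: n - h], xc[h:]) / n / c0
                     for h in range(max_lag + 1)])

# Ten periods of a sine wave, about 200 samples per period: the autocorrelation
# is 1 at lag 0 and peaks again near one full period.
r = sample_autocorr(np.sin(np.linspace(0, 20 * np.pi, 2000)), 250)
```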

Frequency-domain methods include spectral analysis and wavelet analysis; time-domain methods include auto-correlation and cross-correlation analysis.

### Correlogram

For example, in time series analysis, a correlogram, also known as an autocorrelation plot, is a plot of the sample autocorrelations r_h\, versus h\, (the time lags).

### Correlation function

If one considers the correlation function between random variables representing the same quantity measured at two different points then this is often referred to as an autocorrelation function, which is made up of autocorrelations.

### Triple correlation

The triple correlation extends the concept of autocorrelation, which correlates a function with a single shifted copy of itself and thereby enhances its latent periodicities.

### Prais–Winsten estimation

In econometrics, Prais–Winsten estimation is a procedure meant to take care of the serial correlation of type AR(1) in a linear model.

### Cochrane–Orcutt estimation

Cochrane–Orcutt estimation is a procedure in econometrics, which adjusts a linear model for serial correlation in the error term.
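A minimal two-step sketch of the idea (assuming numpy; the data-generating values 2.0, 3.0, and \rho = 0.8 are arbitrary): fit OLS, estimate \rho from the lag-1 autocorrelation of the residuals, quasi-difference both sides, and re-fit.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 5000
x = rng.standard_normal(n)
# Errors follow an AR(1) process with rho = 0.8.
eps = rng.standard_normal(n)
u = np.empty(n)
u[0] = eps[0]
for t in range(1, n):
    u[t] = 0.8 * u[t - 1] + eps[t]
y = 2.0 + 3.0 * x + u

def ols(X, y):
    """OLS with an intercept; returns (intercept, slope)."""
    return np.linalg.lstsq(np.column_stack([np.ones(len(y)), X]), y, rcond=None)[0]

# Step 1: OLS, then estimate rho from the lag-1 autocorrelation of the residuals.
a, b = ols(x, y)
e = y - a - b * x
rho_hat = np.dot(e[:-1], e[1:]) / np.dot(e[:-1], e[:-1])

# Step 2: quasi-difference and re-run OLS on the transformed model.
y_star = y[1:] - rho_hat * y[:-1]
x_star = x[1:] - rho_hat * x[:-1]
a_star, b_star = ols(x_star, y_star)
intercept = a_star / (1 - rho_hat)  # recover the original intercept
```

The full procedure iterates these two steps until \rho converges; one pass already removes most of the AR(1) structure from the error term.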