# Wiener–Khinchin theorem

In applied mathematics, the Wiener–Khinchin theorem, also known as the Wiener–Khintchine theorem and sometimes as the Wiener–Khinchin–Einstein theorem or the Khinchin–Kolmogorov theorem, states that the autocorrelation function of a wide-sense-stationary random process has a spectral decomposition given by the power spectrum of that process.

### Autocorrelation

For continuous time, the Wiener–Khinchin theorem says that if x is a wide-sense-stationary process whose autocorrelation function (sometimes called the autocovariance), defined in terms of the statistical expected value as

r_{xx}(\tau) = E[x(t)\, x^{*}(t - \tau)]

(the asterisk denotes the complex conjugate, which can of course be omitted if the random process is real-valued), exists and is finite at every lag \tau, then there exists a monotone function F(f) in the frequency domain such that

r_{xx}(\tau) = \int_{-\infty}^{\infty} e^{2\pi i \tau f} \, dF(f).

The Wiener–Khinchin theorem relates the autocorrelation function to the power spectral density S_{XX} via the Fourier transform:

S_{XX}(f) = \int_{-\infty}^{\infty} r_{xx}(\tau)\, e^{-2\pi i f \tau} \, d\tau.
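The discrete, circular form of this relation is easy to check numerically. The sketch below (using NumPy, with an arbitrary white-noise realization and seed) verifies that the DFT of the circular autocorrelation reproduces the periodogram exactly:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1024
x = rng.standard_normal(N)  # one realization of a (white) WSS process

# Periodogram: power spectral density estimate |X[k]|^2 / N
X = np.fft.fft(x)
psd = np.abs(X) ** 2 / N

# Circular autocorrelation r[m] = (1/N) * sum_n x[n] x[(n+m) mod N],
# computed via the inverse DFT of |X|^2
r = np.fft.ifft(np.abs(X) ** 2).real / N

# Wiener–Khinchin (discrete, circular form): the DFT of the
# autocorrelation equals the periodogram
psd_from_r = np.fft.fft(r).real
assert np.allclose(psd_from_r, psd)
```

Note that r[0] equals the mean square of the samples, i.e. the total power that the spectrum distributes over frequency.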

### Aleksandr Khinchin

Norbert Wiener proved this theorem for the case of a deterministic function in 1930; Aleksandr Khinchin later formulated an analogous result for stationary stochastic processes and published that probabilistic analogue in 1934.

### Norbert Wiener

The Wiener–Khinchin theorem (also known as the Wiener–Khintchine theorem and the Khinchin–Kolmogorov theorem) states that the power spectral density of a wide-sense-stationary random process is the Fourier transform of the corresponding autocorrelation function.

### Spectral density

In the latter form (for a stationary random process), one can make a change of variables to the time difference, and with the limits of integration extended to infinity (rather than [0,T]), the resulting power spectral density and the autocorrelation function of the signal are seen to be Fourier-transform pairs (the Wiener–Khinchin theorem).
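As a concrete check of this Fourier-pair relationship, one can take a standard analytic pair (an illustrative choice, not taken from the article): an exponentially decaying autocorrelation and its Lorentzian power spectral density. A numerical Fourier transform of the former recovers the latter:

```python
import numpy as np

# Analytic pair: r(tau) = exp(-|tau|/tc)  <-->  S(f) = 2*tc / (1 + (2*pi*f*tc)^2)
tc = 0.5                               # correlation time (arbitrary)
dt = 0.001
tau = np.arange(-50, 50, dt)           # long window so truncation error is negligible
r = np.exp(-np.abs(tau) / tc)

# Approximate the continuous Fourier transform by a DFT (Riemann sum);
# ifftshift puts tau = 0 at index 0, fftshift reorders frequencies
S = np.fft.fftshift(np.fft.fft(np.fft.ifftshift(r))).real * dt
f = np.fft.fftshift(np.fft.fftfreq(tau.size, dt))

S_exact = 2 * tc / (1 + (2 * np.pi * f * tc) ** 2)

# Agreement at low frequencies to within ~1% (aliasing limits accuracy further out)
mask = np.abs(f) < 10
assert np.allclose(S[mask], S_exact[mask], rtol=1e-2)
```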

### Linear time-invariant system

The theorem is useful for analyzing linear time-invariant systems (LTI systems) when the inputs and outputs are not square-integrable, so their Fourier transforms do not exist.

Spectra of infinite-duration signals are often computed via the Wiener–Khinchin theorem even when the Fourier transforms of the signals themselves do not exist.

### Albert Einstein

Albert Einstein explained, without proofs, the idea in a brief two-page memo in 1914.

### Riemann–Stieltjes integral

In the spectral decomposition r_{xx}(\tau) = \int_{-\infty}^{\infty} e^{2\pi i \tau f} \, dF(f), the integral is a Riemann–Stieltjes integral.

### Square-integrable function

The Fourier transform of x(t) does not exist in general, because stationary random functions are not generally either square-integrable or absolutely integrable.

### Almost everywhere

If F(f) is absolutely continuous (for example, if the process is purely indeterministic), then F is differentiable almost everywhere.
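In that case the derivative of F is the power spectral density, and the Riemann–Stieltjes integral reduces to an ordinary Fourier integral (standard notation, written out here for concreteness):

```latex
S(f) = \frac{dF(f)}{df}, \qquad
r_{xx}(\tau) = \int_{-\infty}^{\infty} S(f)\, e^{2\pi i \tau f}\, df .
```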

### Aliasing

This is due to the problem of aliasing: the contribution of any frequency higher than the Nyquist frequency is indistinguishable from that of its alias at a normalized frequency between 0 and 1.
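A minimal numerical illustration (sampling rate and frequencies are hypothetical, chosen for convenience): a sinusoid above the Nyquist frequency produces exactly the same samples as its low-frequency alias, so the two contribute identically to any spectrum computed from the samples:

```python
import numpy as np

fs = 10.0                      # sampling rate (Hz); Nyquist frequency = 5 Hz
n = np.arange(32)
t = n / fs                     # sample times

f_high = 13.0                  # above the Nyquist frequency
f_alias = f_high - fs          # 3 Hz: the alias folded back into [0, fs/2]

x_high = np.cos(2 * np.pi * f_high * t)
x_alias = np.cos(2 * np.pi * f_alias * t)

# The two sinusoids are indistinguishable at these sample times
assert np.allclose(x_high, x_alias)
```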

### Transfer function

Since the Fourier transform of the autocorrelation function of a signal is the power spectrum of the signal, this corollary is equivalent to saying that the power spectrum of the output equals the power spectrum of the input multiplied by the energy transfer function |H(f)|^2 (the squared magnitude of the transfer function).
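This corollary can be verified directly; the sketch below uses an arbitrary FIR filter and a white-noise input, with circular convolution so that the DFT relation holds exactly, and checks that the output periodogram equals |H(f)|^2 times the input periodogram:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1 << 16
x = rng.standard_normal(N)            # white-noise input
h = np.array([0.5, 0.3, 0.2])         # an arbitrary (hypothetical) FIR filter

# Filter by circular convolution so the DFT relation Y = H * X is exact
H = np.fft.fft(h, N)
y = np.fft.ifft(np.fft.fft(x) * H).real

# Periodograms (PSD estimates) of input and output
Sx = np.abs(np.fft.fft(x)) ** 2 / N
Sy = np.abs(np.fft.fft(y)) ** 2 / N

# Output power spectrum = |H(f)|^2 * input power spectrum
assert np.allclose(Sy, np.abs(H) ** 2 * Sx)
```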

### Discrete Fourier transform

A further complication is that the discrete Fourier transform always exists for digital, finite-length sequences, meaning that the theorem can be applied blindly to calculate the autocorrelation of any numerical sequence.
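The caveat matters in practice: taking the DFT route without zero-padding yields a circular autocorrelation, not the linear one. A common remedy (sketched below; the helper name is our own) is to pad to at least twice the sequence length before transforming:

```python
import numpy as np

def autocorr_fft(x):
    """Linear (non-circular) autocorrelation via the Wiener–Khinchin route:
    zero-pad to at least 2*len(x) - 1 so circular wrap-around cannot occur."""
    n = len(x)
    nfft = 2 * n                          # >= 2n - 1
    X = np.fft.fft(x, nfft)
    r = np.fft.ifft(np.abs(X) ** 2).real
    return r[:n]                          # lags 0 .. n-1

x = np.array([1.0, 2.0, 3.0])
r = autocorr_fft(x)

# Direct definition: r[m] = sum_n x[n] * x[n+m]
assert np.allclose(r, [14.0, 8.0, 3.0])

# Without zero-padding the DFT computes a *circular* autocorrelation,
# which differs at every nonzero lag
r_circ = np.fft.ifft(np.abs(np.fft.fft(x)) ** 2).real
assert np.allclose(r_circ, [14.0, 11.0, 11.0])
```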

### Wold's theorem

In statistics, Wold's decomposition or the Wold representation theorem (not to be confused with the Wold theorem that is the discrete-time analog of the Wiener–Khinchin theorem), named after Herman Wold, says that every covariance-stationary time series Y_{t} can be written as the sum of two time series, one deterministic and one stochastic.

### Scale invariance

The Wiener–Khinchin theorem further implies that any sequence exhibiting a variance-to-mean power law under these conditions will also manifest 1/f noise.