Moving-average model

In time series analysis, the moving-average model (MA model), also known as moving-average process, is a common approach for modeling univariate time series.
34 Related Articles

Autoregressive model

Together with the autoregressive (AR) model, the moving-average model is a special case and key component of the more general ARMA and ARIMA models of time series, which have a more complicated stochastic structure.
Together with the moving-average (MA) model, it is a special case and key component of the more general autoregressive–moving-average (ARMA) and autoregressive integrated moving average (ARIMA) models of time series, which have a more complicated stochastic structure; it is also a special case of the vector autoregressive model (VAR), which consists of a system of more than one interlocking stochastic difference equation in more than one evolving random variable.

Autoregressive–moving-average model

Together with the autoregressive (AR) model, the moving-average model is a special case and key component of the more general ARMA and ARIMA models of time series, which have a more complicated stochastic structure.
In the statistical analysis of time series, autoregressive–moving-average (ARMA) models provide a parsimonious description of a (weakly) stationary stochastic process in terms of two polynomials, one for the autoregression (AR) and the second for the moving average (MA).

Time series

Together with the autoregressive (AR) model, the moving-average model is a special case and key component of the more general ARMA and ARIMA models of time series, which have a more complicated stochastic structure. In time series analysis, the moving-average model (MA model), also known as moving-average process, is a common approach for modeling univariate time series.
The parametric approaches assume that the underlying stationary stochastic process has a certain structure which can be described using a small number of parameters (for example, using an autoregressive or moving average model).

Autoregressive integrated moving average

Together with the autoregressive (AR) model, the moving-average model is a special case and key component of the more general ARMA and ARIMA models of time series, which have a more complicated stochastic structure.
Non-seasonal ARIMA models are generally denoted ARIMA(p,d,q) where parameters p, d, and q are non-negative integers, p is the order (number of time lags) of the autoregressive model, d is the degree of differencing (the number of times the data have had past values subtracted), and q is the order of the moving-average model.
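As a brief illustration of this notation, an ARIMA(0, 0, q) specification reduces to an MA(q) model with a constant term. The following is a minimal sketch, assuming statsmodels is installed and using an arbitrary simulated coefficient:

    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(6)
    e = rng.normal(size=2_001)
    x = e[1:] + 0.6 * e[:-1]              # simulated MA(1) data (theta_1 = 0.6, illustrative)

    # ARIMA with orders p = 0, d = 0, q = 1 is exactly an MA(1) model plus a constant.
    result = ARIMA(x, order=(0, 0, 1)).fit()
    print(result.params)                  # estimated constant, theta_1 and shock variance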

Stationary process

Unlike the AR model, the finite MA model is always stationary.
Other examples of a discrete-time stationary process with continuous sample space include some autoregressive and moving average processes which are both subsets of the autoregressive moving average model.
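As a quick numerical check of this point (a minimal sketch assuming NumPy; the coefficient value is deliberately extreme), the variance of a simulated MA(1) series stays finite and close to σ²(1 + θ²) no matter how large θ is:

    import numpy as np

    rng = np.random.default_rng(0)
    theta, sigma = 5.0, 1.0                 # deliberately large MA(1) coefficient
    eps = rng.normal(0.0, sigma, size=100_000)

    # MA(1): X_t = eps_t + theta * eps_{t-1} -- stationary for any theta
    x = eps[1:] + theta * eps[:-1]
    print(x.var())                          # close to the theoretical value below
    print(sigma**2 * (1 + theta**2))        # Var(X_t) = sigma^2 * (1 + theta^2)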

Moving average

The moving-average model should not be confused with the moving average, a distinct concept despite some similarities.
In a moving average regression model, a variable of interest is assumed to be a weighted moving average of unobserved independent error terms; the weights in the moving average are parameters to be estimated.

Autocorrelation

The autocorrelation function (ACF) of an MA(q) process is zero at lag q + 1 and greater.
Unit root processes, trend stationary processes, autoregressive processes, and moving average processes are specific forms of processes with autocorrelation.
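A short sketch of this cutoff property, assuming NumPy is available and using illustrative MA(2) coefficients (the sample_acf helper is written only for this example):

    import numpy as np

    rng = np.random.default_rng(1)
    theta1, theta2 = 0.6, 0.3               # illustrative MA(2) coefficients
    eps = rng.normal(size=200_000)
    x = eps[2:] + theta1 * eps[1:-1] + theta2 * eps[:-2]   # MA(2) process

    def sample_acf(series, lag):
        centered = series - series.mean()
        return centered[:-lag] @ centered[lag:] / (centered @ centered)

    for lag in range(1, 6):
        # lags 1 and 2 are clearly nonzero; lags 3 and beyond are near zero
        print(lag, round(sample_acf(x, lag), 3))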

White noise

The MA(q) model can be written as X_t = μ + ε_t + θ_1 ε_{t−1} + … + θ_q ε_{t−q}, where μ is the mean of the series, θ_1, ..., θ_q are the parameters of the model and ε_t, ε_{t−1}, ..., ε_{t−q} are white noise error terms.
In this case the noise process is often modeled as a moving average process, in which the current value of the dependent variable depends on current and past values of a sequential white noise process.
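A direct simulation of the defining equation above makes this concrete. The sketch below assumes NumPy; the mean and θ values are arbitrary choices for illustration:

    import numpy as np

    rng = np.random.default_rng(2)
    mu = 10.0
    theta = np.array([0.7, -0.4, 0.2])      # theta_1, ..., theta_q for q = 3 (illustrative)
    q, n = len(theta), 5_000

    eps = rng.normal(size=n + q)            # white noise error terms eps_t
    x = np.empty(n)
    for i in range(n):
        t = i + q
        # X_t = mu + eps_t + theta_1*eps_{t-1} + ... + theta_q*eps_{t-q}
        x[i] = mu + eps[t] + theta @ eps[t - q:t][::-1]

    print(x.mean())                         # close to mu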

Lag operator

This can be equivalently written in terms of the backshift operator B as X_t = μ + (1 + θ_1 B + … + θ_q B^q) ε_t.
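One way to see the lag-polynomial form in practice: statsmodels' ArmaProcess takes the coefficients of 1 + θ_1 B + … + θ_q B^q directly. A minimal sketch, assuming statsmodels is installed and using illustrative coefficients:

    import numpy as np
    from statsmodels.tsa.arima_process import ArmaProcess

    theta1, theta2 = 0.6, 0.3
    ma_poly = np.array([1.0, theta1, theta2])    # 1 + theta_1*B + theta_2*B^2
    ar_poly = np.array([1.0])                    # no autoregressive part

    proc = ArmaProcess(ar_poly, ma_poly)
    print(proc.isstationary, proc.isinvertible)  # a finite MA is always stationary
    print(proc.acf(lags=5))                      # theoretical ACF, zero beyond lag 2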

Box–Jenkins method

Sometimes the ACF and partial autocorrelation function (PACF) will suggest that an MA model would be a better model choice and sometimes both AR and MA terms should be used in the same model (see Box–Jenkins method#Identify p and q).
The autocorrelation function of an MA(q) process becomes zero at lag q + 1 and greater, so we examine the sample autocorrelation function to see where it essentially becomes zero.
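A rough sketch of this identification step, assuming statsmodels and NumPy are available and using simulated MA(2) data with illustrative coefficients:

    import numpy as np
    from statsmodels.tsa.stattools import acf, pacf

    rng = np.random.default_rng(3)
    eps = rng.normal(size=20_000)
    x = eps[2:] + 0.6 * eps[1:-1] + 0.3 * eps[:-2]   # simulated MA(2) data

    print(np.round(acf(x, nlags=6), 2))     # cuts off after lag 2, suggesting q = 2
    print(np.round(pacf(x, nlags=6), 2))    # tails off gradually, so no clear AR order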

Univariate

In time series analysis, the moving-average model (MA model), also known as moving-average process, is a common approach for modeling univariate time series.

Linear prediction

The moving-average model specifies that the output variable depends linearly on the current and various past values of a stochastic (imperfectly predictable) term.

Stochastic

The moving-average model specifies that the output variable depends linearly on the current and various past values of a stochastic (imperfectly predictable) term.

Linear regression

Thus, a moving-average model is conceptually a linear regression of the current value of the series against current and previous (observed) white noise error terms or random shocks.

Normal distribution

The random shocks at each point are assumed to be mutually independent and to come from the same distribution, typically a normal distribution, with location at zero and constant scale.

Finite impulse response

The moving-average model is essentially a finite impulse response filter applied to white noise, with some additional interpretation placed on it.
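This interpretation can be checked by convolving white noise with a finite set of filter taps [1, θ_1, ..., θ_q], which yields a zero-mean MA(q) series. A minimal sketch, assuming NumPy and arbitrary tap values:

    import numpy as np

    rng = np.random.default_rng(4)
    eps = rng.normal(size=10_000)                   # white noise input
    fir_taps = np.array([1.0, 0.7, -0.4, 0.2])      # [1, theta_1, ..., theta_q], illustrative

    # Filtering white noise with finitely many taps is exactly a zero-mean MA(q) process.
    x = np.convolve(eps, fir_taps, mode="valid")
    print(x.var())                                  # close to the sum of squared taps (~1.69)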

Vector autoregression

Second, in the MA model a shock affects X values only for the current period and q periods into the future; in contrast, in the AR model a shock affects X values infinitely far into the future, because ε_t affects X_t, which affects X_{t+1}, which affects X_{t+2}, and so on forever (see Vector autoregression#Impulse response).
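A small numerical illustration of this contrast (assuming NumPy; the MA(3) and AR(1) coefficients are arbitrary) compares the response of X_{t+h} to a unit shock ε_t under the two models:

    import numpy as np

    theta = [0.7, -0.4, 0.2]             # MA(3) coefficients (illustrative)
    phi = 0.8                            # AR(1) coefficient (illustrative)
    horizon = 8

    # Response of X_{t+h} to a unit shock eps_t, for h = 0, ..., horizon:
    ma_irf = np.array([1.0] + theta + [0.0] * (horizon - len(theta)))   # zero after q = 3
    ar_irf = np.array([phi ** h for h in range(horizon + 1)])           # decays but never hits zero

    print(np.round(ma_irf, 3))
    print(np.round(ar_irf, 3))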

Curve fitting

Because the lagged error terms are not observable, iterative non-linear fitting procedures need to be used in place of linear least squares.
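One common approach is to minimize a conditional sum of squares: the unobserved shocks are reconstructed recursively from candidate parameters and their squared sum is minimized numerically. The sketch below, assuming NumPy and SciPy, only illustrates the idea; the css helper is hypothetical, not any particular library's estimator:

    import numpy as np
    from scipy.optimize import minimize

    def css(params, x):
        # Conditional sum of squares for an MA(q): rebuild the unobserved shocks
        # recursively from candidate (mu, theta_1, ..., theta_q), then sum their squares.
        mu, theta = params[0], params[1:]
        q = len(theta)
        eps = np.zeros(len(x) + q)               # pre-sample shocks set to zero
        for t in range(len(x)):
            eps[t + q] = x[t] - mu - theta @ eps[t:t + q][::-1]
        return np.sum(eps[q:] ** 2)

    # Simulated MA(1) data for illustration (true mu = 0.5, theta_1 = 0.6)
    rng = np.random.default_rng(5)
    e = rng.normal(size=3_001)
    x = 0.5 + e[1:] + 0.6 * e[:-1]

    fit = minimize(css, x0=np.array([0.0, 0.0]), args=(x,), method="Nelder-Mead")
    print(fit.x)                                 # roughly [0.5, 0.6]; there is no closed-form solution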

Partial autocorrelation function

Sometimes the ACF and partial autocorrelation function (PACF) will suggest that an MA model would be a better model choice and sometimes both AR and MA terms should be used in the same model (see Box–Jenkins method#Identify p and q).

Weight function

Similarly, a moving average model specifies an evolving variable as a weighted average of current and various lagged values of a random variable.

Autoregressive conditional heteroskedasticity

Exponentially weighted moving average (EWMA) is an alternative model in a separate class of exponential smoothing models.