# Autoregressive–moving-average model

**Also known as:** autoregressive moving average (ARMA) model; autoregressive moving average with exogenous inputs (ARMAX).

In the statistical analysis of time series, autoregressive–moving-average (ARMA) models provide a parsimonious description of a (weakly) stationary stochastic process in terms of two polynomials, one for the autoregression (AR) and the second for the moving average (MA).
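As an illustrative sketch (not part of the original text), the two-part structure can be seen in an ARMA(1,1) process simulated directly from its defining recursion; the coefficients, the constant, and the noise scale below are illustrative choices, and NumPy is assumed to be available:

```python
import numpy as np

# Sketch: simulate an ARMA(1,1) process
#   X_t = c + phi * X_{t-1} + eps_t + theta * eps_{t-1}
# phi is the AR coefficient, theta the MA coefficient; both are
# illustrative assumptions, as is the unit-variance Gaussian noise.
rng = np.random.default_rng(0)
n, phi, theta, c = 10_000, 0.6, 0.3, 0.0
eps = rng.normal(0.0, 1.0, size=n)  # white-noise innovations
x = np.zeros(n)
for t in range(1, n):
    x[t] = c + phi * x[t - 1] + eps[t] + theta * eps[t - 1]

# With |phi| < 1 the process is (weakly) stationary, and its
# long-run mean is c / (1 - phi) = 0 here.
```

The AR term feeds past *values* of the series back in, while the MA term feeds past *innovations* back in; that split is exactly the "two polynomials" the definition refers to.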


### Autoregressive model



Together with the moving-average (MA) model, it is a special case and key component of the more general autoregressive–moving-average (ARMA) and autoregressive integrated moving average (ARIMA) models of time series, which have a more complicated stochastic structure; it is also a special case of the vector autoregressive model (VAR), which consists of a system of more than one interlocking stochastic difference equation in more than one evolving random variable.

### Moving-average model



Together with the autoregressive (AR) model, the moving-average model is a special case and key component of the more general ARMA and ARIMA models of time series, which have a more complicated stochastic structure.

### Time series



Combinations of these ideas produce autoregressive moving average (ARMA) and autoregressive integrated moving average (ARIMA) models.

### Box–Jenkins method


ARMA models can be estimated by using the Box–Jenkins method.

In time series analysis, the Box–Jenkins method, named after the statisticians George Box and Gwilym Jenkins, applies autoregressive moving average (ARMA) or autoregressive integrated moving average (ARIMA) models to find the best fit of a time-series model to past values of a time series.
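As a minimal sketch of the estimation step in a Box–Jenkins-style workflow (the simulated series and its coefficients are illustrative assumptions), the AR side of a model can be fitted by solving the Yule–Walker equations built from sample autocorrelations:

```python
import numpy as np

# Sketch: Yule-Walker estimation of an AR(2) model, one standard way
# the autoregressive coefficients are estimated when fitting an
# ARMA-type model to data.
rng = np.random.default_rng(1)
n, phi = 20_000, np.array([0.5, 0.25])
x = np.zeros(n)
eps = rng.normal(size=n)
for t in range(2, n):
    x[t] = phi[0] * x[t - 1] + phi[1] * x[t - 2] + eps[t]

def acf(series, k):
    """Sample autocorrelation at lag k."""
    s = series - series.mean()
    return np.dot(s[:-k], s[k:]) / np.dot(s, s) if k else 1.0

# Yule-Walker system R @ phi_hat = r, with R a Toeplitz matrix of
# autocorrelations; for AR(2) it is 2 x 2.
r = np.array([acf(x, k) for k in range(1, 3)])
R = np.array([[1.0, r[0]], [r[0], 1.0]])
phi_hat = np.linalg.solve(R, r)  # close to the true [0.5, 0.25]
```

Identification (choosing p and q) and diagnostic checking of the residuals are the other two stages of the methodology; this sketch covers only estimation.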

### Stationary process


Other examples of a discrete-time stationary process with continuous sample space include some autoregressive and moving-average processes, which are both subsets of the autoregressive moving average model.

### Gwilym Jenkins


The general ARMA model was described in the 1951 thesis of Peter Whittle, Hypothesis testing in time series analysis, and it was popularized in the 1970 book by George E. P. Box and Gwilym Jenkins.

He is most notable for his pioneering work with George Box on autoregressive moving average models, also called Box–Jenkins models, in time-series analysis.

### Lag operator


In some texts the models will be specified in terms of the lag operator L.

Polynomials of the lag operator can be used, and this is a common notation for ARMA (autoregressive moving average) models.
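As a sketch of that notation, the ARMA(p, q) model is commonly written with one lag-operator polynomial on each side, where $L X_t = X_{t-1}$:

```latex
% ARMA(p,q) in lag-operator form: AR polynomial acting on X_t,
% MA polynomial acting on the white-noise term \varepsilon_t
\left(1 - \sum_{i=1}^{p} \varphi_i L^i\right) X_t
  = \left(1 + \sum_{j=1}^{q} \theta_j L^j\right) \varepsilon_t
```

Writing the model this way makes the "two polynomials" of the definition explicit and simplifies manipulations such as checking stationarity via the roots of the AR polynomial.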

### Autoregressive integrated moving average


See also autoregressive conditional heteroskedasticity (ARCH) models and autoregressive integrated moving average (ARIMA) models.

In statistics and econometrics, and in particular in time series analysis, an autoregressive integrated moving average (ARIMA) model is a generalization of an autoregressive moving average (ARMA) model.

### Autoregressive conditional heteroskedasticity



The ARCH model is appropriate when the error variance in a time series follows an autoregressive (AR) model; if an autoregressive moving average (ARMA) model is assumed for the error variance, the model is a generalized autoregressive conditional heteroskedasticity (GARCH) model.

### Predictive analytics


The Box–Jenkins methodology (1976) developed by George Box and G.M. Jenkins combines the AR and MA models to produce the ARMA (autoregressive moving average) model, which is the cornerstone of stationary time series analysis.


### Peter Whittle (mathematician)


The general ARMA model was described in the 1951 thesis of Peter Whittle, Hypothesis testing in time series analysis, which used mathematical analysis (Laurent series and Fourier analysis) together with statistical inference; it was popularized in the 1970 book by George E. P. Box and Gwilym Jenkins.

### Errors and residuals


The MA part involves modeling the error term as a linear combination of error terms occurring contemporaneously and at various times in the past.
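As a sketch of that linear combination, the MA(q) part is standardly written as:

```latex
% MA(q): the series as a mean plus a linear combination of the
% current and q past error terms \varepsilon
X_t = \mu + \varepsilon_t + \theta_1 \varepsilon_{t-1}
          + \cdots + \theta_q \varepsilon_{t-q}
```

Each $\theta_j$ weights an error term from $j$ steps in the past, which is precisely the "linear combination of error terms occurring contemporaneously and at various times in the past" described above.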

### Linear combination


The MA part involves modeling the error term as a linear combination of error terms occurring contemporaneously and at various times in the past.

### Parameter


In the AR(p) model X_t = c + φ₁X_{t−1} + ⋯ + φ_pX_{t−p} + ε_t, the φ₁, …, φ_p are parameters, c is a constant, and the random variable ε_t is white noise.

### White noise


where φ₁, …, φ_p are parameters, c is a constant, and the random variable ε_t is white noise.

### Laurent series


The general ARMA model was described in the 1951 thesis of Peter Whittle, who used mathematical analysis (Laurent series and Fourier analysis) and statistical inference.

### Fourier analysis


The general ARMA model was described in the 1951 thesis of Peter Whittle, who used mathematical analysis (Laurent series and Fourier analysis) and statistical inference.

### Independent and identically distributed random variables


The error terms ε_t are generally assumed to be independent and identically distributed (i.i.d.) random variables sampled from a normal distribution with zero mean: ε_t ~ N(0, σ²), where σ² is the variance.

### Normal distribution


The error terms are generally assumed to be sampled from a normal distribution with zero mean, ε_t ~ N(0, σ²), with σ² the variance of that distribution.

### George E. P. Box


The general ARMA model was described in the 1951 thesis of Peter Whittle, Hypothesis testing in time series analysis, and it was popularized in the 1970 book by George E. P. Box and Gwilym Jenkins.

### Partial autocorrelation function


Finding appropriate values of p and q in the ARMA(p,q) model can be facilitated by plotting the partial autocorrelation functions for an estimate of p, and likewise using the autocorrelation functions for an estimate of q.

### Autocorrelation


Finding appropriate values of p and q in the ARMA(p,q) model can be facilitated by plotting the partial autocorrelation functions for an estimate of p, and likewise using the autocorrelation functions for an estimate of q.
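As a minimal sketch of these two diagnostics (the simulated AR(1) data and the Yule–Walker-based PACF computation are illustrative assumptions), the sample ACF and PACF can be computed directly with NumPy; for an AR(1) series the ACF decays geometrically while the PACF cuts off after lag 1:

```python
import numpy as np

# Sketch: sample ACF, and PACF obtained as the last coefficient of
# successive Yule-Walker AR(k) fits -- the quantities usually plotted
# to pick q (ACF cutoff) and p (PACF cutoff).
rng = np.random.default_rng(2)
n = 20_000
x = np.zeros(n)
eps = rng.normal(size=n)
for t in range(1, n):
    x[t] = 0.7 * x[t - 1] + eps[t]  # AR(1) with coefficient 0.7

def sample_acf(series, nlags):
    s = series - series.mean()
    denom = np.dot(s, s)
    return np.array([1.0] + [np.dot(s[:-k], s[k:]) / denom
                             for k in range(1, nlags + 1)])

def sample_pacf(series, nlags):
    """PACF at lag k = last coefficient of the Yule-Walker AR(k) fit."""
    rho = sample_acf(series, nlags)
    out = [1.0]
    for k in range(1, nlags + 1):
        R = np.array([[rho[abs(i - j)] for j in range(k)]
                      for i in range(k)])
        out.append(np.linalg.solve(R, rho[1:k + 1])[-1])
    return np.array(out)

acf_vals, pacf_vals = sample_acf(x, 5), sample_pacf(x, 5)
# acf_vals decays roughly like 0.7**k; pacf_vals is near zero
# beyond lag 1, suggesting p = 1.
```

In practice these are read off plots with confidence bands rather than raw numbers, but the cutoff-versus-decay pattern is the same.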

### Akaike information criterion


Brockwell & Davis recommend using the Akaike information criterion (AIC) for finding p and q.
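As a sketch of order selection by AIC (assuming Gaussian innovations; the AR(2) data and the least-squares fitting routine are illustrative choices), AIC = 2k − 2 log L can be compared across candidate orders, where for a least-squares fit log L reduces, up to a constant, to −n/2 · log(σ̂²):

```python
import numpy as np

# Sketch: comparing AIC across candidate AR orders p.
# k = p + 1 parameters (p coefficients plus the noise variance);
# the constant terms of the Gaussian log-likelihood cancel when
# comparing models on the same data, so they are omitted.
rng = np.random.default_rng(3)
n = 5_000
x = np.zeros(n)
eps = rng.normal(size=n)
for t in range(2, n):
    x[t] = 0.5 * x[t - 1] + 0.25 * x[t - 2] + eps[t]

def ar_aic(series, p):
    """AIC (up to an additive constant) of a least-squares AR(p) fit."""
    y = series[p:]
    X = np.column_stack([series[p - i:len(series) - i]
                         for i in range(1, p + 1)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma2 = np.mean((y - X @ beta) ** 2)
    return 2 * (p + 1) + len(y) * np.log(sigma2)

aics = {p: ar_aic(x, p) for p in range(1, 5)}
# The true order is 2: AIC drops sharply from p = 1 to p = 2 and
# gains little (while paying the 2-per-parameter penalty) beyond.
```

The penalty term is what keeps AIC from always preferring the largest model; criteria such as BIC penalize extra parameters more heavily.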