Hi, welcome to the 5th chapter of Time Series Modelling.

So far we have covered the statistical properties and components of time series, White Noise, and the first building block of time series modelling, i.e. Autoregression.

In this topic we cover the **Moving Average Model** (also known as MA).

An MA process is a function of a finite and small number of past forecast errors. That is, rather than using past values of the forecast variable in a regression (as an AR model does), a moving average model uses past forecast errors in a regression-like model.

The moving average model of order q, MA(q), is written as:

y(t) = u + e(t) + θ1·e(t−1) + θ2·e(t−2) + … + θq·e(t−q)

where 'u' is the mean of the process and e(t) is a white noise process with variance σ^2.

An MA(1) process, for example, is the sum of two stationary components, e(t) and θ1·e(t−1); an MA process is therefore stationary for any value of its parameters, unlike an AR process.
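As a quick numerical check of this stationarity claim, here is a minimal NumPy sketch (the seed, series length, and parameter values are arbitrary illustrative choices) that simulates an MA(1) series and compares its sample mean and variance with the theoretical values u and (1 + θ1^2)·σ^2:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000
mu, theta = 5.0, 0.6                   # illustrative values
e = rng.normal(0.0, 1.0, n + 1)        # white noise e(t) with sigma^2 = 1

# MA(1): y(t) = mu + e(t) + theta * e(t-1)
y = mu + e[1:] + theta * e[:-1]

print(round(y.mean(), 2))              # close to mu = 5.0
print(round(y.var(), 2))               # close to (1 + theta**2) * 1 = 1.36
```

Whatever value θ1 takes, the series keeps a constant mean and variance; only the strength of the lag-1 dependence changes.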

**Note**: moving average *models* should not be confused with moving average *smoothing*. A moving average model is used for forecasting future values, while moving average smoothing is used for estimating the trend-cycle of past values.

**MA(1):**

A first-order moving average model, MA(1), is

y(t) = u + e(t) + θ1·e(t−1)

i.e. a linear combination of the current and the previous forecast error.

Here we assume that |θ1| < 1, so that past forecast errors carry less weight than the present one. In other words, the process is invertible and has the property that the effect of past values of the series decreases with time.

However, if |θ1| ≥ 1, we get the paradoxical situation in which the effect of past observations increases with time.

From here on, we assume that the process is invertible.
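To make invertibility concrete: inverting a zero-mean MA(1) expresses the current error as an infinite weighted sum of past observations, e(t) = y(t) − θ1·y(t−1) + θ1^2·y(t−2) − …, so the weight on y(t−j) is (−θ1)^j. A short sketch (θ1 = 0.6 is an arbitrary illustrative value):

```python
theta = 0.6                    # invertible, since |theta| < 1
# In the AR(infinity) form e(t) = sum_j (-theta)**j * y(t-j),
# the magnitude of the weight on y(t-j) is theta**j
magnitudes = [abs((-theta) ** j) for j in range(6)]
print([round(m, 3) for m in magnitudes])
# -> [1.0, 0.6, 0.36, 0.216, 0.13, 0.078]  (weights decay geometrically)
```

With |θ1| ≥ 1 the same expansion would give weights that grow with j, which is exactly the paradoxical situation described above.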

**Properties of an MA(1) Model:**

*The only non-zero value in the ACF is at lag 1*; all other autocorrelations are 0. Thus an ACF with a significant autocorrelation only at lag 1 is an indicator of a possible MA(1) model.

To elaborate: we determine the appropriate maximum lag by examining the sample autocorrelation function and finding the point beyond which it becomes insignificantly different from zero for all higher lags; that lag is designated as the maximum lag q.
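This identification rule can be sketched numerically. Below, the sample ACF of a simulated MA(1) series is compared against the approximate 95% significance band ±1.96/√n; only the lag-1 autocorrelation should stand out (pure NumPy; the seed and parameters are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n, theta = 5000, 0.7
e = rng.normal(0.0, 1.0, n + 1)
y = e[1:] + theta * e[:-1]             # zero-mean MA(1)

def sample_acf(x, nlags):
    """Sample autocorrelations r(1), ..., r(nlags)."""
    x = x - x.mean()
    c0 = np.dot(x, x)
    return np.array([np.dot(x[:-k], x[k:]) / c0 for k in range(1, nlags + 1)])

r = sample_acf(y, 5)
band = 1.96 / np.sqrt(n)               # approximate 95% band for white noise
print(np.abs(r) > band)                # lag 1 significant, later lags mostly not
```

The theoretical lag-1 autocorrelation here is θ1/(1 + θ1^2) = 0.7/1.49 ≈ 0.47, well outside the band, while higher lags hover near zero.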

**MA(q):**

An MA(q) model is written as

y(t) = u + e(t) + θ1·e(t−1) + θ2·e(t−2) + … + θq·e(t−q)

so y(t) is defined by a linear combination of the current error and the past q forecast errors.
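A general MA(q) series can be generated by filtering white noise with the coefficients (1, θ1, …, θq). Here is a sketch for a hypothetical MA(2) with θ1 = 0.5 and θ2 = −0.3 (illustrative values), using `np.convolve` as the filter:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20_000
thetas = [0.5, -0.3]                   # illustrative MA(2) coefficients
q = len(thetas)
e = rng.normal(0.0, 1.0, n + q)        # white noise, sigma^2 = 1

# y(t) = e(t) + theta_1*e(t-1) + ... + theta_q*e(t-q), as a filter over e
coeffs = np.r_[1.0, thetas]
y = np.convolve(e, coeffs, mode="valid")

# Theoretical variance of an MA(q): (1 + theta_1^2 + ... + theta_q^2) * sigma^2
print(round(y.var(), 2))               # close to 1 + 0.25 + 0.09 = 1.34
```

The variance check works because the e(t) terms are uncorrelated, so the variances of the q + 1 components simply add up.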

It's important to note that changing the parameters θ1, …, θq in an MA model results in different time series patterns. As with autoregressive models, changing the variance of the error term e(t) will only change the scale of the series, not the patterns (as shown in the graph below).
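One way to see how the parameter shapes the pattern: for an MA(1), the lag-1 autocorrelation is θ1/(1 + θ1^2), so the sign and size of θ1 directly control whether consecutive values attract or repel each other. A tiny sketch over a few illustrative values:

```python
# Lag-1 autocorrelation of an MA(1) is theta / (1 + theta**2), so
# different parameter values give noticeably different series patterns
for theta in (-0.8, 0.0, 0.8):
    print(theta, round(theta / (1 + theta ** 2), 3))
# -> -0.8 -0.488
#     0.0  0.0
#     0.8  0.488
```

A negative θ1 makes the series zigzag (successive values tend to have opposite signs around the mean), while a positive θ1 makes neighbouring values move together.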

The autocorrelation function (ACF) of an MA(1) process has the same properties as the partial autocorrelation function (PACF) of an AR(1) process: there is a first coefficient different from zero and the rest are null (as explained above).

This duality between the AR(1) and the MA(1) is also seen in the partial autocorrelation function, PACF: the PACF of an MA(1) process decays gradually toward zero, just as the ACF of an AR(1) does.