Hi, welcome to the 4th chapter of Time Series Modelling.

As of now we have covered-

How a stationary time series is one whose statistical properties remain constant over time.

What are the components of time series?

What is white noise?

After covering the underlying concepts of time series, we are finally ready to dig deep into the topic.

__AR:__

AR refers to Autoregression, where the dependent variable is explained by lagged values of the variable itself at evenly spaced time intervals. In the simplest case, with a single lag:

Y(t) = α + β·Y(t−1) + ε(t)

In the above equation the value of Y(t) is regressed on the past value of Y. In this model the explained variable of the previous time period has become the explanatory variable, and ε(t) is a white-noise error term.

The __order__ of an Autoregression is the number of lagged values of the series used to explain Y(t). Hence the above model is AR(1).

Interestingly, in the above equation-

if the slope is 0, the series is white noise (each value is just the intercept plus a random shock);

if the slope is 1 and the intercept is 0, the series is a random walk.

More generally, the equation can be of order AR(k), where Y(t) is explained by "k" lagged values of the same variable.
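These two special cases can be seen in a small simulation (a minimal NumPy sketch; the function name `simulate_ar1` and the coefficient names `alpha`/`beta` are mine, not from the text):

```python
import numpy as np

def simulate_ar1(alpha, beta, n=500, seed=0):
    """Simulate Y(t) = alpha + beta * Y(t-1) + e(t) with white-noise e(t)."""
    rng = np.random.default_rng(seed)
    e = rng.normal(0.0, 1.0, n)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = alpha + beta * y[t - 1] + e[t]
    return y

# slope = 0: each value is just the intercept plus a shock -> white noise
white = simulate_ar1(alpha=0.0, beta=0.0)

# slope = 1, intercept = 0: Y(t) = Y(t-1) + e(t) -> a random walk
walk = simulate_ar1(alpha=0.0, beta=1.0)

# White noise hovers around its mean with roughly constant variance,
# while a random walk wanders, so its sample variance is far larger.
print(white.var(), walk.var())
```

Plotting the two series makes the contrast obvious: the white-noise path stays in a band around zero, while the random walk drifts away from it.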

**ACF:**

ACF refers to the Autocorrelation function, which gives the coefficient of correlation between two values in a time series. Taking the above equation, we check the correlation between values that are "k" time periods apart.

ACF is a way to measure the linear relationship between an observation and its value "k" time periods apart. If the series is not stationary, it may first require a transformation, for example differencing.
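As a sketch, the lag-k sample autocorrelation can be computed directly (plain NumPy; the helper name `sample_acf` is my own):

```python
import numpy as np

def sample_acf(y, k):
    """Sample autocorrelation between Y(t) and Y(t-k), for k >= 1."""
    y = np.asarray(y, dtype=float)
    ybar = y.mean()
    # covariance between the series and its k-lagged copy,
    # normalized by the overall variance (the usual ACF convention)
    num = np.sum((y[k:] - ybar) * (y[:-k] - ybar))
    den = np.sum((y - ybar) ** 2)
    return num / den

# For an AR(1)-like series, the autocorrelation decays as the lag grows
rng = np.random.default_rng(1)
y = np.zeros(300)
for t in range(1, 300):
    y[t] = 0.8 * y[t - 1] + rng.normal()

print([round(sample_acf(y, k), 2) for k in (1, 2, 3)])
```

For this series the lag-1 correlation is strong and the later lags shrink gradually, which is exactly the smooth decline an ACF plot of an AR process shows.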

**PACF:**

PACF refers to the Partial autocorrelation function. It measures the correlation between Y(t) and Y(t−k) after the effect of the intermediate lags Y(t−1), …, Y(t−k+1) has been removed.

PACF is used to choose the number of lags (the order "k") that should be considered in the AR model.

In the above graph, the ACF shows the relationship of the explained variable with its lagged values, which declines smoothly.

Now, measuring the PACF on the (possibly differenced) series as mentioned above, we see the relationship of the explained variable with each lagged value once the intermediate lags are accounted for. In the right graph we see a sharp drop from lag Y(t−2) onwards. This gives us the order by which the explained variable should be defined (lag Y(t−1), i.e. AR(1), in this case) and tells us that lags Y(t−2) and beyond add no extra explanatory power for Y(t).
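One common way to compute the partial autocorrelation at lag k is to regress Y(t) on lags 1 through k and take the coefficient on the deepest lag (a sketch using NumPy least squares; the helper name `pacf_at_lag` is mine):

```python
import numpy as np

def pacf_at_lag(y, k):
    """PACF at lag k: the coefficient of Y(t-k) in a regression of Y(t)
    on Y(t-1), ..., Y(t-k), so the intermediate lags are controlled for."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    target = y[k:]  # Y(t) for t = k .. n-1
    # design matrix: an intercept column, then lag 1, lag 2, ..., lag k
    X = np.column_stack(
        [np.ones(n - k)] + [y[k - j : n - j] for j in range(1, k + 1)]
    )
    coef, *_ = np.linalg.lstsq(X, target, rcond=None)
    return coef[-1]  # coefficient on Y(t-k)

# For an AR(1) series the PACF is large at lag 1 and near zero beyond it:
# the sharp cut-off that identifies the order of the model.
rng = np.random.default_rng(2)
y = np.zeros(400)
for t in range(1, 400):
    y[t] = 0.7 * y[t - 1] + rng.normal()

print(round(pacf_at_lag(y, 1), 2), round(pacf_at_lag(y, 2), 2))
```

The lag-1 value sits close to the true coefficient, while lag 2 and beyond hover near zero, mirroring the sharp decline described in the graph above.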

[…] As of now we have covered the statistical properties, the components of time series, white noise and the first block of time series modelling, i.e. Autoregression […]