Assumptions of Linear Regression- Autocorrelation


The absence of autocorrelation is one of the most important assumptions of Linear Regression. The dependent variable ‘y’ is said to be autocorrelated when the current value of ‘y’ depends on its previous value. In such cases the R-Square (which tells us how well our model is performing) becomes unreliable.

We build a model so that, given our independent variables, we can predict how the dependent variable will move. But if the current month’s ‘y’ depends on the previous month’s ‘y’, then our model becomes non-robust.

The null hypothesis H0 states that the dependent variable ‘y’ is not linearly autocorrelated. The Durbin-Watson test is used to check for autocorrelation in the dependent variable.

The Durbin-Watson statistic is computed from the regression residuals e_t as:

DW = Σ (e_t − e_{t−1})² / Σ e_t²,  summing the numerator over t = 2…n and the denominator over t = 1…n.

It’s important to note that the DW test statistic takes values between 0 and 4, since the autocorrelation coefficient only takes values between -1 and 1 (approximately, DW ≈ 2(1 − ρ)). So when the autocorrelation coefficient equals 0, DW equals 2. If DW > 2 we have negative autocorrelation, and if DW < 2 we have positive autocorrelation.

Although we can use higher-order DW tests, only the first-order DW test is usually important; orders 2, 3, and 4 can generally be ignored. (In SAS PROC AUTOREG, the DW= option specifies the highest order to compute.)

The ‘DWPROB’ option gives us the p-values of the DW tests from order 1 to 4.

However, there may be cases where the DW test statistic fails; in those cases we use the Cochrane-Orcutt procedure to remove the autocorrelation.


