Another assumption of linear regression is 'Homoscedasticity'. In a linear model, the variation in the dependent variable 'y' is captured by the error term, which should be independently and identically distributed.

It means that the variation in the dependent variable 'y' (i.e. the error term) should be constant across all values of the independent variables 'X'. This is checked by plotting the residuals against the predicted values of 'y' computed from the independent variables: the spread of the residuals should look roughly the same everywhere.
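As a rough sketch of this check, the snippet below fits a simple least-squares line to synthetic homoscedastic data (the data, coefficients, and noise level are illustrative assumptions, not from the text) and compares the residual spread at low versus high predicted values. In practice you would plot residuals against fitted values, but the numeric comparison conveys the same idea:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: noise has the same standard deviation (5) everywhere,
# so the model is homoscedastic by construction.
X = rng.uniform(10, 100, size=200)
y = 0.8 * X + rng.normal(0, 5, size=200)

# Fit a simple linear regression with ordinary least squares.
A = np.column_stack([np.ones_like(X), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
fitted = A @ beta
residuals = y - fitted

# Diagnostic: residual spread should be similar for low and high fitted values.
low = residuals[fitted < np.median(fitted)]
high = residuals[fitted >= np.median(fitted)]
print(low.std(), high.std())  # should be of similar magnitude
```

If the two spreads differ markedly (for instance, the second is several times the first), that is a sign the constant-variance assumption is violated.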

Example: the variation in expenditure stays the same as income increases.

The opposite of Homoscedasticity is 'Heteroscedasticity', which means that the variation in 'y' (i.e. the error term) is not constant across the values of the independent variables 'x'. This makes the coefficient estimates inefficient, and they are no longer the Best Linear Unbiased Estimator (BLUE).

Example: the variation in expenditure does not stay the same as income increases.

| Income | Expenditure | Absolute Variation | Relative Variation | Log Variation |
|--------|-------------|--------------------|--------------------|---------------|
| 100    | 80          |                    |                    |               |
| 100    | 70          | -10                | -13%               | 0.13          |
| 100    | 85          | 5                  | 6%                 | -0.06         |
| 1000   | 800         |                    |                    |               |
| 1000   | 700         | -100               | -13%               | 0.13          |
| 1000   | 850         | 50                 | 6%                 | -0.06         |
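The table's columns can be reproduced with a few lines of arithmetic. The sketch below assumes the variation is measured against the first (baseline) expenditure at each income level, and that the log column uses log(baseline / current), which is the sign convention the table's values appear to follow:

```python
import math

# Income -> expenditure observations from the table; the first entry
# at each income level is taken as the baseline.
groups = {100: [80, 70, 85], 1000: [800, 700, 850]}

for income, spend in groups.items():
    base = spend[0]
    for s in spend[1:]:
        absolute = s - base           # e.g. 70 - 80 = -10
        relative = absolute / base    # e.g. -10 / 80 = -12.5% (table rounds to -13%)
        log_var = math.log(base / s)  # e.g. log(80 / 70) = 0.13
        print(income, s, absolute, f"{relative:.1%}", round(log_var, 2))
```

Note how the absolute variation grows tenfold with income while the relative and log variations are identical at both income levels.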

A log transformation tends to keep the variation in the dependent variable 'y' constant.
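To see this with the expenditure figures from the table above: the raw spread is ten times larger at the higher income level, but because each high-income value is exactly 10x its low-income counterpart, the spread of the log-transformed values is identical. A minimal sketch:

```python
import math

# Expenditure at two income levels, taken from the table above.
low_income_spend = [80, 70, 85]      # income = 100
high_income_spend = [800, 700, 850]  # income = 1000

# Raw spread grows with income ...
raw_low = max(low_income_spend) - min(low_income_spend)     # 15
raw_high = max(high_income_spend) - min(high_income_spend)  # 150

# ... but after a log transformation the spread is the same at both levels,
# since multiplying every value by 10 only shifts the logs by a constant.
log_low = [math.log(s) for s in low_income_spend]
log_high = [math.log(s) for s in high_income_spend]
spread_low = max(log_low) - min(log_low)
spread_high = max(log_high) - min(log_high)

print(raw_low, raw_high)                            # 15 150
print(round(spread_low, 4), round(spread_high, 4))  # equal spreads
```

This is why modelling log(y) rather than y is a common remedy when the error variance grows proportionally with the level of 'y'.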
