Heteroskedasticity is the violation of the assumption, commonly made in linear regression models, that all the errors of the regression have the same variance.

Consider the linear regression $y_i = x_i \beta + \varepsilon_i$, where $y_i$ is the regressand, $x_i$ is the $1\times K$ vector of regressors, $\beta$ is the $K\times 1$ vector of regression coefficients, $\varepsilon_i$ is the error term and the observations are indexed by $i = 1, \ldots, N$.

Sometimes we make the assumption that all the error terms have the same variance, that is, $\operatorname{Var}[\varepsilon_i] = \sigma^2$ for all $i = 1, \ldots, N$.

When this assumption is met, we say that the errors are homoskedastic (or homoscedastic).

By contrast, when the errors pertaining to different observations do not have the same variance, the assumption is violated and the errors are said to be heteroskedastic (or heteroscedastic). In this case, we also say that the regression suffers from heteroskedasticity.
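The distinction can be illustrated with a small simulation. In this sketch (the data-generating process is an assumption made for illustration, not taken from the text), homoskedastic errors are drawn with a constant standard deviation, while heteroskedastic errors have a standard deviation that grows with the regressor, so their variance differs across observations:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
x = rng.uniform(1.0, 5.0, size=n)

# Homoskedastic errors: the same variance for every observation.
eps_homo = rng.normal(0.0, 1.0, size=n)

# Heteroskedastic errors: the standard deviation grows with the
# regressor, so Var(eps_i) differs across observations.
eps_hetero = rng.normal(0.0, 1.0, size=n) * x

# A regression whose errors are heteroskedastic.
y = 2.0 + 3.0 * x + eps_hetero
```

Plotting `y` against `x` for the heteroskedastic case would show the familiar "fanning out" of the scatter as `x` increases.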

We often make an assumption stronger than homoskedasticity, called conditional homoskedasticity: $\operatorname{Var}[\varepsilon_i \mid X] = \sigma^2$ for all $i = 1, \ldots, N$, where $X$ is the design matrix (i.e., the matrix whose rows are the vectors of regressors $x_i$ for $i = 1, \ldots, N$).

In other words, we assume that the variance of the errors is constant conditional on the design matrix.

When this assumption is violated, the regression is said to suffer from conditional heteroskedasticity.

Conditional homoskedasticity is one of the assumptions of the Gauss-Markov theorem, which states that under certain conditions the OLS estimator is the best linear unbiased estimator (BLUE) of the vector of regression coefficients. Therefore, OLS is not guaranteed to be BLUE when a regression suffers from heteroskedasticity.

If all the conditions of the Gauss-Markov theorem except homoskedasticity are met, OLS is not the BLUE estimator, but the weighted least squares estimator (a special case of the generalized least squares estimator) is.
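When the error variances are known up to scale, weighted least squares amounts to running OLS on data re-scaled by the inverse standard deviations. A minimal numpy sketch (the data-generating process and the assumption that the standard deviations are known are illustrative, not from the source):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2000
x = rng.uniform(1.0, 5.0, size=n)
X = np.column_stack([np.ones(n), x])   # design matrix with intercept
beta_true = np.array([2.0, 3.0])
sigma_i = x                            # assumed known std dev, grows with x
y = X @ beta_true + rng.normal(0.0, sigma_i)

# OLS: solve min ||y - X b||^2. Still unbiased, but no longer BLUE.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# WLS: weight each observation by 1/sigma_i, i.e. run OLS on the
# transformed data (x_i / sigma_i, y_i / sigma_i), whose errors are
# homoskedastic by construction.
w = 1.0 / sigma_i
beta_wls, *_ = np.linalg.lstsq(X * w[:, None], y * w, rcond=None)
```

Both estimators recover the true coefficients on average; the gain from WLS is efficiency, i.e., a smaller sampling variance.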

There are numerous statistical tests that can be used to detect heteroskedasticity, for example, the Goldfeld-Quandt, Breusch-Pagan and White tests. Most statistical packages have implementations of these tests. For more information about these tests, you can refer, for example, to Greene (2017) and Gujarati (2017).
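To illustrate one of these, here is a from-scratch sketch of the studentized (Koenker) version of the Breusch-Pagan test, whose LM statistic is $N \cdot R^2$ from a regression of the squared OLS residuals on the regressors; under the null of homoskedasticity it is asymptotically chi-square with $K-1$ degrees of freedom. The data-generating process is an assumption for illustration; in practice one would use a packaged implementation:

```python
import numpy as np

def breusch_pagan_lm(y, X):
    """Koenker's studentized Breusch-Pagan LM statistic: regress the
    squared OLS residuals on X and return n * R^2 of that auxiliary
    regression (asymptotically chi-square with K - 1 d.o.f. under the
    null of homoskedasticity, K = number of columns of X)."""
    n = len(y)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    u2 = (y - X @ beta) ** 2               # squared OLS residuals
    gamma, *_ = np.linalg.lstsq(X, u2, rcond=None)
    fitted = X @ gamma
    r2 = 1.0 - np.sum((u2 - fitted) ** 2) / np.sum((u2 - u2.mean()) ** 2)
    return n * r2

rng = np.random.default_rng(1)
n = 500
x = rng.uniform(1.0, 5.0, size=n)
X = np.column_stack([np.ones(n), x])
y_hetero = 2.0 + 3.0 * x + rng.normal(0.0, x)        # variance grows with x
y_homo = 2.0 + 3.0 * x + rng.normal(0.0, 1.0, n)     # constant variance

lm_hetero = breusch_pagan_lm(y_hetero, X)
lm_homo = breusch_pagan_lm(y_homo, X)
# The chi-square(1) critical value at the 5% level is about 3.84, so a
# statistic above it rejects homoskedasticity.
```

With one non-constant regressor, a large `lm_hetero` (well above 3.84) rejects homoskedasticity for the heteroskedastic sample, while `lm_homo` should typically stay below the critical value.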

The mathematical details are discussed in the lecture on the generalized least squares estimator.

Greene, W.H. (2017) Econometric analysis, 8th edition, Pearson.

Gujarati, D.N. (2017) Basic econometrics, 5th edition, McGraw-Hill.
