 StatLect

# Heteroskedasticity-robust standard errors

In linear regression analysis, an estimator of the asymptotic covariance matrix of the OLS estimator is said to be heteroskedasticity-robust if it is consistent for the true covariance matrix even when the variance of the errors of the regression is not constant.

In this case, the standard errors, which are equal to the square roots of the diagonal entries of the covariance matrix, are also said to be heteroskedasticity-robust.

## The linear regression

Consider the linear regression model
$$y_i = x_i \beta + \varepsilon_i$$
where:

• $y_i$ is the dependent variable;

• $x_i$ is the $1 \times K$ vector of regressors;

• $\beta$ is the $K \times 1$ vector of regression coefficients;

• $\varepsilon_i$ is the zero-mean error term.

## Sample

There are $N$ observations in the sample, $(y_i, x_i)$ for $i = 1, \ldots, N$.

## The OLS estimator

The ordinary least squares (OLS) estimator of $\beta$ can be written as
$$\widehat{\beta} = \left( \sum_{i=1}^{N} x_i^\top x_i \right)^{-1} \sum_{i=1}^{N} x_i^\top y_i$$

## Asymptotic covariance matrix

Under appropriate conditions, the OLS estimator is asymptotically normal:
$$\sqrt{N} \left( \widehat{\beta} - \beta \right) \xrightarrow{d} N(0, V)$$
where:

• $\xrightarrow{d}$ denotes convergence in distribution;

• $V$ is the asymptotic covariance matrix of the OLS estimator;

• $N(0, V)$ denotes a multivariate normal distribution with mean vector equal to $0$ and covariance matrix equal to $V$.

## Standard errors

The standard errors are the estimates of the standard deviations of the entries of $\widehat{\beta}$.

Denote by $\widehat{V}$ an estimator of $V$.

Then, the covariance matrix of $\widehat{\beta}$ is approximated by
$$\frac{1}{N} \widehat{V}$$
and the standard errors are equal to the square roots of the diagonal entries of the latter matrix.
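As a small numerical sketch (the covariance matrix and sample size below are hypothetical, not taken from the text), the last step amounts to scaling the estimated asymptotic covariance matrix by the sample size and taking square roots of the diagonal:

```python
import numpy as np

# Hypothetical estimate V_hat of the asymptotic covariance matrix (K = 2)
V_hat = np.array([[4.0, 1.0],
                  [1.0, 9.0]])
N = 100  # hypothetical sample size

# The covariance matrix of the OLS estimator is approximated by V_hat / N
cov_beta_hat = V_hat / N

# Standard errors: square roots of the diagonal entries
standard_errors = np.sqrt(np.diag(cov_beta_hat))
print(standard_errors)  # → [0.2 0.3]
```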

## Heteroskedasticity

The errors of the regression are said to be conditionally homoskedastic if their variance is constant:
$$E\left[\varepsilon_i^2 \mid x_i\right] = \sigma^2$$
where $\sigma^2$ is a constant.

If the conditional variance is not constant, the errors are said to be conditionally heteroskedastic, and the regression is said to be affected by heteroskedasticity.
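As an illustration (a hypothetical simulation, not part of the original text), we can generate data in which the conditional variance of the error grows with the regressor, so the errors are conditionally heteroskedastic:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000

# One random regressor
x = rng.uniform(1.0, 5.0, size=N)

# Conditionally heteroskedastic errors: Var(eps_i | x_i) = x_i^2
eps = rng.normal(0.0, 1.0, size=N) * x

# Dependent variable from the linear model y_i = 2 + 3 x_i + eps_i
y = 2.0 + 3.0 * x + eps

# The sample variance of the errors differs across regions of x,
# which reveals the heteroskedasticity
low, high = eps[x < 3.0], eps[x >= 3.0]
print(low.var(), high.var())  # variance is larger where x is larger
```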

## Robustness to heteroskedasticity

An estimator of the asymptotic covariance matrix is heteroskedasticity-robust if it is consistent even when the errors are conditionally heteroskedastic.

Consistent means that
$$\widehat{V} \xrightarrow{p} V$$
where $\xrightarrow{p}$ denotes convergence in probability.

## Formula for the asymptotic covariance

Under mild technical conditions, the asymptotic covariance matrix is
$$V = E\left[x_i^\top x_i\right]^{-1} \Omega \, E\left[x_i^\top x_i\right]^{-1}$$
where $\Omega$ is the so-called long-run covariance matrix of the sequence $\{x_i^\top \varepsilon_i\}$.

## Consistent estimators

The matrix $E\left[x_i^\top x_i\right]$ is consistently estimated by
$$\frac{1}{N} \sum_{i=1}^{N} x_i^\top x_i$$
Therefore, by the Continuous Mapping Theorem, if we can find a consistent estimator $\widehat{\Omega}$ of $\Omega$, then the asymptotic covariance matrix is consistently estimated by
$$\widehat{V} = \left( \frac{1}{N} \sum_{i=1}^{N} x_i^\top x_i \right)^{-1} \widehat{\Omega} \left( \frac{1}{N} \sum_{i=1}^{N} x_i^\top x_i \right)^{-1}$$

## A simpler formula for the long-run covariance

Suppose that the following assumptions about the sequence $\{x_i^\top \varepsilon_i\}$ hold:

1. zero mean: $E\left[x_i^\top \varepsilon_i\right] = 0$;

2. no serial correlation: $E\left[\varepsilon_i \varepsilon_j x_i^\top x_j\right] = 0$ if $i \neq j$;

3. weak stationarity: $E\left[\varepsilon_i^2 x_i^\top x_i\right]$ does not depend on $i$.

Then, the long-run covariance matrix can be written as
$$\Omega = E\left[\varepsilon_i^2 x_i^\top x_i\right]$$

Proof

The long-run covariance matrix is defined as
$$\Omega = \lim_{N \rightarrow \infty} \mathrm{Var}\left[ \frac{1}{\sqrt{N}} \sum_{i=1}^{N} x_i^\top \varepsilon_i \right]$$
By the zero-mean assumption, this variance equals
$$\Omega = \lim_{N \rightarrow \infty} \frac{1}{N} \sum_{i=1}^{N} \sum_{j=1}^{N} E\left[\varepsilon_i \varepsilon_j x_i^\top x_j\right]$$
By the no-serial-correlation assumption, the terms with $i \neq j$ vanish:
$$\Omega = \lim_{N \rightarrow \infty} \frac{1}{N} \sum_{i=1}^{N} E\left[\varepsilon_i^2 x_i^\top x_i\right]$$
By weak stationarity, all the terms of the sum are equal, so that
$$\Omega = E\left[\varepsilon_i^2 x_i^\top x_i\right]$$
Note that the zero-mean assumption is the same as the orthogonality assumption usually needed to prove the consistency of the OLS estimator.

## Estimator of the long-run covariance

Under mild technical conditions, the long-run covariance matrix is consistently estimated by
$$\widehat{\Omega} = \frac{1}{N} \sum_{i=1}^{N} \widehat{\varepsilon}_i^2 x_i^\top x_i$$
where the residuals are defined as
$$\widehat{\varepsilon}_i = y_i - x_i \widehat{\beta}$$

Proof

The sample average
$$\frac{1}{N} \sum_{i=1}^{N} \varepsilon_i^2 x_i^\top x_i$$
converges in probability to $E\left[\varepsilon_i^2 x_i^\top x_i\right]$ if the sequence $\{\varepsilon_i^2 x_i^\top x_i\}$ satisfies the conditions of a Law of Large Numbers (the mild technical conditions mentioned above). The errors $\varepsilon_i$ in the last formula can be replaced by the residuals $\widehat{\varepsilon}_i$, as the latter converge in probability to the former when the sample size increases.
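The plug-in estimator above can be sketched numerically. In this hypothetical simulation (data-generating process chosen for illustration), we compare the feasible estimate based on the residuals with the infeasible one based on the true errors; for a large sample the two are close:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 5000

# Design matrix X with a constant and one regressor; rows are the x_i
X = np.column_stack([np.ones(N), rng.uniform(1.0, 5.0, size=N)])
beta = np.array([2.0, 3.0])
eps = rng.normal(size=N) * X[:, 1]          # heteroskedastic errors
y = X @ beta + eps

# OLS estimate and residuals eps_hat_i = y_i - x_i beta_hat
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta_hat

# Omega_hat = (1/N) * sum_i eps_hat_i^2 x_i' x_i
Omega_hat = (X * resid[:, None] ** 2).T @ X / N

# Infeasible version using the true errors
Omega_true_errors = (X * eps[:, None] ** 2).T @ X / N
print(np.max(np.abs(Omega_hat - Omega_true_errors)))
```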

## The robust estimator

If we plug the formula for $\widehat{\Omega}$ into the expression we had previously derived for the estimator of the asymptotic covariance matrix, we obtain:
$$\widehat{V} = \left( \frac{1}{N} \sum_{i=1}^{N} x_i^\top x_i \right)^{-1} \left( \frac{1}{N} \sum_{i=1}^{N} \widehat{\varepsilon}_i^2 x_i^\top x_i \right) \left( \frac{1}{N} \sum_{i=1}^{N} x_i^\top x_i \right)^{-1}$$
This estimator is robust to heteroskedasticity.

As a matter of fact, we did not assume homoskedasticity to prove its consistency.

The square roots of the diagonal entries of the matrix $\frac{1}{N} \widehat{V}$ are known as heteroskedasticity-robust standard errors.
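Putting the pieces together, here is a hypothetical end-to-end sketch in numpy (the simulated data are for illustration only): OLS, residuals, the sandwich estimator $\widehat{V}$, and the robust standard errors. This estimator corresponds to what is often labelled HC0 in software.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 500

# Simulated data with conditionally heteroskedastic errors
X = np.column_stack([np.ones(N), rng.uniform(1.0, 5.0, size=N)])
y = X @ np.array([2.0, 3.0]) + rng.normal(size=N) * X[:, 1]

# OLS estimate: beta_hat = (sum x_i' x_i)^(-1) sum x_i' y_i
XtX = X.T @ X
beta_hat = np.linalg.solve(XtX, X.T @ y)
resid = y - X @ beta_hat

# Sandwich estimator V_hat = Q_hat^(-1) Omega_hat Q_hat^(-1)
Q_hat = XtX / N
Omega_hat = (X * resid[:, None] ** 2).T @ X / N
Q_inv = np.linalg.inv(Q_hat)
V_hat = Q_inv @ Omega_hat @ Q_inv

# Robust standard errors: square roots of the diagonal of V_hat / N
robust_se = np.sqrt(np.diag(V_hat / N))
print(robust_se)
```

Algebraically, $\widehat{V}/N$ here is identical to the compact matrix expression $(X^\top X)^{-1} X^\top \widehat{\Sigma} X (X^\top X)^{-1}$ with $\widehat{\Sigma}$ the diagonal matrix of squared residuals.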

## Matrix form

Using matrix notation, we can write the expression above in a more compact form.

Define the vectors and matrices
$$y = \begin{bmatrix} y_1 \\ \vdots \\ y_N \end{bmatrix}, \qquad X = \begin{bmatrix} x_1 \\ \vdots \\ x_N \end{bmatrix}, \qquad \widehat{\Sigma} = \begin{bmatrix} \widehat{\varepsilon}_1^2 & 0 & \cdots & 0 \\ 0 & \widehat{\varepsilon}_2^2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \widehat{\varepsilon}_N^2 \end{bmatrix}$$
Then, the heteroskedasticity-robust covariance matrix of $\widehat{\beta}$ is
$$\frac{1}{N} \widehat{V} = \left( X^\top X \right)^{-1} X^\top \widehat{\Sigma} X \left( X^\top X \right)^{-1}$$

## Non-robust estimator

Compare the formulae above with those for the non-robust estimator
$$\widehat{V} = \widehat{\sigma}^2 \left( \frac{1}{N} \sum_{i=1}^{N} x_i^\top x_i \right)^{-1}$$
where
$$\widehat{\sigma}^2 = \frac{1}{N} \sum_{i=1}^{N} \widehat{\varepsilon}_i^2$$
This estimator is not robust to heteroskedasticity.

In fact, in order to prove its consistency, we need to assume conditional homoskedasticity, that is,
$$E\left[\varepsilon_i^2 \mid x_i\right] = \sigma^2$$
for every $i$, with $\sigma^2$ constant.

Proof

Under the hypothesis of homoskedasticity, we can write the long-run covariance matrix as follows:
$$\Omega = E\left[\varepsilon_i^2 x_i^\top x_i\right] = E\left[ E\left[\varepsilon_i^2 \mid x_i\right] x_i^\top x_i \right] = \sigma^2 E\left[x_i^\top x_i\right]$$
which is consistently estimated by
$$\widehat{\Omega} = \widehat{\sigma}^2 \frac{1}{N} \sum_{i=1}^{N} x_i^\top x_i$$
The estimator of the asymptotic covariance matrix becomes:
$$\widehat{V} = \left( \frac{1}{N} \sum_{i=1}^{N} x_i^\top x_i \right)^{-1} \widehat{\Omega} \left( \frac{1}{N} \sum_{i=1}^{N} x_i^\top x_i \right)^{-1} = \widehat{\sigma}^2 \left( \frac{1}{N} \sum_{i=1}^{N} x_i^\top x_i \right)^{-1}$$
Hence, the estimator of the covariance matrix of $\widehat{\beta}$ is
$$\frac{1}{N} \widehat{V} = \widehat{\sigma}^2 \left( X^\top X \right)^{-1}$$

## Synonyms

Heteroskedasticity-robust standard errors go by many different names:

1. heteroskedasticity-consistent standard errors;

2. Eicker-Huber-White standard errors;

3. Huber-White standard errors;

4. White standard errors.

## More details

More mathematical details and proofs of the facts stated above can be found in the lecture on the properties of the OLS estimator.
