
Estimation methods

In the lecture entitled Point estimation we defined the concept of an estimator and discussed criteria for evaluating estimators, but we did not discuss methods for deriving them. This lecture presents general techniques that can be used to derive parameter estimators in a parametric estimation problem.

Before starting, let us recall the main elements of a parametric estimation problem: we observe a sample $\xi$, which is regarded as a realization of a random vector $\Xi$; the joint distribution of $\Xi$ is assumed to belong to a parametric family indexed by a parameter $\theta$; and an estimator $\widehat{\theta}$ is a function of the sample $\xi$ used to produce a guess of $\theta$.

Extremum estimators

Several widely employed estimators fall within the class of extremum estimators. An estimator $\widehat{\theta}$ is an extremum estimator if it can be represented as the solution of a maximization problem:
$$\widehat{\theta} = \arg\max_{\theta \in \Theta} Q(\theta, \xi)$$
where $Q$ is a function of both the parameter $\theta$ and the sample $\xi$.

General conditions can be derived for the consistency and asymptotic normality of extremum estimators. We do not discuss them here (see, e.g., Hayashi, F. (2000) Econometrics, Princeton University Press); instead, we give some examples of extremum estimators and refer the reader to the lectures that describe these examples in more detail.
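
To make the definition concrete, the following minimal Python sketch (ours, not part of the original lecture) computes an extremum estimator by maximizing an objective $Q$ over a grid of parameter values. The choice of objective, a Gaussian log-likelihood for the mean with known unit variance, is purely illustrative.

```python
import numpy as np

# Illustrative sample xi: n draws from a normal distribution with unknown mean.
rng = np.random.default_rng(0)
xi = rng.normal(loc=2.0, scale=1.0, size=500)

def Q(theta, xi):
    # Assumed objective: Gaussian log-likelihood of the sample as a
    # function of the mean parameter theta (variance fixed at 1),
    # up to an additive constant that does not affect the argmax.
    return -0.5 * np.sum((xi - theta) ** 2)

# Extremum estimator: the parameter value that maximizes Q over a grid.
grid = np.linspace(-5.0, 5.0, 10001)
theta_hat = grid[np.argmax([Q(t, xi) for t in grid])]
print(theta_hat)  # close to the sample mean, here roughly 2
```

In practice the maximization is usually carried out with a numerical optimizer rather than a grid search, as in the sketches that follow.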

Maximum likelihood

In maximum likelihood estimation, we maximize the likelihood of the sample:
$$\widehat{\theta} = \arg\max_{\theta \in \Theta} L(\theta; \xi)$$
where:

  1. if $\Xi$ is discrete, the likelihood $L(\theta; \xi)$ is the joint probability mass function of $\Xi$ associated with the distribution that corresponds to the parameter $\theta$;

  2. if $\Xi$ is absolutely continuous, the likelihood $L(\theta; \xi)$ is the joint probability density function of $\Xi$ associated with the distribution that corresponds to the parameter $\theta$.

$\widehat{\theta}$ is called the maximum likelihood estimator of $\theta$.

Maximum likelihood estimation is discussed in more detail in the lecture entitled Maximum Likelihood.
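
As a worked illustration (ours, not taken from that lecture), the sketch below computes the maximum likelihood estimator of the rate of an exponential distribution by numerically maximizing the log-likelihood, i.e. by minimizing its negative with scipy.optimize.minimize_scalar; the known closed form, one over the sample mean, serves as a check.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative sample: i.i.d. exponential draws with true rate lambda = 3.
rng = np.random.default_rng(1)
xi = rng.exponential(scale=1 / 3.0, size=1000)

def neg_log_likelihood(lam, xi):
    # The joint density of an i.i.d. exponential sample is
    # prod_i lam * exp(-lam * x_i), so the log-likelihood is
    # n * log(lam) - lam * sum(x_i); we return its negative.
    return -(len(xi) * np.log(lam) - lam * np.sum(xi))

res = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 100.0),
                      args=(xi,), method="bounded")
print(res.x, 1 / xi.mean())  # numerical MLE vs. closed-form 1 / sample mean
```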

Generalized method of moments

In generalized method of moments (GMM) estimation, the distributions associated with the parameters $\theta$ are such that they satisfy the moment condition
$$\mathrm{E}_{\theta}\left[ g(\Xi, \theta) \right] = 0$$
where $g(\Xi, \theta)$ is a (vector) function and $\mathrm{E}_{\theta}$ indicates that the expected value is computed using the distribution associated with $\theta$. The GMM estimator $\widehat{\theta}$ is obtained as
$$\widehat{\theta} = \arg\min_{\theta \in \Theta} d\left( g(\xi, \theta) \right)$$
where $d\left( g(\xi, \theta) \right)$ is a measure of the distance of $g(\xi, \theta)$ from its expected value of $0$. The estimator is an extremum estimator because
$$\widehat{\theta} = \arg\max_{\theta \in \Theta} \left[ -d\left( g(\xi, \theta) \right) \right]$$
so that it maximizes $Q(\theta, \xi) = -d\left( g(\xi, \theta) \right)$.
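
As an illustration (ours, not the lecture's own example), the sketch below estimates the two parameters of a gamma distribution by matching its first two moments: $g$ collects the differences between the sample moments and the moments implied by $\theta$, and $d$ is taken to be the squared Euclidean norm.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative sample: gamma draws with true shape k = 2 and scale s = 1.5.
rng = np.random.default_rng(2)
xi = rng.gamma(shape=2.0, scale=1.5, size=2000)

def g(theta, xi):
    # Moment conditions for a gamma(k, s) distribution:
    # E[X] = k * s and E[X**2] = k * (k + 1) * s**2,
    # compared with the corresponding sample moments.
    k, s = theta
    return np.array([xi.mean() - k * s,
                     (xi ** 2).mean() - k * (k + 1) * s ** 2])

def d(theta, xi):
    # Distance of g from its expected value of 0 (squared Euclidean norm).
    return np.sum(g(theta, xi) ** 2)

res = minimize(d, x0=[1.0, 1.0], args=(xi,), method="Nelder-Mead")
print(res.x)  # roughly [2, 1.5]
```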

Least squares

In least squares estimation, the sample $\xi$ comprises $n$ realizations $y_{1}, \ldots, y_{n}$ of random variables $Y_{1}, \ldots, Y_{n}$, called dependent variables, and $n$ observations $x_{1}, \ldots, x_{n}$ of random vectors $X_{1}, \ldots, X_{n}$, whose components are called independent variables. It is postulated that there exists a function $f(x_i, \theta)$ such that
$$\mathrm{E}\left[ Y_i \mid X_i = x_i \right] = f(x_i, \theta)$$

The least squares estimator $\widehat{\theta}$ is obtained as
$$\widehat{\theta} = \arg\min_{\theta \in \Theta} \sum_{i=1}^{n} \left( y_i - f(x_i, \theta) \right)^2$$

The estimator is an extremum estimator because
$$\widehat{\theta} = \arg\max_{\theta \in \Theta} \left[ -\sum_{i=1}^{n} \left( y_i - f(x_i, \theta) \right)^2 \right]$$
so that it maximizes $Q(\theta, \xi) = -\sum_{i=1}^{n} \left( y_i - f(x_i, \theta) \right)^2$.
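
A minimal numerical sketch (ours, with an assumed linear $f$) illustrates the equivalence: minimizing the sum of squared residuals is the same as maximizing $Q(\theta, \xi) = -\sum_{i=1}^{n} (y_i - f(x_i, \theta))^2$.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative data generated from the assumed model f(x, theta) = a + b * x.
rng = np.random.default_rng(3)
n = 200
x = rng.uniform(0.0, 10.0, size=n)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=n)  # true a = 1, b = 2

def sum_of_squares(theta, x, y):
    # Objective of the least squares problem: the sum of squared residuals.
    a, b = theta
    return np.sum((y - (a + b * x)) ** 2)

res = minimize(sum_of_squares, x0=[0.0, 0.0], args=(x, y), method="Nelder-Mead")
print(res.x)  # roughly [1, 2], the ordinary least squares solution
```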
