In the theory of point estimation, a loss function quantifies the losses associated with the errors committed while estimating a parameter. Often the expected value of the loss, called statistical risk, is used to compare two or more estimators: in such comparisons, the estimator having the least expected loss is usually deemed preferable.
The following is a possible definition.
Definition
Let $\theta$ be an unknown parameter and $\widehat{\theta}$ an estimate of $\theta$. The estimation error is the difference $$e = \widehat{\theta} - \theta.$$ The loss function is a function $L$ mapping estimation errors to the set of real numbers.
Let $\left\Vert \cdot \right\Vert$ denote the Euclidean norm. Commonly used loss functions are:

the absolute estimation error $$L(e) = \left\Vert e \right\Vert,$$ which coincides with the absolute value of the error when the parameter is a scalar;

the squared estimation error $$L(e) = \left\Vert e \right\Vert^{2},$$ which coincides with the square of the error when the parameter is a scalar.
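The two loss functions above can be sketched numerically. The following is a minimal illustration, with a hypothetical parameter vector and estimate chosen for the example:

```python
import numpy as np

# Hypothetical true parameter and its estimate (values are illustrative)
theta = np.array([1.0, 2.0])
theta_hat = np.array([1.5, 1.5])

# Estimation error
e = theta_hat - theta

# Absolute estimation error: Euclidean norm of the error
absolute_loss = np.linalg.norm(e)

# Squared estimation error: squared Euclidean norm of the error
squared_loss = np.linalg.norm(e) ** 2

print(absolute_loss)  # about 0.7071
print(squared_loss)   # about 0.5
```

When the parameter is a scalar, `np.linalg.norm` reduces to the absolute value, so the two losses reduce to $|e|$ and $e^{2}$, as stated above.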
In both cases, the larger the estimation error is, the larger is the loss. The expected value of the former is called mean absolute error (MAE), while the expectation of the latter is known as mean squared error (MSE).
Loss functions, estimation errors and statistical risk are explained in more detail in the lecture entitled Point estimation.