The mean squared error (MSE) of an estimator measures the expected loss generated by the estimator's estimation errors.
On this page:
we briefly review some concepts that are essential to understanding the MSE;
we provide a definition of MSE;
we derive the decomposition of the MSE into bias and variance.
We will always assume, unless stated otherwise, that the parameter to be estimated is a vector.
Let $\theta$ be an unknown parameter to be estimated.
An estimator of $\theta$, denoted by $\widehat{\theta}$, is a pre-defined rule that produces an estimate of $\theta$ for each possible sample we can observe.
In other words, $\widehat{\theta}$ is a random variable, influenced by sampling variability, whose realizations are equal to the estimates of $\theta$.
A loss function $L(\theta,\widehat{\theta})$ is a function that quantifies the losses generated by the estimation errors $\widehat{\theta}-\theta$.
Since the estimator is random, we can compute the expected value of the loss
$$R(\widehat{\theta}) = \mathrm{E}\big[L(\theta,\widehat{\theta})\big]$$
which is called the statistical risk.
When the squared error is used as the loss function, the statistical risk is called the mean squared error.
Definition Let $\widehat{\theta}$ be an estimator of an unknown parameter $\theta$. When the squared error
$$L(\theta,\widehat{\theta}) = \big\|\widehat{\theta}-\theta\big\|^{2}$$
is used as a loss function, then the risk
$$\mathrm{MSE}(\widehat{\theta}) = \mathrm{E}\big[\|\widehat{\theta}-\theta\|^{2}\big]$$
is called the mean squared error of the estimator $\widehat{\theta}$.
In this definition, $\|\cdot\|$ is the Euclidean norm of a vector, equal to the square root of the sum of the squared entries of the vector.
When $\theta$ is a scalar, the squared error is
$$L(\theta,\widehat{\theta}) = (\widehat{\theta}-\theta)^{2}$$
because the Euclidean norm of a scalar is equal to its absolute value.
Therefore, the MSE becomes
$$\mathrm{MSE}(\widehat{\theta}) = \mathrm{E}\big[(\widehat{\theta}-\theta)^{2}\big].$$
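As a quick numerical illustration (not part of the original text), the scalar MSE can be approximated by Monte Carlo simulation: draw many samples, compute the estimate for each, and average the squared errors. The sketch below, using NumPy, does this for the sample mean of a normal population, whose MSE is known to equal $\sigma^2/n$; the parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

true_mean = 2.0     # illustrative true parameter
sigma = 3.0         # illustrative population standard deviation
n = 25              # sample size
n_sims = 100_000    # number of simulated samples

# For each simulated sample, compute the sample mean (our estimator).
samples = rng.normal(true_mean, sigma, size=(n_sims, n))
estimates = samples.mean(axis=1)

# Monte Carlo approximation of the MSE: average squared estimation error.
mse = np.mean((estimates - true_mean) ** 2)

# For the sample mean, theory gives MSE = Var = sigma^2 / n.
print(mse, sigma**2 / n)
```

Up to simulation noise, the two printed values coincide, since the sample mean is unbiased and its variance is $\sigma^2/n$.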
The following decomposition is often used to distinguish between the two main sources of error, called bias and variance.
Proposition The mean squared error of an estimator $\widehat{\theta}$ can be written as
$$\mathrm{MSE}(\widehat{\theta}) = \operatorname{tr}\big[\operatorname{Var}(\widehat{\theta})\big] + \big\|B(\widehat{\theta})\big\|^{2}$$
where $\operatorname{tr}\big[\operatorname{Var}(\widehat{\theta})\big]$ is the trace of the covariance matrix of $\widehat{\theta}$ and
$$B(\widehat{\theta}) = \mathrm{E}[\widehat{\theta}] - \theta$$
is the bias of the estimator, that is, the expected difference between the estimator and the true value of the parameter.
Suppose the true parameter $\theta$ and its estimator $\widehat{\theta}$ are $K \times 1$ column vectors. Then, we can write:
$$\begin{aligned}
\mathrm{MSE}(\widehat{\theta})
&= \mathrm{E}\big[\|\widehat{\theta}-\theta\|^{2}\big]
 = \mathrm{E}\big[(\widehat{\theta}-\theta)^{\top}(\widehat{\theta}-\theta)\big] \\
&= \mathrm{E}\big[\big(\widehat{\theta}-\mathrm{E}[\widehat{\theta}]+\mathrm{E}[\widehat{\theta}]-\theta\big)^{\top}\big(\widehat{\theta}-\mathrm{E}[\widehat{\theta}]+\mathrm{E}[\widehat{\theta}]-\theta\big)\big] \\
&\overset{(A)}{=} \mathrm{E}\big[(\widehat{\theta}-\mathrm{E}[\widehat{\theta}])^{\top}(\widehat{\theta}-\mathrm{E}[\widehat{\theta}])\big]
 + 2\,\mathrm{E}\big[(\mathrm{E}[\widehat{\theta}]-\theta)^{\top}(\widehat{\theta}-\mathrm{E}[\widehat{\theta}])\big]
 + (\mathrm{E}[\widehat{\theta}]-\theta)^{\top}(\mathrm{E}[\widehat{\theta}]-\theta) \\
&\overset{(B)}{=} \mathrm{E}\big[(\widehat{\theta}-\mathrm{E}[\widehat{\theta}])^{\top}(\widehat{\theta}-\mathrm{E}[\widehat{\theta}])\big]
 + 2\,(\mathrm{E}[\widehat{\theta}]-\theta)^{\top}\mathrm{E}\big[\widehat{\theta}-\mathrm{E}[\widehat{\theta}]\big]
 + \big\|B(\widehat{\theta})\big\|^{2} \\
&\overset{(C)}{=} \mathrm{E}\big[(\widehat{\theta}-\mathrm{E}[\widehat{\theta}])^{\top}(\widehat{\theta}-\mathrm{E}[\widehat{\theta}])\big]
 + \big\|B(\widehat{\theta})\big\|^{2} \\
&\overset{(D)}{=} \sum_{k=1}^{K}\mathrm{E}\big[(\widehat{\theta}_{k}-\mathrm{E}[\widehat{\theta}_{k}])^{2}\big]
 + \big\|B(\widehat{\theta})\big\|^{2} \\
&\overset{(E)}{=} \operatorname{tr}\big[\operatorname{Var}(\widehat{\theta})\big] + \big\|B(\widehat{\theta})\big\|^{2}
\end{aligned}$$
where: in step (A) we have expanded the products; in steps (B), (C) and (D) we have used the linearity of the expected value operator (in step (C), $\mathrm{E}\big[\widehat{\theta}-\mathrm{E}[\widehat{\theta}]\big]=0$, so the cross term vanishes); in step (E) we have used the fact that the trace of a square matrix is equal to the sum of its diagonal elements, and the diagonal elements of the covariance matrix are the variances $\operatorname{Var}(\widehat{\theta}_{k})$.
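The decomposition above can be verified numerically. The following sketch (not part of the original text; all parameter values are illustrative) simulates draws of a deliberately biased multivariate normal estimator and checks that the Monte Carlo MSE matches the trace of the covariance matrix plus the squared norm of the bias.

```python
import numpy as np

rng = np.random.default_rng(1)

theta = np.array([1.0, -2.0])          # true parameter vector (illustrative)
bias = np.array([0.5, 0.0])            # deliberate bias of the estimator
cov = np.array([[1.0, 0.3],
                [0.3, 2.0]])           # covariance matrix of the estimator

# Simulate draws of a biased estimator: theta_hat ~ N(theta + bias, cov).
n_sims = 200_000
theta_hat = rng.multivariate_normal(theta + bias, cov, size=n_sims)

# Left-hand side: MSE = E[ ||theta_hat - theta||^2 ], approximated by simulation.
mse = np.mean(np.sum((theta_hat - theta) ** 2, axis=1))

# Right-hand side: trace of the covariance matrix plus squared norm of the bias.
decomposition = np.trace(cov) + np.sum(bias ** 2)

print(mse, decomposition)
```

Here the right-hand side is $\operatorname{tr}(\Sigma) + \|b\|^2 = 3.0 + 0.25 = 3.25$, and the simulated MSE agrees up to Monte Carlo error.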
When the parameter $\theta$ is a scalar, the above formula for the bias-variance decomposition becomes
$$\mathrm{MSE}(\widehat{\theta}) = \operatorname{Var}(\widehat{\theta}) + B(\widehat{\theta})^{2}.$$
Thus, the mean squared error of an unbiased estimator (an estimator that has zero bias) is equal to the variance of the estimator itself.
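A classic scalar illustration of the decomposition (not from the original text; parameter values are illustrative) compares two estimators of the variance of a normal population: the unbiased sample variance, which divides by $n-1$ and whose MSE is pure variance, and the biased maximum-likelihood estimator, which divides by $n$ and whose MSE also contains a squared-bias term.

```python
import numpy as np

rng = np.random.default_rng(2)

sigma2 = 4.0        # true variance (illustrative)
n = 10              # sample size
n_sims = 200_000    # number of simulated samples

samples = rng.normal(0.0, np.sqrt(sigma2), size=(n_sims, n))

# Unbiased sample variance (divides by n - 1): zero bias, so MSE = variance.
s2_unbiased = samples.var(axis=1, ddof=1)
# Biased ML estimator (divides by n): a squared-bias term enters the MSE.
s2_biased = samples.var(axis=1, ddof=0)

mse_unbiased = np.mean((s2_unbiased - sigma2) ** 2)
mse_biased = np.mean((s2_biased - sigma2) ** 2)

# Bias-variance decomposition of the biased estimator, computed from the
# simulated estimates: empirical variance plus squared empirical bias.
decomp_biased = s2_biased.var() + (s2_biased.mean() - sigma2) ** 2

print(mse_unbiased, mse_biased, decomp_biased)
```

Note that `mse_biased` and `decomp_biased` agree (the decomposition is an algebraic identity on the simulated draws), and that for normal data the biased estimator attains the lower MSE despite its bias: a small bias can be worth a larger reduction in variance.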
In the lecture on point estimation, you can find more details about:
loss functions;
statistical risk;
the mean squared error.
In the lecture on predictive models, you can find a different definition of MSE that applies to predictions (not to parameter estimates).
Please cite as:
Taboga, Marco (2021). "Mean squared error of an estimator", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/glossary/mean-squared-error.