Point estimation of the mean

This lecture presents some examples of point estimation problems, focusing on mean estimation, that is, on using a sample to produce a point estimate of the mean of an unknown distribution.

Normal IID samples

In this example, which is probably the most important in the history of statistics, the sample $\xi_n$ is made of $n$ independent draws from a normal distribution having unknown mean $\mu$ and variance $\sigma^2$. Specifically, we observe $n$ realizations $x_1, \ldots, x_n$ of $n$ independent random variables $X_1, \ldots, X_n$, all having a normal distribution with mean $\mu$ and variance $\sigma^2$. The sample is the $n$-dimensional vector $$\xi_n = (x_1, \ldots, x_n),$$ which is a realization of the random vector $$\Xi_n = (X_1, \ldots, X_n).$$

The estimator

As an estimator of the mean $\mu$, we use the sample mean $$\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i.$$
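
To make the setup concrete, here is a minimal Python sketch (the values of $\mu$, $\sigma$, and $n$ are arbitrary choices for illustration) that draws a normal IID sample and computes its sample mean:

    import numpy as np

    rng = np.random.default_rng(0)
    mu, sigma, n = 2.0, 3.0, 1000  # hypothetical parameter values

    x = rng.normal(loc=mu, scale=sigma, size=n)  # the sample x_1, ..., x_n
    x_bar = x.mean()                             # the sample mean estimator
    print(x_bar)                                 # close to mu = 2.0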

Expected value of the estimator

The expected value of the estimator $\bar{X}_n$ is equal to the true mean $\mu$. This can be proved using the linearity of the expected value: $$\mathrm{E}[\bar{X}_n] = \mathrm{E}\left[\frac{1}{n}\sum_{i=1}^{n} X_i\right] = \frac{1}{n}\sum_{i=1}^{n} \mathrm{E}[X_i] = \frac{1}{n}\cdot n\mu = \mu.$$ Therefore, the estimator $\bar{X}_n$ is unbiased.
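
Unbiasedness can be checked with a small Monte Carlo experiment: averaging the sample mean over many simulated samples should give a value close to $\mu$. A minimal sketch, with the same arbitrary parameter values:

    import numpy as np

    rng = np.random.default_rng(0)
    mu, sigma, n, reps = 2.0, 3.0, 50, 100_000

    # one sample mean per row, averaged over many independent samples
    means = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
    print(means.mean())  # approximately mu = 2.0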

Variance of the estimator

The variance of the estimator $\bar{X}_n$ is equal to $\sigma^2/n$. This can be proved using the formula for the variance of a sum of independent random variables: $$\mathrm{Var}[\bar{X}_n] = \frac{1}{n^2}\mathrm{Var}\left[\sum_{i=1}^{n} X_i\right] = \frac{1}{n^2}\sum_{i=1}^{n}\mathrm{Var}[X_i] = \frac{1}{n^2}\cdot n\sigma^2 = \frac{\sigma^2}{n}.$$ Therefore, the variance of the estimator tends to zero as the sample size $n$ tends to infinity.
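
The same kind of simulation illustrates the variance result: the empirical variance of the simulated sample means should be close to $\sigma^2/n$. A sketch:

    import numpy as np

    rng = np.random.default_rng(0)
    mu, sigma, n, reps = 2.0, 3.0, 50, 100_000

    means = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
    print(means.var())   # empirical variance of the sample mean
    print(sigma**2 / n)  # theoretical value sigma^2/n = 0.18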

Distribution of the estimator

The estimator $\bar{X}_n$ has a normal distribution: $$\bar{X}_n \sim N\!\left(\mu, \frac{\sigma^2}{n}\right).$$

Proof

Note that the sample mean $\bar{X}_n$ is a linear combination of the normal and independent random variables $X_1, \ldots, X_n$ (all the coefficients of the linear combination are equal to $\frac{1}{n}$). Therefore, $\bar{X}_n$ is normal because a linear combination of independent normal random variables is normal. The mean and the variance of the distribution have already been derived above.
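
The normality of the sample mean can also be seen numerically: after standardizing the simulated sample means, their empirical quantiles should match those of a standard normal distribution. A sketch, again with arbitrary parameter values:

    import numpy as np

    rng = np.random.default_rng(0)
    mu, sigma, n, reps = 2.0, 3.0, 50, 100_000

    means = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
    z = (means - mu) / (sigma / np.sqrt(n))  # standardized sample means

    # should be close to the standard normal quantiles -1.645, 0, 1.645
    print(np.quantile(z, [0.05, 0.5, 0.95]))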

Risk of the estimator

The mean squared error of the estimator is $$\mathrm{MSE}(\bar{X}_n) = \mathrm{E}\left[(\bar{X}_n - \mu)^2\right] = \mathrm{Var}[\bar{X}_n] = \frac{\sigma^2}{n},$$ where the second equality holds because the estimator is unbiased.
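
Because the estimator is unbiased, its mean squared error coincides with its variance; the decomposition of the MSE into variance plus squared bias can be verified by simulation. A sketch:

    import numpy as np

    rng = np.random.default_rng(0)
    mu, sigma, n, reps = 2.0, 3.0, 50, 100_000

    means = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
    mse = ((means - mu) ** 2).mean()  # empirical mean squared error
    bias = means.mean() - mu          # empirical bias (close to zero)
    print(mse, means.var() + bias**2, sigma**2 / n)  # all roughly equal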

Consistency of the estimator

The sequence $\{X_n\}$ satisfies the conditions of Kolmogorov's Strong Law of Large Numbers ($\{X_n\}$ is an IID sequence with finite mean). Therefore, the sample mean $\bar{X}_n$ converges almost surely to the true mean $\mu$: $$\bar{X}_n \overset{\text{a.s.}}{\longrightarrow} \mu,$$ that is, the estimator $\bar{X}_n$ is strongly consistent. Of course, the estimator is also weakly consistent, because almost sure convergence implies convergence in probability: $$\bar{X}_n \overset{P}{\longrightarrow} \mu.$$
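
Consistency can be visualized by tracking the running sample mean along a single long sequence of draws: it settles down near $\mu$ as $n$ grows. A minimal sketch:

    import numpy as np

    rng = np.random.default_rng(0)
    mu, sigma = 2.0, 3.0

    x = rng.normal(mu, sigma, size=100_000)
    running_mean = np.cumsum(x) / np.arange(1, x.size + 1)

    # the running sample mean approaches mu = 2.0
    for n in (10, 100, 1_000, 100_000):
        print(n, running_mean[n - 1])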

IID samples

In this example, the sample $\xi_n$ is made of $n$ independent draws from a probability distribution having unknown mean $\mu$ and variance $\sigma^2$. Specifically, we observe $n$ realizations $x_1, \ldots, x_n$ of $n$ independent random variables $X_1, \ldots, X_n$, all having the same distribution with mean $\mu$ and variance $\sigma^2$. The sample is the $n$-dimensional vector $$\xi_n = (x_1, \ldots, x_n),$$ which is a realization of the random vector $$\Xi_n = (X_1, \ldots, X_n).$$ The difference with respect to the previous example is that now we are no longer assuming that the sample points come from a normal distribution.

The estimator

Again, the estimator of the mean $\mu$ is the sample mean: $$\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i.$$

Expected value of the estimator

The expected value of the estimator $\bar{X}_n$ is equal to the true mean $\mu$, so the estimator is again unbiased: $$\mathrm{E}[\bar{X}_n] = \mu.$$

The proof is the same as in the previous example.

Variance of the estimator

The variance of the estimator $\bar{X}_n$ is $$\mathrm{Var}[\bar{X}_n] = \frac{\sigma^2}{n}.$$

Also in this case, the proof is the same as in the previous example.

Distribution of the estimator

Unlike in the previous example, the estimator $\bar{X}_n$ does not necessarily have a normal distribution (its distribution depends on the distribution of the terms of the sequence $\{X_n\}$). However, we will see below that $\bar{X}_n$ is asymptotically normal (i.e., its distribution converges to a normal distribution as $n$ becomes large).

Risk of the estimator

The mean squared error of the estimator is $$\mathrm{MSE}(\bar{X}_n) = \frac{\sigma^2}{n}.$$

The proof is the same as in the previous example.

Consistency of the estimator

The sequence $\{X_n\}$ satisfies the conditions of Kolmogorov's Strong Law of Large Numbers ($\{X_n\}$ is an IID sequence with finite mean). Therefore, the estimator $\bar{X}_n$ is both strongly consistent and weakly consistent (see the example above).

Asymptotic normality

The sequence $\{X_n\}$ satisfies the conditions of the Lindeberg-Lévy Central Limit Theorem ($\{X_n\}$ is an IID sequence with finite mean and variance). Therefore, the sample mean $\bar{X}_n$ is asymptotically normal: $$\sqrt{n}\,\frac{\bar{X}_n - \mu}{\sigma} \overset{d}{\longrightarrow} Z,$$ where $Z$ is a standard normal random variable and $\overset{d}{\longrightarrow}$ denotes convergence in distribution. In other words, for large $n$ the distribution of the sample mean $\bar{X}_n$ is approximately normal with mean $\mu$ and variance $\sigma^2/n$.
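
The theorem is easy to illustrate with clearly non-normal draws. In the sketch below, the draws are exponential with mean $\mu = 1$ and variance $\sigma^2 = 1$; the standardized sample means nevertheless have quantiles close to those of a standard normal:

    import numpy as np

    rng = np.random.default_rng(0)
    n, reps = 200, 50_000

    # exponential draws with scale 1: mean mu = 1, variance sigma^2 = 1
    means = rng.exponential(scale=1.0, size=(reps, n)).mean(axis=1)
    z = np.sqrt(n) * (means - 1.0) / 1.0  # sqrt(n) (Xbar_n - mu) / sigma

    # close to the standard normal quantiles -1.645, 0, 1.645
    print(np.quantile(z, [0.05, 0.5, 0.95]))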

Solved exercises

Below you can find some exercises with explained solutions.

Exercise 1

Consider an experiment that can have only two outcomes: either success, with probability $p$, or failure, with probability $1-p$. The probability of success is unknown, but we know that $$0 \leq p \leq 1.$$ Suppose we can independently repeat the experiment as many times as we wish and use the ratio $$\widehat{p} = \frac{\text{number of successes}}{\text{number of experiments}}$$ as an estimator of $p$. What is the minimum number of experiments needed in order to be sure that the standard deviation of the estimator is less than $1/100$?

Solution

Denote by $\widehat{p}$ the estimator of $p$. It can be written as $$\widehat{p} = \frac{1}{n}\sum_{i=1}^{n} X_i,$$ where $n$ is the number of repetitions of the experiment and $X_1, \ldots, X_n$ are $n$ independent random variables having a Bernoulli distribution with parameter $p$. Therefore, $\widehat{p}$ is the sample mean of $n$ independent Bernoulli random variables with expected value $p$ and variance $$\mathrm{Var}[X_i] = p(1-p).$$ Thus, $$\mathrm{Var}[\widehat{p}] = \frac{p(1-p)}{n}.$$ We need to ensure that $$\sqrt{\frac{p(1-p)}{n}} < \frac{1}{100},$$ or $$\frac{p(1-p)}{n} < \frac{1}{10000},$$ which is certainly verified if $$\frac{1}{4n} < \frac{1}{10000}$$ (because $p(1-p) \leq 1/4$ for any $p \in [0,1]$), or $$n > 2500.$$
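
As a quick numerical check of the bound (the worst case is $p = 1/2$, which maximizes $p(1-p)$): with $n = 2500$ the standard deviation equals exactly $1/100$, so at least one more experiment is needed. A sketch:

    from math import sqrt

    # standard deviation of p-hat in the worst case p = 1/2
    for n in (2500, 2501):
        sd = sqrt(0.25 / n)
        print(n, sd, sd < 1 / 100)  # 2500 gives exactly 0.01; 2501 is below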

Exercise 2

Suppose you observe a sample of $100$ independent draws from a distribution having unknown mean $\mu$ and known variance $\sigma^2 = 1$. How can you approximate the distribution of their sample mean?

Solution

We can approximate the distribution of the sample mean with its asymptotic distribution. So the distribution of the sample mean can be approximated by a normal distribution with mean $\mu$ and variance $$\frac{\sigma^2}{n} = \frac{1}{100}.$$
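
Under this normal approximation, probabilities involving the sample mean are easy to compute. For instance, here is a sketch of the approximate probability that the sample mean falls within $0.1$ of $\mu$ (a hypothetical query, for illustration):

    from math import erf, sqrt

    n, sigma2 = 100, 1.0
    sd = sqrt(sigma2 / n)  # standard deviation 1/10

    # P(|Xbar_n - mu| < 0.1) under the N(mu, 1/100) approximation
    z = 0.1 / sd
    print(erf(z / sqrt(2)))  # about 0.68, the one-standard-deviation band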
