StatLect

Poisson distribution - Maximum Likelihood Estimation

This lecture explains how to derive the maximum likelihood estimator (MLE) of the parameter of a Poisson distribution. Before reading this lecture, you might want to revise the lectures about maximum likelihood estimation and about the Poisson distribution.


Assumptions

We assume to observe n independent draws from a Poisson distribution. In more formal terms, we observe the first $n$ terms of an IID sequence $\{X_1, X_2, \ldots, X_n\}$ of Poisson random variables. Thus, the probability mass function of a generic term $X_{j}$ of the sequence is
$$p_X(x_j;\lambda_0)=\begin{cases}\dfrac{\lambda_0^{x_j}\exp(-\lambda_0)}{x_j!} & \text{if }x_j\in R_X\\[1ex] 0 & \text{otherwise}\end{cases}$$
where $R_X$ is the support of the distribution and $\lambda_0$ is the parameter of interest (for which we want to derive the MLE). Remember that the support of the Poisson distribution is the set of non-negative integers:
$$R_X=\{0,1,2,\ldots\}$$
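As a quick numerical companion to the formula above, the pmf can be coded directly. This is a minimal sketch for illustration; the function name `poisson_pmf` is not part of any standard library.

```python
import math

def poisson_pmf(x, lam):
    # Poisson pmf p_X(x; lambda): positive on the non-negative
    # integers (the support R_X), zero everywhere else.
    if x < 0 or x != int(x):
        return 0.0
    x = int(x)
    return lam ** x * math.exp(-lam) / math.factorial(x)
```

Summing the pmf over a long initial stretch of the support (e.g. `sum(poisson_pmf(x, 2.5) for x in range(100))`) returns a value numerically indistinguishable from 1, as it should for a probability mass function.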

To keep things simple, we do not verify, but rather assume, that the regularity conditions needed for the consistency and asymptotic normality of the maximum likelihood estimator of $\lambda_0$ are satisfied.

The likelihood function

The likelihood function is
$$L(\lambda;x_1,\ldots,x_n)=\exp(-n\lambda)\,\lambda^{\sum_{j=1}^{n}x_j}\prod_{j=1}^{n}\frac{1}{x_j!}$$

Proof

The n observations are independent. As a consequence, the likelihood function is equal to the product of their probability mass functions:
$$L(\lambda;x_1,\ldots,x_n)=\prod_{j=1}^{n}p_X(x_j;\lambda)$$
Furthermore, the observed values $x_1,\ldots,x_n$ necessarily belong to the support $R_X$. So, we have
$$L(\lambda;x_1,\ldots,x_n)=\prod_{j=1}^{n}\frac{\lambda^{x_j}\exp(-\lambda)}{x_j!}=\exp(-n\lambda)\,\lambda^{\sum_{j=1}^{n}x_j}\prod_{j=1}^{n}\frac{1}{x_j!}$$
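The equality of the two expressions (the product of pmfs and the factored form) can be checked numerically. This is a sketch for illustration only; the function names and the sample values are made up.

```python
import math

def likelihood_product(lam, xs):
    # Likelihood as the product of the individual Poisson pmfs.
    out = 1.0
    for x in xs:
        out *= lam ** x * math.exp(-lam) / math.factorial(x)
    return out

def likelihood_factored(lam, xs):
    # Factored form: exp(-n*lam) * lam^(sum x_j) * prod(1/x_j!).
    n = len(xs)
    return math.exp(-n * lam) * lam ** sum(xs) / math.prod(
        math.factorial(x) for x in xs)
```

Evaluating both functions at the same `lam` on the same sample gives the same value up to floating-point error.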

The log-likelihood function

The log-likelihood function is
$$\ell(\lambda;x_1,\ldots,x_n)=-n\lambda+\ln(\lambda)\sum_{j=1}^{n}x_j-\sum_{j=1}^{n}\ln(x_j!)$$

Proof

By taking the natural logarithm of the likelihood function derived above, we get the log-likelihood:
$$\ell(\lambda;x_1,\ldots,x_n)=\ln L(\lambda;x_1,\ldots,x_n)=-n\lambda+\ln(\lambda)\sum_{j=1}^{n}x_j-\sum_{j=1}^{n}\ln(x_j!)$$
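The log-likelihood is easy to code and numerically stabler than the raw likelihood, because no tiny products are formed. A minimal sketch (the name `poisson_loglik` is illustrative):

```python
import math

def poisson_loglik(lam, xs):
    # -n*lam + ln(lam)*sum(x_j) - sum(ln(x_j!)),
    # using lgamma(x + 1) = ln(x!) for the last term.
    n = len(xs)
    return (-n * lam + math.log(lam) * sum(xs)
            - sum(math.lgamma(x + 1) for x in xs))
```

On a small sample this agrees with the logarithm of the product of the individual pmfs.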

The maximum likelihood estimator

The maximum likelihood estimator of $\lambda$ is
$$\widehat{\lambda}_n=\frac{1}{n}\sum_{j=1}^{n}x_j$$

Proof

The MLE is the solution of the following maximization problem:
$$\widehat{\lambda}_n=\arg\max_{\lambda}\,\ell(\lambda;x_1,\ldots,x_n)$$
The first-order condition for a maximum is
$$\frac{\partial}{\partial\lambda}\ell(\lambda;x_1,\ldots,x_n)=0$$
The first derivative of the log-likelihood with respect to the parameter $\lambda$ is
$$\frac{\partial}{\partial\lambda}\ell(\lambda;x_1,\ldots,x_n)=-n+\frac{1}{\lambda}\sum_{j=1}^{n}x_j$$
Setting the first derivative equal to zero and solving for $\lambda$, we get
$$\lambda=\frac{1}{n}\sum_{j=1}^{n}x_j$$

Therefore, the estimator $\widehat{\lambda}_n$ is just the sample mean of the n observations in the sample. This makes intuitive sense because the expected value of a Poisson random variable is equal to its parameter $\lambda_0$, and the sample mean is an unbiased estimator of the expected value.
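The claim can be verified numerically: a brute-force maximization of the log-likelihood over a fine grid lands exactly on the sample mean. This is a sketch on a made-up sample; dropping the additive constant $-\sum_j \ln(x_j!)$ does not change the maximizer.

```python
import math
from statistics import mean

xs = [2, 0, 3, 1, 4, 2, 2]  # a made-up sample for illustration

def loglik(lam):
    # Log-likelihood up to the additive constant -sum(ln(x_j!)),
    # which does not depend on lambda.
    return -len(xs) * lam + math.log(lam) * sum(xs)

# Brute-force maximization over a fine grid of candidate values.
grid = [i / 1000 for i in range(1, 10001)]  # lambda in (0, 10]
lam_hat = max(grid, key=loglik)
# lam_hat coincides with mean(xs), the sample mean.
```

Because the log-likelihood is strictly concave in $\lambda$, the grid maximizer sits at the grid point closest to the analytical solution, which here is the sample mean itself.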

Asymptotic variance

The estimator $\widehat{\lambda}_n$ is asymptotically normal with asymptotic mean equal to $\lambda_0$ and asymptotic variance equal to
$$V=\lambda_0$$
In other words, $\sqrt{n}\left(\widehat{\lambda}_n-\lambda_0\right)$ converges in distribution to a normal distribution with mean $0$ and variance $\lambda_0$.

Proof

The score is
$$s(\lambda;x_1,\ldots,x_n)=\frac{\partial}{\partial\lambda}\ell(\lambda;x_1,\ldots,x_n)=-n+\frac{1}{\lambda}\sum_{j=1}^{n}x_j$$
The Hessian is
$$H(\lambda;x_1,\ldots,x_n)=\frac{\partial}{\partial\lambda}s(\lambda;x_1,\ldots,x_n)=-\frac{1}{\lambda^{2}}\sum_{j=1}^{n}x_j$$
The information equality implies that
$$I(\lambda_0)=-\frac{1}{n}\mathrm{E}\left[H(\lambda_0;X_1,\ldots,X_n)\right]=\frac{1}{n\lambda_0^{2}}\sum_{j=1}^{n}\mathrm{E}\left[X_j\right]=\frac{1}{\lambda_0}$$
where we have used the fact that the expected value of a Poisson random variable with parameter $\lambda_0$ is equal to $\lambda_0$. Finally, the asymptotic variance is
$$V=I(\lambda_0)^{-1}=\lambda_0$$

Thus, the distribution of the maximum likelihood estimator $\widehat{\lambda}_n$ can be approximated by a normal distribution with mean $\lambda_0$ and variance $\lambda_0/n$.
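This approximation can be checked by Monte Carlo: simulate many samples of size $n$ from a Poisson with a known $\lambda_0$, compute the MLE (the sample mean) on each, and compare the empirical variance of the estimates with $\lambda_0/n$. A sketch using Knuth's classic Poisson sampler; the parameter values, seed, and function names are all illustrative.

```python
import math
import random
from statistics import mean, pvariance

def sample_poisson(lam, rng):
    # Knuth's method: count how many uniform draws it takes for
    # their running product to fall below exp(-lam).
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

rng = random.Random(0)          # fixed seed for reproducibility
lam0, n, reps = 3.0, 50, 20000  # true parameter, sample size, replications

# MLE (sample mean) computed on many independent samples of size n.
mles = [mean(sample_poisson(lam0, rng) for _ in range(n)) for _ in range(reps)]

emp_mean = mean(mles)      # close to lam0 = 3.0
emp_var = pvariance(mles)  # close to lam0 / n = 0.06
```

With these settings the empirical variance of the estimates comes out within a few percent of $\lambda_0/n = 0.06$, in line with the asymptotic result.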
