 StatLect

# Poisson distribution - Maximum Likelihood Estimation

In this lecture, we explain how to derive the maximum likelihood estimator (MLE) of the parameter of a Poisson distribution.

## Revision material

Before reading this lecture, you might want to revise the pages on maximum likelihood estimation and the Poisson distribution.

## Assumptions

We observe $n$ independent draws from a Poisson distribution. In other words, there are $n$ independent Poisson random variables $X_1, \dots, X_n$ and we observe their realizations $x_1, \dots, x_n$.

The probability mass function of a single draw is
$$p_X(x_j;\lambda)=\frac{1}{x_j!}\lambda^{x_j}\exp(-\lambda)$$
where:

• $\lambda$ is the parameter of interest (for which we want to derive the MLE);

• the support of the distribution is the set of non-negative integer numbers: $R_X=\{0,1,2,\dots\}$;

• $x_j!$ is the factorial of $x_j$.

## The likelihood function

The likelihood function is
$$L(\lambda;x_1,\dots,x_n)=\exp(-n\lambda)\lambda^{\sum_{j=1}^{n}x_j}\prod_{j=1}^{n}\frac{1}{x_j!}$$

Proof

The $n$ observations are independent. As a consequence, the likelihood function is equal to the product of their probability mass functions:
$$L(\lambda;x_1,\dots,x_n)=\prod_{j=1}^{n}p_X(x_j;\lambda)$$
Furthermore, the observed values $x_1,\dots,x_n$ necessarily belong to the support $R_X$. So, we have
$$L(\lambda;x_1,\dots,x_n)=\prod_{j=1}^{n}\frac{1}{x_j!}\lambda^{x_j}\exp(-\lambda)=\exp(-n\lambda)\lambda^{\sum_{j=1}^{n}x_j}\prod_{j=1}^{n}\frac{1}{x_j!}$$

## The log-likelihood function
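To sanity-check the factorization, one can verify numerically that the product of the individual pmfs equals the closed-form expression. The sample below is made up for illustration:

```python
import math

xs = [2, 1, 0, 3]   # hypothetical observed sample
lam = 1.5           # an arbitrary trial value of the parameter

# Product of the single-draw pmfs
product_form = math.prod(lam ** x * math.exp(-lam) / math.factorial(x)
                         for x in xs)

# Closed form: exp(-n*lambda) * lambda**sum(x_j) / prod(x_j!)
closed_form = (math.exp(-len(xs) * lam) * lam ** sum(xs)
               / math.prod(math.factorial(x) for x in xs))
```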

The log-likelihood function is
$$\ell(\lambda;x_1,\dots,x_n)=-n\lambda+\ln(\lambda)\sum_{j=1}^{n}x_j-\sum_{j=1}^{n}\ln(x_j!)$$

Proof

By taking the natural logarithm of the likelihood function derived above, we get the log-likelihood:
$$\ell(\lambda;x_1,\dots,x_n)=\ln L(\lambda;x_1,\dots,x_n)=-n\lambda+\ln(\lambda)\sum_{j=1}^{n}x_j-\sum_{j=1}^{n}\ln(x_j!)$$

## The maximum likelihood estimator
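The log-likelihood is straightforward to code. A sketch, using the standard-library identity `math.lgamma(x + 1)` = ln(x!) for numerical stability (the function name is ours):

```python
import math

def poisson_loglik(lam, xs):
    # -n*lambda + ln(lambda) * sum(x_j) - sum(ln(x_j!))
    return (-len(xs) * lam
            + math.log(lam) * sum(xs)
            - sum(math.lgamma(x + 1) for x in xs))

xs = [2, 1, 0, 3]   # hypothetical sample
ll = poisson_loglik(1.5, xs)
```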

The maximum likelihood estimator of $\lambda$ is
$$\widehat{\lambda}=\frac{1}{n}\sum_{j=1}^{n}x_j$$

Proof

The MLE is the solution of the following maximization problem:
$$\widehat{\lambda}=\arg\max_{\lambda}\,\ell(\lambda;x_1,\dots,x_n)$$
The first order condition for a maximum is
$$\frac{\partial}{\partial\lambda}\ell(\lambda;x_1,\dots,x_n)=0$$
The first derivative of the log-likelihood with respect to the parameter is
$$\frac{\partial}{\partial\lambda}\ell(\lambda;x_1,\dots,x_n)=-n+\frac{1}{\lambda}\sum_{j=1}^{n}x_j$$
Imposing that the first derivative be equal to zero, we get
$$\widehat{\lambda}=\frac{1}{n}\sum_{j=1}^{n}x_j$$
Therefore, the estimator is just the sample mean of the $n$ observations in the sample.

This makes intuitive sense because the expected value of a Poisson random variable is equal to its parameter $\lambda$, and the sample mean is an unbiased estimator of the expected value.
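The analytical result can be checked numerically: on a made-up sample, a brute-force grid search over the log-likelihood lands on the sample mean. All names and values below are our own:

```python
import math

xs = [2, 1, 0, 3, 4, 2]          # hypothetical sample; mean = 2.0
lam_hat = sum(xs) / len(xs)      # analytical MLE: the sample mean

def loglik(lam):
    # Poisson log-likelihood, with math.lgamma(x + 1) = ln(x!)
    return (-len(xs) * lam
            + math.log(lam) * sum(xs)
            - sum(math.lgamma(x + 1) for x in xs))

# Brute-force maximization over a grid confirms the analytical solution;
# since the log-likelihood is concave, the grid maximizer is the grid
# point closest to the true maximizer.
grid = [i / 100 for i in range(1, 1001)]   # 0.01, 0.02, ..., 10.0
lam_grid = max(grid, key=loglik)
```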

## Asymptotic variance

The estimator is asymptotically normal with asymptotic mean equal to $\lambda_0$ (the true value of the parameter) and asymptotic variance equal to
$$V=\lambda_0$$

Proof

The score is
$$s(\lambda;x_j)=\frac{\partial}{\partial\lambda}\ln p_X(x_j;\lambda)=-1+\frac{x_j}{\lambda}$$
The Hessian is
$$H(\lambda;x_j)=\frac{\partial}{\partial\lambda}s(\lambda;x_j)=-\frac{x_j}{\lambda^{2}}$$
The information equality implies that
$$I(\lambda_0)=-\mathrm{E}\left[H(\lambda_0;x_j)\right]=\frac{\mathrm{E}[x_j]}{\lambda_0^{2}}=\frac{\lambda_0}{\lambda_0^{2}}=\frac{1}{\lambda_0}$$
where we have used the fact that the expected value of a Poisson random variable with parameter $\lambda_0$ is equal to $\lambda_0$. Finally, the asymptotic variance is
$$V=\frac{1}{I(\lambda_0)}=\lambda_0$$
Thus, the distribution of the maximum likelihood estimator $\widehat{\lambda}$ can be approximated by a normal distribution with mean $\lambda_0$ and variance $\lambda_0/n$.
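The approximation can be illustrated by simulation: draw many samples of size $n$, compute the MLE for each, and compare the empirical mean and variance of the estimates with $\lambda_0$ and $\lambda_0/n$. This is a sketch under our own choices of $\lambda_0$, $n$, and the number of replications; the sampler uses Knuth's classical product-of-uniforms method:

```python
import math
import random

def draw_poisson(lam, rng):
    # Knuth's method: count uniforms until their running product
    # drops below exp(-lam); the count minus one is Poisson(lam).
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

rng = random.Random(0)
lam0, n, reps = 3.0, 500, 2000

# Sampling distribution of the MLE (the sample mean) over many replications
estimates = [sum(draw_poisson(lam0, rng) for _ in range(n)) / n
             for _ in range(reps)]

mean_est = sum(estimates) / reps
var_est = sum((e - mean_est) ** 2 for e in estimates) / reps
# Theory predicts mean close to lam0 = 3.0 and variance close to
# lam0 / n = 0.006
```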

## Other examples

On StatLect you can find detailed derivations of MLEs for numerous other distributions and statistical models.

| Model | Type | Solution |
|---|---|---|
| Exponential distribution | Univariate distribution | Analytical |
| Normal distribution | Univariate distribution | Analytical |
| T distribution | Univariate distribution | Numerical |
| Multivariate normal distribution | Multivariate distribution | Analytical |
| Normal linear regression model | Regression model | Analytical |
| Logistic classification model | Classification model | Numerical |
| Probit classification model | Classification model | Numerical |
| Gaussian mixture | Mixture of distributions | Numerical (EM) |