
Poisson distribution - Maximum Likelihood Estimation

by Marco Taboga, PhD

In this lecture, we explain how to derive the maximum likelihood estimator (MLE) of the parameter of a Poisson distribution.


Revision material

Before reading this lecture, you might want to revise the pages on maximum likelihood estimation and the Poisson distribution.

These are the main steps you need to take in order to derive the MLE of the Poisson distribution: 1) compute the likelihood; 2) take the logarithm; 3) compute the derivative of the log-likelihood with respect to the parameter; 4) set the derivative equal to zero; 5) solve the equation and find the MLE.
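The five steps above can be sketched numerically. In this hypothetical example (the sample and helper names are illustrative, not part of the lecture), the derivative of the log-likelihood vanishes exactly at the sample mean:

```python
import math

# Hypothetical Poisson sample (for illustration only).
data = [2, 4, 3, 5, 1, 3]
n = len(data)

def log_likelihood(lam):
    # Steps 1-2: logarithm of the product of Poisson pmfs.
    return sum(x * math.log(lam) - lam - math.log(math.factorial(x)) for x in data)

def score(lam):
    # Step 3: first derivative of the log-likelihood w.r.t. lambda.
    return sum(data) / lam - n

# Steps 4-5: the score is zero at the sample mean, which is the MLE.
lam_hat = sum(data) / n
print(lam_hat)              # 3.0
print(abs(score(lam_hat)))  # 0.0 (up to rounding)
```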

Assumptions

We observe n independent draws from a Poisson distribution.

In other words, there are n independent Poisson random variables $X_1, \ldots, X_n$ and we observe their realizations $x_1, \ldots, x_n$.

The probability mass function of a single draw $X_j$ is $p_X(x_j;\lambda)=\frac{\lambda^{x_j}}{x_j!}e^{-\lambda}$ for $x_j$ in the support $R_X=\{0,1,2,\ldots\}$, where $\lambda$ is the unknown parameter to be estimated.

The likelihood function

The likelihood function is $L(\lambda;x_1,\ldots,x_n)=e^{-n\lambda}\,\lambda^{\sum_{j=1}^{n}x_j}\prod_{j=1}^{n}\frac{1}{x_j!}$

Proof

The n observations are independent. As a consequence, the likelihood function is equal to the product of their probability mass functions: $L(\lambda;x_1,\ldots,x_n)=\prod_{j=1}^{n}p_X(x_j;\lambda)$. Furthermore, the observed values $x_1,\ldots,x_n$ necessarily belong to the support $R_X$. So, we have $L(\lambda;x_1,\ldots,x_n)=\prod_{j=1}^{n}\frac{\lambda^{x_j}}{x_j!}e^{-\lambda}=e^{-n\lambda}\,\lambda^{\sum_{j=1}^{n}x_j}\prod_{j=1}^{n}\frac{1}{x_j!}$

The log-likelihood function

The log-likelihood function is $\ell(\lambda;x_1,\ldots,x_n)=\ln(\lambda)\sum_{j=1}^{n}x_j-n\lambda-\sum_{j=1}^{n}\ln(x_j!)$

Proof

By taking the natural logarithm of the likelihood function derived above, we get the log-likelihood: $\ell(\lambda;x_1,\ldots,x_n)=\ln\Big(e^{-n\lambda}\,\lambda^{\sum_{j=1}^{n}x_j}\prod_{j=1}^{n}\frac{1}{x_j!}\Big)=-n\lambda+\ln(\lambda)\sum_{j=1}^{n}x_j-\sum_{j=1}^{n}\ln(x_j!)$

The maximum likelihood estimator

The maximum likelihood estimator of $\lambda$ is $\widehat{\lambda}=\frac{1}{n}\sum_{j=1}^{n}x_j$

Proof

The MLE is the solution of the following maximization problem: $\widehat{\lambda}=\arg\max_{\lambda}\ell(\lambda;x_1,\ldots,x_n)$. The first-order condition for a maximum is $\frac{\partial}{\partial\lambda}\ell(\lambda;x_1,\ldots,x_n)=0$. The first derivative of the log-likelihood with respect to the parameter $\lambda$ is $\frac{\partial}{\partial\lambda}\ell(\lambda;x_1,\ldots,x_n)=\frac{1}{\lambda}\sum_{j=1}^{n}x_j-n$. Imposing that the first derivative be equal to zero, we get $\widehat{\lambda}=\frac{1}{n}\sum_{j=1}^{n}x_j$

Therefore, the estimator $\widehat{\lambda}$ is just the sample mean of the n observations in the sample.
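As a quick sanity check (a sketch with an assumed sample, not part of the lecture), a coarse grid search over the log-likelihood peaks exactly at the sample mean:

```python
import math

# Assumed sample; its mean is 3.0.
data = [2, 4, 3, 5, 1, 3]

def log_likelihood(lam):
    return math.log(lam) * sum(data) - len(data) * lam \
        - sum(math.log(math.factorial(x)) for x in data)

# Evaluate on a grid of candidate lambdas; the maximizer is the sample mean.
grid = [k / 100 for k in range(50, 801)]
lam_hat = max(grid, key=log_likelihood)
print(lam_hat)  # 3.0
```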

This makes intuitive sense because the expected value of a Poisson random variable is equal to its parameter $\lambda_0$, and the sample mean is an unbiased estimator of the expected value.

Asymptotic variance

The estimator $\widehat{\lambda}$ is asymptotically normal with asymptotic mean equal to $\lambda_0$ and asymptotic variance equal to $V=\lambda_0$

Proof

The score is $s(\lambda;x_j)=\frac{\partial}{\partial\lambda}\ln p_X(x_j;\lambda)=\frac{x_j}{\lambda}-1$. The Hessian is $H(\lambda;x_j)=\frac{\partial^2}{\partial\lambda^2}\ln p_X(x_j;\lambda)=-\frac{x_j}{\lambda^2}$. The information equality implies that $I(\lambda_0)=-\mathrm{E}\left[H(\lambda_0;X_j)\right]=\frac{\mathrm{E}\left[X_j\right]}{\lambda_0^2}=\frac{1}{\lambda_0}$, where we have used the fact that the expected value of a Poisson random variable with parameter $\lambda_0$ is equal to $\lambda_0$. Finally, the asymptotic variance is $V=\frac{1}{I(\lambda_0)}=\lambda_0$

Thus, the distribution of the maximum likelihood estimator $\widehat{\lambda}$ can be approximated by a normal distribution with mean $\lambda_0$ and variance $\lambda_0/n$.
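This approximation can be checked by simulation. The sketch below (stdlib only; the sampler and all names are illustrative assumptions) draws many samples from a Poisson distribution with $\lambda_0=3$ and compares the empirical mean and variance of the MLE with $\lambda_0$ and $\lambda_0/n$:

```python
import math
import random
import statistics

random.seed(0)
lam0, n, reps = 3.0, 200, 2000  # true parameter, sample size, replications

def poisson_draw(lam):
    # Knuth's multiplicative algorithm for Poisson sampling (fine for small lambda).
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

# The MLE for each simulated sample is its sample mean.
mles = [statistics.mean(poisson_draw(lam0) for _ in range(n)) for _ in range(reps)]

print(statistics.mean(mles))      # close to lam0 = 3.0
print(statistics.variance(mles))  # close to lam0 / n = 0.015
```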

Other examples

On StatLect you can find detailed derivations of MLEs for numerous other distributions and statistical models.

Model | Type | Solution
Exponential distribution | Univariate distribution | Analytical
Normal distribution | Univariate distribution | Analytical
T distribution | Univariate distribution | Numerical
Multivariate normal distribution | Multivariate distribution | Analytical
Normal linear regression model | Regression model | Analytical
Logistic classification model | Classification model | Numerical
Probit classification model | Classification model | Numerical
Gaussian mixture | Mixture of distributions | Numerical (EM)

How to cite

Please cite as:

Taboga, Marco (2021). "Poisson distribution - Maximum Likelihood Estimation", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/fundamentals-of-statistics/Poisson-distribution-maximum-likelihood.
