
Likelihood ratio test

This lecture provides a detailed presentation of the likelihood ratio (LR) test.

The likelihood ratio test can be used to perform tests of hypotheses about parameters that have been estimated by maximum likelihood.

Before going through this lecture, you are advised to get acquainted with the basics of hypothesis testing in a maximum likelihood framework (see the introductory lecture entitled Maximum likelihood - Hypothesis testing).

The likelihood ratio test is used to verify null hypotheses that can be written in the form
$H_{0}:g\left(\theta_{0}\right)=0$
where $\theta_{0}$ is an unknown parameter belonging to a parameter space $\Theta\subseteq\mathbb{R}^{p}$, and $g:\Theta\rightarrow\mathbb{R}^{r}$ is a vector-valued function ($r\leq p$).

The above formulation of the null hypothesis is quite general, as many common parameter restrictions can be written in the form $g\left(\theta_{0}\right)=0$ (see the aforementioned introductory lecture).

The likelihood ratio statistic

The likelihood ratio test is based on two different ML estimates of the parameter $\theta_{0}$. One estimate, called the unrestricted estimate and denoted by $\widehat{\theta}_{n}$, is obtained from the solution of the unconstrained maximum likelihood problem
$\widehat{\theta}_{n}=\arg\max_{\theta\in\Theta}\ln L\left(\theta;\xi_{n}\right)$
where $\xi_{n}$ is the sample of observed data and $L\left(\theta;\xi_{n}\right)$ is the likelihood function. The other estimate, called the restricted estimate and denoted by $\widehat{\theta}_{n}^{R}$, is obtained from the solution of the constrained maximum likelihood problem
$\widehat{\theta}_{n}^{R}=\arg\max_{\theta\in\Theta_{R}}\ln L\left(\theta;\xi_{n}\right)$
where
$\Theta_{R}=\left\{\theta\in\Theta:g\left(\theta\right)=0\right\}$
is the set of parameters that satisfy the restriction being tested.

The test statistic, called the likelihood ratio statistic, is
$LR_{n}=2\left[\ln L\left(\widehat{\theta}_{n};\xi_{n}\right)-\ln L\left(\widehat{\theta}_{n}^{R};\xi_{n}\right)\right]$
where $n$ is the sample size.
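To make the two maximization problems and the statistic concrete, here is a minimal numerical sketch in Python (the language, the scipy optimizer, the normal model and the restriction that the mean equals zero are all illustrative choices of ours, not part of this lecture):

    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    x = rng.normal(loc=0.3, scale=1.0, size=200)  # observed sample xi_n

    # negative log-likelihood of a normal model; theta = (mu, log_sigma)
    def neg_loglik(theta):
        mu, log_sigma = theta
        return -np.sum(norm.logpdf(x, loc=mu, scale=np.exp(log_sigma)))

    # unrestricted estimate: maximize ln L over the whole parameter space
    unrestricted = minimize(neg_loglik, x0=[0.0, 0.0])

    # restricted estimate: maximize ln L subject to g(theta) = mu = 0
    restricted = minimize(neg_loglik, x0=[0.0, 0.0],
                          constraints=[{"type": "eq", "fun": lambda t: t[0]}])

    # LR_n = 2 [ln L(unrestricted) - ln L(restricted)]; minimize returns -ln L
    LR = 2.0 * (restricted.fun - unrestricted.fun)
    print(LR)

The statistic is non-negative by construction, because the restricted maximum of the likelihood can never exceed the unrestricted one.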

In order to derive the asymptotic properties of the statistic $LR_{n}$, we are going to assume that:

- the sample $\xi_{n}$ and the likelihood function satisfy a set of technical conditions that guarantee the consistency and asymptotic normality of the maximum likelihood estimator;

- under the null hypothesis, both the unrestricted estimator $\widehat{\theta}_{n}$ and the restricted estimator $\widehat{\theta}_{n}^{R}$ converge in probability to $\theta_{0}$;

- the function $g$ is continuously differentiable and its Jacobian $J_{g}\left(\theta_{0}\right)$ has rank $r$.

Given these assumptions, the following result can be proved.

Proposition If the null hypothesis $g\left(\theta_{0}\right)=0$ is true and some technical conditions are satisfied (see above), the likelihood ratio statistic $LR_{n}$ converges in distribution to a Chi-square distribution with $r$ degrees of freedom.

Proof

By the Mean Value Theorem, the second order expansion of $\ln L\left(\widehat{\theta}_{n}^{R};\xi_{n}\right)$ around $\widehat{\theta}_{n}$ can be written as
$\ln L\left(\widehat{\theta}_{n}^{R}\right)=\ln L\left(\widehat{\theta}_{n}\right)+\nabla_{\theta}\ln L\left(\widehat{\theta}_{n}\right)^{\top}\left(\widehat{\theta}_{n}^{R}-\widehat{\theta}_{n}\right)+\frac{1}{2}\left(\widehat{\theta}_{n}^{R}-\widehat{\theta}_{n}\right)^{\top}H\left(\overline{\theta}_{n}\right)\left(\widehat{\theta}_{n}^{R}-\widehat{\theta}_{n}\right)$
where $H$ is the Hessian matrix of the log-likelihood (a matrix of second partial derivatives) and $\overline{\theta}_{n}$ is an intermediate point between $\widehat{\theta}_{n}^{R}$ and $\widehat{\theta}_{n}$ (to be precise, there are $p$ intermediate points, one for each row of the Hessian). Because the gradient is zero at an unconstrained maximum, we have that
$\nabla_{\theta}\ln L\left(\widehat{\theta}_{n}\right)=0$
and, as a consequence,
$\ln L\left(\widehat{\theta}_{n}^{R}\right)-\ln L\left(\widehat{\theta}_{n}\right)=\frac{1}{2}\left(\widehat{\theta}_{n}^{R}-\widehat{\theta}_{n}\right)^{\top}H\left(\overline{\theta}_{n}\right)\left(\widehat{\theta}_{n}^{R}-\widehat{\theta}_{n}\right)$
Thus, the likelihood ratio statistic can be written as
$LR_{n}=2\left[\ln L\left(\widehat{\theta}_{n}\right)-\ln L\left(\widehat{\theta}_{n}^{R}\right)\right]=-\left(\widehat{\theta}_{n}^{R}-\widehat{\theta}_{n}\right)^{\top}H\left(\overline{\theta}_{n}\right)\left(\widehat{\theta}_{n}^{R}-\widehat{\theta}_{n}\right)$
By results that can be found in the proof of convergence of the score test statistic, we have that
$\nabla_{\theta}\ln L\left(\widehat{\theta}_{n}^{R}\right)=H\left(\widetilde{\theta}_{n}\right)\left(\widehat{\theta}_{n}^{R}-\widehat{\theta}_{n}\right)$
where $\widetilde{\theta}_{n}$ is another intermediate point, and that
$\nabla_{\theta}\ln L\left(\widehat{\theta}_{n}^{R}\right)=J_{g}\left(\widehat{\theta}_{n}^{R}\right)^{\top}\lambda$
where $J_{g}$ is the Jacobian of $g$ and $\lambda$ is the Lagrange multiplier of the constrained problem, which satisfies
$\lambda=-\left[J_{g}\left(\check{\theta}_{n}\right)H\left(\widetilde{\theta}_{n}\right)^{-1}J_{g}\left(\widehat{\theta}_{n}^{R}\right)^{\top}\right]^{-1}g\left(\widehat{\theta}_{n}\right)$
Note that the expression for the Lagrange multiplier includes a third intermediate point $\check{\theta}_{n}$, which comes from a mean value expansion of the constraint $g\left(\widehat{\theta}_{n}^{R}\right)=0$ around $\widehat{\theta}_{n}$. By putting all these things together, we obtain
$LR_{n}=\frac{1}{n}\lambda^{\top}J_{g}\left(\widehat{\theta}_{n}^{R}\right)V_{2,n}V_{1,n}^{-1}V_{2,n}J_{g}\left(\widehat{\theta}_{n}^{R}\right)^{\top}\lambda$
where we have defined
$V_{1,n}=\left[-\frac{1}{n}H\left(\overline{\theta}_{n}\right)\right]^{-1}\quad\text{and}\quad V_{2,n}=\left[-\frac{1}{n}H\left(\widetilde{\theta}_{n}\right)\right]^{-1}$
If we also define
$B_{n}=J_{g}\left(\check{\theta}_{n}\right)V_{2,n}J_{g}\left(\widehat{\theta}_{n}^{R}\right)^{\top}$
so that $\lambda=nB_{n}^{-1}g\left(\widehat{\theta}_{n}\right)$, the test statistic can be written as
$LR_{n}=n\,g\left(\widehat{\theta}_{n}\right)^{\top}W_{n}\,g\left(\widehat{\theta}_{n}\right)$
where we have used the fact that $V_{1,n}$ and $V_{2,n}$ are symmetric and we have defined
$W_{n}=\left(B_{n}^{-1}\right)^{\top}J_{g}\left(\widehat{\theta}_{n}^{R}\right)V_{2,n}V_{1,n}^{-1}V_{2,n}J_{g}\left(\widehat{\theta}_{n}^{R}\right)^{\top}B_{n}^{-1}$

Under the null hypothesis, both $\widehat{\theta}_{n}$ and $\widehat{\theta}_{n}^{R}$ converge in probability to $\theta_{0}$. As a consequence, $\overline{\theta}_{n}$, $\widetilde{\theta}_{n}$ and $\check{\theta}_{n}$ also converge in probability to $\theta_{0}$, because their entries lie between the corresponding entries of $\widehat{\theta}_{n}^{R}$ and $\widehat{\theta}_{n}$. Furthermore, $V_{1,n}$ and $V_{2,n}$ converge in probability to $V$, the asymptotic covariance matrix of $\sqrt{n}\left(\widehat{\theta}_{n}-\theta_{0}\right)$. Therefore, by the continuous mapping theorem, we have the following result:
$W_{n}\overset{P}{\longrightarrow}W=\left[J_{g}\left(\theta_{0}\right)VJ_{g}\left(\theta_{0}\right)^{\top}\right]^{-1}$
Thus, we can write the likelihood ratio statistic as a sequence of quadratic forms
$LR_{n}=\left[\sqrt{n}g\left(\widehat{\theta}_{n}\right)\right]^{\top}W_{n}\left[\sqrt{n}g\left(\widehat{\theta}_{n}\right)\right]$
where
$\sqrt{n}g\left(\widehat{\theta}_{n}\right)\overset{d}{\longrightarrow}N\left(0,W^{-1}\right)$
and $W_{n}\overset{P}{\longrightarrow}W$. As we have proved in the lecture on the Wald test, such a sequence of quadratic forms converges in distribution to a Chi-square random variable with $r$ degrees of freedom.
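The proposition can also be checked numerically. In the following Monte Carlo sketch (our illustration, not part of the lecture), the model is normal and the single restriction is that the mean equals zero; in this case the statistic has the closed form $LR_{n}=n\ln\left(\widetilde{\sigma}_{n}^{2}/\widehat{\sigma}_{n}^{2}\right)$, where $\widetilde{\sigma}_{n}^{2}$ and $\widehat{\sigma}_{n}^{2}$ are the restricted and unrestricted maximum likelihood estimates of the variance:

    import numpy as np
    from scipy.stats import chi2

    rng = np.random.default_rng(42)
    n, reps = 500, 10_000

    # reps independent samples generated under the null hypothesis mu = 0
    x = rng.normal(loc=0.0, scale=1.0, size=(reps, n))

    # unrestricted MLE of the variance (mu estimated by the sample mean)
    s2_unrestricted = ((x - x.mean(axis=1, keepdims=True)) ** 2).mean(axis=1)

    # restricted MLE of the variance (mu fixed at 0 by the null hypothesis)
    s2_restricted = (x ** 2).mean(axis=1)

    # closed-form likelihood ratio statistic for this model
    lr = n * np.log(s2_restricted / s2_unrestricted)

    # empirical rejection rate at the 10% asymptotic critical value (r = 1)
    z = chi2.ppf(0.90, df=1)
    print(np.mean(lr > z))  # should be close to 0.10

If the proposition holds, the empirical rejection rate should be close to the nominal size of 10%, and the empirical distribution of lr should be close to a Chi-square distribution with one degree of freedom.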

Note that the likelihood ratio statistic, unlike the statistics used in the Wald test and in the score test, depends only on parameter estimates and not on their asymptotic covariance matrix. This can be an advantage if the latter is difficult to estimate.

The test

In the likelihood ratio test, the null hypothesis is rejected if
$LR_{n}>z$
where $z$ is a pre-specified critical value.

The size of the test can be approximated by its asymptotic value
$\alpha=P\left(LR_{n}>z\right)\approx1-F\left(z\right)$
where $F\left(z\right)$ is the cumulative distribution function of a Chi-square random variable having $r$ degrees of freedom.

By appropriately choosing $z$, it is possible to achieve a pre-specified size $\alpha$, as follows:
$z=F^{-1}\left(1-\alpha\right)$
where $F^{-1}$ denotes the inverse of the distribution function $F$.
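In practice, the critical value and the p-value are obtained from a Chi-square quantile and distribution function. A short sketch of this computation, assuming Python and scipy (an arbitrary choice of ours; the statistic value below is an illustrative placeholder):

    from scipy.stats import chi2

    r = 2         # number of restrictions (degrees of freedom)
    alpha = 0.10  # desired size of the test
    LR = 3.2      # value of the statistic (illustrative placeholder)

    z = chi2.ppf(1 - alpha, df=r)  # critical value z = F^{-1}(1 - alpha)
    p_value = chi2.sf(LR, df=r)    # p-value = 1 - F(LR)

    print(z, p_value, LR > z)      # reject the null hypothesis when LR > z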

Example

The next example illustrates how the likelihood ratio statistic can be used.

Example Let $\Theta=\mathbb{R}^{3}$, that is, the parameter space is the set of all $3$-dimensional real vectors. Denote the three entries of the true parameter $\theta_{0}$ by $\theta_{0,1}$, $\theta_{0,2}$ and $\theta_{0,3}$. Suppose the two restrictions to be tested are
$\theta_{0,1}=\theta_{0,2}\quad\text{and}\quad\theta_{0,3}=0$
so that $g$ is a function $g:\mathbb{R}^{3}\rightarrow\mathbb{R}^{2}$ defined by
$g\left(\theta\right)=\left[\begin{array}{c}\theta_{1}-\theta_{2}\\\theta_{3}\end{array}\right]$
We have that $r=2$ and the Jacobian of $g$ is
$J_{g}\left(\theta\right)=\left[\begin{array}{ccc}1 & -1 & 0\\0 & 0 & 1\end{array}\right]$
It has rank $r=2$ because its two rows are linearly independent. Suppose we have obtained the restricted estimate $\widehat{\theta}_{n}^{R}$ and the unrestricted one $\widehat{\theta}_{n}$, and that we know the values of the log-likelihood at the two estimates. These two values are used to compute the value of the test statistic
$LR_{n}=2\left[\ln L\left(\widehat{\theta}_{n};\xi_{n}\right)-\ln L\left(\widehat{\theta}_{n}^{R};\xi_{n}\right)\right]$
According to the rank calculations above, the statistic has a Chi-square distribution with $r=2$ degrees of freedom. Let us fix the size of the test at $\alpha=10\%$. Then, the critical value $z$ is
$z=F^{-1}\left(1-\alpha\right)=F^{-1}\left(0.90\right)\approx4.61$
where $F\left(z\right)$ is the distribution function of a Chi-square random variable with $2$ degrees of freedom and $F^{-1}\left(0.90\right)$ can be calculated with any statistical software (e.g., in MATLAB, with the command chi2inv(0.90,2)). Suppose that the computed value of the statistic turns out to be below the critical value, that is,
$LR_{n}\leq z$
As a consequence, the null hypothesis cannot be rejected.
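The arithmetic of the example can be replicated in a few lines of Python. In this sketch the two log-likelihood values are hypothetical placeholders, chosen so that the statistic falls below the critical value, as in the example:

    from scipy.stats import chi2

    # hypothetical maximized log-likelihoods (illustrative placeholders)
    loglik_unrestricted = -115.2
    loglik_restricted = -116.9

    LR = 2.0 * (loglik_unrestricted - loglik_restricted)  # = 3.4
    z = chi2.ppf(0.90, df=2)  # approx. 4.61, same as MATLAB's chi2inv(0.90,2)

    print(LR, z, LR > z)  # LR < z: the null hypothesis cannot be rejected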
