Statlect - The Digital Textbook

Slutsky's theorem

Slutsky's theorem concerns the convergence in distribution of a transformation of two sequences of random vectors: one sequence converges in distribution, while the other converges in probability to a constant.


Joint convergence in distribution

Slutsky's theorem is based on the fact that if a sequence of random vectors converges in distribution and another sequence converges in probability to a constant, then they are jointly convergent in distribution.

Proposition (Joint convergence) Let $\{X_n\}$ and $\{Y_n\}$ be two sequences of random vectors. If $X_n \overset{d}{\rightarrow} X$ and $Y_n \overset{p}{\rightarrow} c$, where $c$ is a constant, then
$$(X_n, Y_n) \overset{d}{\rightarrow} (X, c)$$

Proof

Since $Y_n \overset{p}{\rightarrow} c$, the difference $(X_n, Y_n) - (X_n, c)$ converges in probability to $0$. Moreover, $(X_n, c) \overset{d}{\rightarrow} (X, c)$. Because two sequences whose difference converges in probability to zero have the same limit in distribution, it follows that $(X_n, Y_n) \overset{d}{\rightarrow} (X, c)$.

In the proposition above we have indicated convergence in probability by $\overset{p}{\rightarrow}$ and convergence in distribution by $\overset{d}{\rightarrow}$.

The theorem

We provide a statement of Slutsky's theorem that is slightly more general than the statement usually found in standard references.

Proposition (Slutsky) Let $\{X_n\}$ and $\{Y_n\}$ be two sequences of random vectors such that $X_n \overset{d}{\rightarrow} X$ and $Y_n \overset{p}{\rightarrow} c$, where $c$ is a constant. Let $g(x, y)$ be a continuous function. Then,
$$g(X_n, Y_n) \overset{d}{\rightarrow} g(X, c)$$

Proof

The couple $(X_n, Y_n)$ is jointly convergent in distribution to $(X, c)$ by the proposition above (Joint convergence). Therefore, by the Continuous Mapping theorem, the fact that $g$ is continuous implies that $g(X_n, Y_n)$ converges in distribution to $g(X, c)$.

The theorem is also valid when $\{X_n\}$ and $\{Y_n\}$ are sequences of random matrices (the reason being that random matrices can be thought of as random vectors whose entries have been rearranged into several columns).

Implications

Since the sum and the product are continuous functions of their operands, Slutsky's theorem implies that
$$X_n + Y_n \overset{d}{\rightarrow} X + c \quad \text{and} \quad X_n Y_n \overset{d}{\rightarrow} Xc$$
when $X_n \overset{d}{\rightarrow} X$, $Y_n \overset{p}{\rightarrow} c$, and the dimensions of $X_n$ and $Y_n$ are such that their sum and/or their product are well-defined.
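These two implications can be illustrated with a quick Monte Carlo sketch (not part of the original text; all distributional choices below are illustrative assumptions). We simulate a sequence $X_n$ that converges in distribution by the central limit theorem and a sequence $Y_n$ that converges in probability to a constant by the law of large numbers, and check that their sum and product have the moments Slutsky's theorem predicts.

```python
import numpy as np

# Illustrative Monte Carlo sketch (all distributional choices are assumptions):
#   X_n = sqrt(n) * mean of n Uniform(-1, 1) draws -> N(0, 1/3) in distribution (CLT)
#   Y_n = mean of n Exponential(1) draws           -> 1 in probability (LLN)
rng = np.random.default_rng(0)
n, reps = 1000, 5000

X_n = np.sqrt(n) * rng.uniform(-1, 1, size=(reps, n)).mean(axis=1)
Y_n = rng.exponential(1.0, size=(reps, n)).mean(axis=1)

# Slutsky: X_n + Y_n -> N(1, 1/3) and X_n * Y_n -> N(0, 1/3)
S = X_n + Y_n
P = X_n * Y_n
print(S.mean(), S.var())  # approximately 1 and 1/3
print(P.mean(), P.var())  # approximately 0 and 1/3
```

The draws of $X_n$ and $Y_n$ are independent here only for convenience; Slutsky's theorem does not require independence, only that $Y_n$ converge in probability to a constant.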

Solved exercises

Below you can find some exercises with explained solutions.

Exercise 1

Let $\{X_n\}$ be a sequence of $K \times 1$ random vectors such that
$$X_n \overset{d}{\rightarrow} X$$
where $X$ is a normal random vector with mean $\mu$ and invertible covariance matrix $V$.

Let $\{A_n\}$ be a sequence of $L \times K$ random matrices such that
$$A_n \overset{p}{\rightarrow} A$$
where $A$ is a constant matrix. Find the limit in distribution of the sequence of products $\{A_n X_n\}$.

Solution

By Slutsky's theorem,
$$A_n X_n \overset{d}{\rightarrow} Y$$
where
$$Y = AX$$
The random vector $Y$ has a multivariate normal distribution, because it is a linear transformation of a multivariate normal random vector (see the lecture entitled Linear combinations of normal random variables). The expected value of $Y$ is
$$\mathrm{E}[Y] = A\,\mathrm{E}[X] = A\mu$$
and its covariance matrix is
$$\mathrm{Var}[Y] = A\,\mathrm{Var}[X]\,A^{\intercal} = AVA^{\intercal}$$
Therefore, the sequence of products $\{A_n X_n\}$ converges in distribution to a multivariate normal random vector with mean $A\mu$ and covariance matrix $AVA^{\intercal}$.
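The conclusion of this exercise can be checked numerically. The sketch below uses illustrative choices of $K$, $L$, $\mu$, $V$ and $A$ that are not from the text: it draws from the limit distribution of $X_n$, perturbs $A$ to mimic a sequence $A_n$ converging in probability to $A$, and compares the sample moments of the products with $A\mu$ and $AVA^{\intercal}$.

```python
import numpy as np

# Illustrative check of Exercise 1 (K = 2, L = 2; mu, V, A chosen arbitrarily).
rng = np.random.default_rng(1)
mu = np.array([1.0, -2.0])
V = np.array([[2.0, 0.5],
              [0.5, 1.0]])
A = np.array([[1.0, 1.0],
              [0.0, 2.0]])

n, reps = 1000, 50000
X = rng.multivariate_normal(mu, V, size=reps)        # draws from the limit of X_n
A_n = A + rng.standard_normal((reps, 2, 2)) / n      # A_n converges in probability to A
Y = np.einsum('rij,rj->ri', A_n, X)                  # the products A_n X_n

mean_est = Y.mean(axis=0)   # should be close to A @ mu
cov_est = np.cov(Y.T)       # should be close to A @ V @ A.T
```

Replacing `A_n` with the constant matrix `A` changes the estimates only negligibly, which is exactly what Slutsky's theorem asserts in the limit.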

Exercise 2

Let $\{X_n\}$ be a sequence of $K \times 1$ random vectors such that
$$X_n \overset{d}{\rightarrow} X$$
where $X$ is a normal random vector with mean $0$ and invertible covariance matrix $V$.

Let $\{V_n\}$ be a sequence of $K \times K$ random matrices such that
$$V_n \overset{p}{\rightarrow} V$$
Find the limit in distribution of the sequence
$$X_n^{\intercal} V_n^{-1} X_n$$

Solution

By the Continuous Mapping theorem,
$$V_n^{-1} \overset{p}{\rightarrow} V^{-1}$$
Therefore, by Slutsky's theorem (joint convergence),
$$(X_n, V_n^{-1}) \overset{d}{\rightarrow} (X, V^{-1})$$
Using the Continuous Mapping theorem again, we get
$$X_n^{\intercal} V_n^{-1} X_n \overset{d}{\rightarrow} X^{\intercal} V^{-1} X$$
Since $V$ is an invertible covariance matrix, there exists an invertible matrix $\Sigma$ such that
$$V = \Sigma \Sigma^{\intercal}$$
Therefore,
$$X^{\intercal} V^{-1} X = X^{\intercal} (\Sigma^{\intercal})^{-1} \Sigma^{-1} X = Z^{\intercal} Z$$
where we have defined
$$Z = \Sigma^{-1} X$$
The random vector $Z$ has a multivariate normal distribution, because it is a linear transformation of a multivariate normal random vector (see the lecture entitled Linear combinations of normal random variables). The expected value of $Z$ is
$$\mathrm{E}[Z] = \Sigma^{-1}\,\mathrm{E}[X] = 0$$
and its covariance matrix is
$$\mathrm{Var}[Z] = \Sigma^{-1} V (\Sigma^{-1})^{\intercal} = \Sigma^{-1} \Sigma \Sigma^{\intercal} (\Sigma^{\intercal})^{-1} = I$$
Thus, $Z$ has a standard multivariate normal distribution (mean $0$ and variance $I$) and
$$Z^{\intercal} Z$$
is a quadratic form in a standard normal random vector. So, $Z^{\intercal} Z$ has a Chi-square distribution with $K$ degrees of freedom. In summary, the sequence $X_n^{\intercal} V_n^{-1} X_n$ converges in distribution to a Chi-square distribution with $K$ degrees of freedom.
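A numerical sanity check of this result (an illustrative sketch; the values of $K$, $V$ and the construction of $V_n$ below are assumptions, not from the text): draw from the limit distribution of $X_n$, perturb $V$ to mimic $V_n \overset{p}{\rightarrow} V$, and verify that the quadratic form has the mean $K$ and variance $2K$ of a Chi-square distribution with $K$ degrees of freedom.

```python
import numpy as np

# Illustrative check of Exercise 2 (K = 3; V chosen arbitrarily).
rng = np.random.default_rng(2)
K, n, reps = 3, 2000, 20000
V = np.array([[2.0, 0.3, 0.0],
              [0.3, 1.0, 0.2],
              [0.0, 0.2, 1.5]])

X = rng.multivariate_normal(np.zeros(K), V, size=reps)   # draws from the limit of X_n
E = rng.standard_normal((K, K)) / n
V_n = V + (E + E.T) / 2                                  # symmetric perturbation, V_n ->p V
Q = np.einsum('ri,ij,rj->r', X, np.linalg.inv(V_n), X)   # X_n' V_n^{-1} X_n

# A Chi-square(K) random variable has mean K and variance 2K.
print(Q.mean(), Q.var())
```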

Exercise 3

Let everything be as in the previous exercise, except for the fact that now $X$ has mean $\mu$. Find the limit in distribution of the sequence
$$(X_n - \mu_n)^{\intercal} V_n^{-1} (X_n - \mu_n)$$
where $\{\mu_n\}$ is a sequence of $K \times 1$ random vectors converging in probability to $\mu$.

Solution

Define
$$Y_n = X_n - \mu_n$$
By Slutsky's theorem,
$$Y_n \overset{d}{\rightarrow} Y$$
where
$$Y = X - \mu$$
is a multivariate normal random variable with mean $0$ and variance $V$. Thus, we can use the results of the previous exercise on the sequence
$$Y_n^{\intercal} V_n^{-1} Y_n$$
which is the same as
$$(X_n - \mu_n)^{\intercal} V_n^{-1} (X_n - \mu_n)$$
and we find that it converges in distribution to a Chi-square distribution with $K$ degrees of freedom.
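The same kind of Monte Carlo check works here (an illustrative sketch; the specific $\mu$, $V$, and the constructions of $\mu_n$ and $V_n$ are assumptions): centering by an estimate $\mu_n$ that converges in probability to $\mu$ leaves the Chi-square limit unchanged.

```python
import numpy as np

# Illustrative check of Exercise 3 (K = 3; mu, V chosen arbitrarily).
rng = np.random.default_rng(3)
K, n, reps = 3, 2000, 20000
mu = np.array([1.0, -1.0, 0.5])
V = np.array([[2.0, 0.3, 0.0],
              [0.3, 1.0, 0.2],
              [0.0, 0.2, 1.5]])

X = rng.multivariate_normal(mu, V, size=reps)            # draws from the limit of X_n
mu_n = mu + rng.standard_normal((reps, K)) / n           # mu_n ->p mu
E = rng.standard_normal((K, K)) / n
V_n = V + (E + E.T) / 2                                  # V_n ->p V

D = X - mu_n
Q = np.einsum('ri,ij,rj->r', D, np.linalg.inv(V_n), D)   # (X_n - mu_n)' V_n^{-1} (X_n - mu_n)

# Mean and variance should be close to K and 2K (Chi-square with K df).
print(Q.mean(), Q.var())
```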

