
Marginal and conditional distributions of a multivariate normal vector

by Marco Taboga, PhD

This lecture discusses how to derive the marginal and conditional distributions of one or more entries of a multivariate normal vector.

Table of Contents

- The multivariate normal vector
- Partition of the vector
- Partition of the parameters
- Normality of the sub-vectors
- Independence of the sub-vectors
- Schur complement
- Determinant of a block matrix
- Factorization of joint density functions
- Partition of the precision matrix
- Distributions conditional on realizations
- How to cite

The multivariate normal vector

A $K\times 1$ random vector $X$ is multivariate normal if its joint probability density function is
$$f_X(x) = (2\pi)^{-K/2} \left(\det V\right)^{-1/2} \exp\left(-\frac{1}{2}(x-\mu)^{\top} V^{-1} (x-\mu)\right)$$
where:

- $\mu$ is the $K\times 1$ mean vector;
- $V$ is the $K\times K$ covariance matrix (assumed invertible).
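As a numerical illustration (a minimal sketch assuming the NumPy and SciPy libraries; all parameter values are made up for the example), we can evaluate this density directly and compare it with a library implementation:

```python
import numpy as np
from scipy.stats import multivariate_normal

# Illustrative parameters for a K = 3 multivariate normal vector.
mu = np.array([1.0, -0.5, 2.0])                      # K x 1 mean vector
V = np.array([[2.0, 0.3, 0.1],
              [0.3, 1.0, 0.2],
              [0.1, 0.2, 1.5]])                      # K x K covariance matrix

def mvn_pdf(x, mu, V):
    """Evaluate the multivariate normal density formula at x."""
    K = mu.size
    d = x - mu
    quad = d @ np.linalg.solve(V, d)                 # (x-mu)' V^{-1} (x-mu)
    return (2 * np.pi) ** (-K / 2) * np.linalg.det(V) ** (-0.5) * np.exp(-0.5 * quad)

x = np.array([0.8, 0.0, 1.9])
print(mvn_pdf(x, mu, V))                             # direct evaluation of the formula
print(multivariate_normal(mean=mu, cov=V).pdf(x))    # SciPy gives the same value
```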

Partition of the vector

We partition $X$ into two sub-vectors $X_{a}$ and $X_{b}$ such that
$$X = \begin{bmatrix} X_a \\ X_b \end{bmatrix}$$

The sub-vectors $X_{a}$ and $X_{b}$ have dimensions $K_{a}\times 1$ and $K_{b}\times 1$ respectively. Moreover, $K_{a}+K_{b}=K$.

Partition of the parameters

We partition the mean vector and the covariance matrix as follows:
$$\mu = \begin{bmatrix} \mu_a \\ \mu_b \end{bmatrix}$$
and
$$V = \begin{bmatrix} V_a & V_{ab} \\ V_{ba} & V_b \end{bmatrix}$$

where:

- $\mu_a = \mathrm{E}[X_a]$ and $\mu_b = \mathrm{E}[X_b]$ are the mean vectors of the two sub-vectors;
- $V_a = \mathrm{Var}[X_a]$ and $V_b = \mathrm{Var}[X_b]$ are their covariance matrices;
- $V_{ab} = \mathrm{Cov}[X_a, X_b]$ and $V_{ba} = V_{ab}^{\top}$ are the cross-covariance matrices.
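In code, the partition amounts to slicing the mean vector and the covariance matrix (an illustrative NumPy sketch; the numbers are hypothetical):

```python
import numpy as np

# Hypothetical full parameters (K = 3), partitioned with K_a = 2, K_b = 1.
mu = np.array([1.0, -0.5, 2.0])
V = np.array([[2.0, 0.3, 0.1],
              [0.3, 1.0, 0.2],
              [0.1, 0.2, 1.5]])
Ka = 2

mu_a, mu_b = mu[:Ka], mu[Ka:]          # mean sub-vectors
V_a  = V[:Ka, :Ka]                     # Var[X_a]
V_b  = V[Ka:, Ka:]                     # Var[X_b]
V_ab = V[:Ka, Ka:]                     # Cov[X_a, X_b]
V_ba = V[Ka:, :Ka]                     # equals V_ab.T because V is symmetric
assert np.allclose(V_ba, V_ab.T)
```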

Normality of the sub-vectors

The following proposition states that the marginal distributions of the two sub-vectors are also multivariate normal.

Proposition Both $X_{a}$ and $X_{b}$ have a multivariate normal distribution:
$$X_a \sim N(\mu_a, V_a), \qquad X_b \sim N(\mu_b, V_b)$$

Proof

The random vector $X_{a}$ can be written as a linear transformation of $X$:
$$X_a = A X, \qquad A = \begin{bmatrix} I_{K_a} & 0 \end{bmatrix}$$
where $A$ is a $K_{a}\times K$ matrix whose entries are either zero or one ($I_{K_a}$ denotes the $K_a \times K_a$ identity matrix). Thus, $X_{a}$ has a multivariate normal distribution, because it is a linear transformation of the multivariate normal random vector $X$ and multivariate normality is preserved by linear transformations (see the lecture on Linear combinations of normal random variables). Its parameters are $A\mu = \mu_a$ and $A V A^{\top} = V_a$. Also $X_{b}$ has a multivariate normal distribution, because it can be written as a linear transformation of $X$:
$$X_b = B X, \qquad B = \begin{bmatrix} 0 & I_{K_b} \end{bmatrix}$$
where $B$ is a $K_{b}\times K$ matrix whose entries are either zero or one.
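The selection matrices $A$ and $B$ used in the proof are easy to build explicitly. The following NumPy sketch (with hypothetical dimensions and parameter values) checks that they pick out exactly the blocks $\mu_a$, $V_a$, $\mu_b$ and $V_b$:

```python
import numpy as np

# Hypothetical setup: K = 3, K_a = 2, K_b = 1.
Ka, Kb = 2, 1
mu = np.array([1.0, -0.5, 2.0])
V = np.array([[2.0, 0.3, 0.1],
              [0.3, 1.0, 0.2],
              [0.1, 0.2, 1.5]])

# Selection matrices A = [I 0] and B = [0 I] from the proof.
A = np.hstack([np.eye(Ka), np.zeros((Ka, Kb))])
B = np.hstack([np.zeros((Kb, Ka)), np.eye(Kb)])

# The parameters of the linear transformations are the corresponding blocks.
assert np.allclose(A @ mu, mu[:Ka])           # A mu = mu_a
assert np.allclose(A @ V @ A.T, V[:Ka, :Ka])  # A V A' = V_a
assert np.allclose(B @ mu, mu[Ka:])           # B mu = mu_b
assert np.allclose(B @ V @ B.T, V[Ka:, Ka:])  # B V B' = V_b
```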

Independence of the sub-vectors

The following proposition states a necessary and sufficient condition for the independence of the two sub-vectors.

Proposition $X_{a}$ and $X_{b}$ are independent if and only if $V_{ab}=0$.

Proof

$X_{a}$ and $X_{b}$ are independent if and only if their joint moment generating function is equal to the product of their individual moment generating functions (see the lecture entitled Joint moment generating function). Since $X_{a}$ is multivariate normal, its joint moment generating function is
$$M_{X_a}(t_a) = \exp\left(t_a^{\top}\mu_a + \frac{1}{2} t_a^{\top} V_a t_a\right)$$
The joint moment generating function of $X_{b}$ is
$$M_{X_b}(t_b) = \exp\left(t_b^{\top}\mu_b + \frac{1}{2} t_b^{\top} V_b t_b\right)$$
The joint moment generating function of $X_{a}$ and $X_{b}$, which is just the joint moment generating function of $X$, is
$$M_X(t) = \exp\left(t^{\top}\mu + \frac{1}{2} t^{\top} V t\right) = \exp\left(t_a^{\top}\mu_a + t_b^{\top}\mu_b + \frac{1}{2} t_a^{\top} V_a t_a + \frac{1}{2} t_b^{\top} V_b t_b + t_a^{\top} V_{ab} t_b\right)$$
from which it is obvious that $M_X(t) = M_{X_a}(t_a)\, M_{X_b}(t_b)$ for all $t$ if and only if $V_{ab}=0$.
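The "if" direction can also be seen numerically: when $V_{ab}=0$, the joint density equals the product of the two marginal densities. A minimal sketch, assuming SciPy and hypothetical parameter values:

```python
import numpy as np
from scipy.stats import multivariate_normal

# Hypothetical parameters with V_ab = 0 (block-diagonal covariance).
mu_a, mu_b = np.array([1.0, -0.5]), np.array([2.0])
V_a = np.array([[2.0, 0.3],
                [0.3, 1.0]])
V_b = np.array([[1.5]])
mu = np.concatenate([mu_a, mu_b])
V = np.block([[V_a, np.zeros((2, 1))],
              [np.zeros((1, 2)), V_b]])

# With V_ab = 0, the joint density factorizes into the marginal densities,
# which is exactly what independence means for continuous random vectors.
x_a, x_b = np.array([0.8, 0.0]), np.array([1.9])
joint = multivariate_normal(mu, V).pdf(np.concatenate([x_a, x_b]))
prod = multivariate_normal(mu_a, V_a).pdf(x_a) * multivariate_normal(mu_b, V_b).pdf(x_b)
assert np.isclose(joint, prod)
```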

Schur complement

In order to derive the conditional distributions, we are going to rely on the following results, demonstrated in the lecture on Schur complements.

Proposition Let $V_{a}$ be invertible. Let $V/V_{a}$ be the Schur complement of $V_{a}$ in $V$, defined as
$$V/V_a = V_b - V_{ba} V_a^{-1} V_{ab}$$
If $V/V_{a}$ is invertible, then $V$ is invertible and
$$V^{-1} = \begin{bmatrix} V_a^{-1} + V_a^{-1} V_{ab} (V/V_a)^{-1} V_{ba} V_a^{-1} & -V_a^{-1} V_{ab} (V/V_a)^{-1} \\ -(V/V_a)^{-1} V_{ba} V_a^{-1} & (V/V_a)^{-1} \end{bmatrix}$$

Proposition Let $V_{b}$ be invertible. Let $V/V_{b}$ be the Schur complement of $V_{b}$ in $V$, defined as
$$V/V_b = V_a - V_{ab} V_b^{-1} V_{ba}$$
If $V/V_{b}$ is invertible, then $V$ is invertible and
$$V^{-1} = \begin{bmatrix} (V/V_b)^{-1} & -(V/V_b)^{-1} V_{ab} V_b^{-1} \\ -V_b^{-1} V_{ba} (V/V_b)^{-1} & V_b^{-1} + V_b^{-1} V_{ba} (V/V_b)^{-1} V_{ab} V_b^{-1} \end{bmatrix}$$
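The block-inverse formula of the second proposition can be verified numerically (a NumPy sketch with a hypothetical positive-definite $V$):

```python
import numpy as np

# Hypothetical positive-definite covariance matrix, partitioned with K_a = 2.
V = np.array([[2.0, 0.3, 0.1],
              [0.3, 1.0, 0.2],
              [0.1, 0.2, 1.5]])
Ka = 2
V_a, V_ab = V[:Ka, :Ka], V[:Ka, Ka:]
V_ba, V_b = V[Ka:, :Ka], V[Ka:, Ka:]

# Schur complement of V_b in V.
S = V_a - V_ab @ np.linalg.inv(V_b) @ V_ba           # V/V_b
S_inv = np.linalg.inv(S)
Vb_inv = np.linalg.inv(V_b)

# Assemble the block inverse from the second proposition.
block_inverse = np.block([
    [S_inv,                   -S_inv @ V_ab @ Vb_inv],
    [-Vb_inv @ V_ba @ S_inv,  Vb_inv + Vb_inv @ V_ba @ S_inv @ V_ab @ Vb_inv],
])
assert np.allclose(block_inverse, np.linalg.inv(V))  # matches the direct inverse
```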

Determinant of a block matrix

We will also need the following results on the determinant of a block matrix.

Proposition If $V_{a}$ is invertible, then
$$\det(V) = \det(V_a)\det(V/V_a) = \det(V_a)\det\left(V_b - V_{ba} V_a^{-1} V_{ab}\right)$$

Proposition If $V_{b}$ is invertible, then
$$\det(V) = \det(V_b)\det(V/V_b) = \det(V_b)\det\left(V_a - V_{ab} V_b^{-1} V_{ba}\right)$$
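Again, a quick numerical check of the two determinant identities (same hypothetical $V$ as above; NumPy assumed):

```python
import numpy as np

# Hypothetical partitioned covariance matrix (K_a = 2, K_b = 1).
V = np.array([[2.0, 0.3, 0.1],
              [0.3, 1.0, 0.2],
              [0.1, 0.2, 1.5]])
Ka = 2
V_a, V_ab = V[:Ka, :Ka], V[:Ka, Ka:]
V_ba, V_b = V[Ka:, :Ka], V[Ka:, Ka:]

# det(V) = det(V_a) det(V/V_a) = det(V_b) det(V/V_b)
det_via_Va = np.linalg.det(V_a) * np.linalg.det(V_b - V_ba @ np.linalg.inv(V_a) @ V_ab)
det_via_Vb = np.linalg.det(V_b) * np.linalg.det(V_a - V_ab @ np.linalg.inv(V_b) @ V_ba)
assert np.isclose(det_via_Va, np.linalg.det(V))
assert np.isclose(det_via_Vb, np.linalg.det(V))
```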

Factorization of joint density functions

Another important result that we are going to use concerns the factorization of joint density functions.

Write the joint density of the multivariate normal vector $X$ as
$$f_X(x) = f_X(x_a, x_b)$$
where $x_a$ and $x_b$ are the realizations of the two sub-vectors.

Suppose that we are able to find a factorization
$$f_X(x_a, x_b) = g(x_a, x_b)\, h(x_b)$$
such that $g(x_a, x_b)$ is a valid probability density function every time that we fix $x_{b}$ and we see $g(x_a, x_b)$ as a function of $x_{a}$.

Then,
$$g(x_a, x_b) = f_{X_a \mid X_b = x_b}(x_a), \qquad h(x_b) = f_{X_b}(x_b)$$
where:

- $f_{X_a \mid X_b = x_b}(x_a)$ is the conditional density of $X_a$ given $X_b = x_b$;
- $f_{X_b}(x_b)$ is the marginal density of $X_b$.

Similarly, if we find a factorization
$$f_X(x_a, x_b) = \widetilde{g}(x_a, x_b)\, \widetilde{h}(x_a)$$
such that $\widetilde{g}(x_a, x_b)$ is a valid probability density function every time that we fix $x_{a}$ and we see $\widetilde{g}(x_a, x_b)$ as a function of $x_{b}$, then
$$\widetilde{g}(x_a, x_b) = f_{X_b \mid X_a = x_a}(x_b), \qquad \widetilde{h}(x_a) = f_{X_a}(x_a)$$
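To make the factorization concrete, the following sketch checks it in the bivariate case ($K_a = K_b = 1$), using the conditional mean and variance formulas that are derived in the remainder of this lecture; SciPy is assumed and the parameter values are illustrative:

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

# Bivariate illustration of f_X(x_a, x_b) = f(x_a | x_b) * f(x_b),
# with conditional mean mu_a + v_ab/v_b * (x_b - mu_b) and
# conditional variance v_a - v_ab**2/v_b (derived in the next sections).
mu_a, mu_b = 1.0, 2.0
v_a, v_b, v_ab = 2.0, 1.5, 0.6

joint = multivariate_normal([mu_a, mu_b], [[v_a, v_ab], [v_ab, v_b]])
x_a, x_b = 0.8, 1.9

cond_mean = mu_a + v_ab / v_b * (x_b - mu_b)
cond_var = v_a - v_ab**2 / v_b
factorized = norm(cond_mean, np.sqrt(cond_var)).pdf(x_a) * norm(mu_b, np.sqrt(v_b)).pdf(x_b)
assert np.isclose(joint.pdf([x_a, x_b]), factorized)
```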

Partition of the precision matrix

The blocks of the inverse of the covariance matrix (known as the precision matrix) are denoted as follows:
$$H = V^{-1} = \begin{bmatrix} H_a & H_{ab} \\ H_{ba} & H_b \end{bmatrix}$$

Distributions conditional on realizations

We are now ready to derive the conditional distributions.

Proposition Suppose that $V_{b}$ and its Schur complement in $V$ are invertible. Then, conditional on $X_{b}=x_{b}$, the vector $X_{a}$ has a multivariate normal distribution with mean
$$\mu_a^{\ast} = \mu_a + V_{ab} V_b^{-1} (x_b - \mu_b)$$
and covariance matrix
$$V_a^{\ast} = V_a - V_{ab} V_b^{-1} V_{ba}$$

Proof

First, define the quadratic form
$$q(x_a, x_b) = (x-\mu)^{\top} V^{-1} (x-\mu)$$
and note that
$$\begin{aligned} q(x_a, x_b) &\overset{(A)}{=} (x_a-\mu_a)^{\top} H_a (x_a-\mu_a) + (x_a-\mu_a)^{\top} H_{ab} (x_b-\mu_b) \\ &\qquad + (x_b-\mu_b)^{\top} H_{ba} (x_a-\mu_a) + (x_b-\mu_b)^{\top} H_b (x_b-\mu_b) \\ &\overset{(B)}{=} (x_a-\mu_a)^{\top} (V/V_b)^{-1} (x_a-\mu_a) - 2\,(x_a-\mu_a)^{\top} (V/V_b)^{-1} V_{ab} V_b^{-1} (x_b-\mu_b) \\ &\qquad + (x_b-\mu_b)^{\top}\left(V_b^{-1} + V_b^{-1} V_{ba} (V/V_b)^{-1} V_{ab} V_b^{-1}\right)(x_b-\mu_b) \\ &\overset{(C)}{=} (x_a-\mu_a^{\ast})^{\top} (V_a^{\ast})^{-1} (x_a-\mu_a^{\ast}) + (x_b-\mu_b)^{\top} V_b^{-1} (x_b-\mu_b) \end{aligned}$$
where: in step $(A)$ we have used the partition of the precision matrix $H=V^{-1}$; in step $(B)$ we have used the formulae for the blocks of the precision matrix based on the Schur complements; in step $(C)$ we have completed the square and defined
$$\mu_a^{\ast} = \mu_a + V_{ab} V_b^{-1} (x_b - \mu_b)$$
and
$$V_a^{\ast} = V_a - V_{ab} V_b^{-1} V_{ba} = V/V_b$$
We can now factorize the joint density of $X_{a}$ and $X_{b}$, using the fact that $\det(V) = \det(V_b)\det(V/V_b) = \det(V_b)\det(V_a^{\ast})$:
$$f_X(x_a, x_b) = (2\pi)^{-K/2}\left(\det V\right)^{-1/2}\exp\left(-\frac{1}{2} q(x_a, x_b)\right) = f_1(x_a)\, f_2(x_b)$$
where
$$f_1(x_a) = (2\pi)^{-K_a/2}\left(\det V_a^{\ast}\right)^{-1/2}\exp\left(-\frac{1}{2}(x_a-\mu_a^{\ast})^{\top}(V_a^{\ast})^{-1}(x_a-\mu_a^{\ast})\right)$$
is the density of a multivariate normal vector with mean $\mu_a^{\ast}$ and covariance matrix $V_a^{\ast}$, and
$$f_2(x_b) = (2\pi)^{-K_b/2}\left(\det V_b\right)^{-1/2}\exp\left(-\frac{1}{2}(x_b-\mu_b)^{\top} V_b^{-1}(x_b-\mu_b)\right)$$
is the density of a multivariate normal vector with mean $\mu_b$ and covariance matrix $V_b$. By the factorization result above, $f_1$ is the conditional density of $X_a$ given $X_b = x_b$, which proves the proposition.

Proposition Suppose that $V_{a}$ and its Schur complement in $V$ are invertible. Then, conditional on $X_{a}=x_{a}$, the vector $X_{b}$ has a multivariate normal distribution with mean
$$\mu_b^{\ast} = \mu_b + V_{ba} V_a^{-1} (x_a - \mu_a)$$
and covariance matrix
$$V_b^{\ast} = V_b - V_{ba} V_a^{-1} V_{ab}$$

Proof

Analogous to the previous proof.
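The two routes to the conditional moments, through the covariance blocks (as in the propositions) and through the precision matrix (as in the proof), can be cross-checked numerically. A NumPy sketch with hypothetical parameters:

```python
import numpy as np

# Hypothetical partitioned parameters (K_a = 2, K_b = 1).
mu = np.array([1.0, -0.5, 2.0])
V = np.array([[2.0, 0.3, 0.1],
              [0.3, 1.0, 0.2],
              [0.1, 0.2, 1.5]])
Ka = 2
mu_a, mu_b = mu[:Ka], mu[Ka:]
V_a, V_ab = V[:Ka, :Ka], V[:Ka, Ka:]
V_ba, V_b = V[Ka:, :Ka], V[Ka:, Ka:]
x_b = np.array([1.9])                                # conditioning value

# Conditional moments from the first proposition.
cond_mean = mu_a + V_ab @ np.linalg.solve(V_b, x_b - mu_b)
cond_cov = V_a - V_ab @ np.linalg.solve(V_b, V_ba)

# Cross-check through the precision matrix: the Schur-complement block
# formulas imply V_a* = H_a^{-1} and mu_a* = mu_a - H_a^{-1} H_ab (x_b - mu_b).
H = np.linalg.inv(V)
H_a, H_ab = H[:Ka, :Ka], H[:Ka, Ka:]
assert np.allclose(cond_cov, np.linalg.inv(H_a))
assert np.allclose(cond_mean, mu_a - np.linalg.solve(H_a, H_ab @ (x_b - mu_b)))
```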

How to cite

Please cite as:

Taboga, Marco (2021). "Marginal and conditional distributions of a multivariate normal vector", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/probability-distributions/multivariate-normal-distribution-partitioning.
