# Normal distribution - Quadratic forms

This lecture presents some important results about quadratic forms involving normal random vectors, that is, about forms of the kind $X^\top A X$, where $X$ is a multivariate normal random vector, $A$ is a matrix and $^\top$ denotes transposition.

## Review of relevant results from matrix algebra

Before discussing quadratic forms involving normal random vectors, we review some results from matrix algebra that are used throughout the lecture.

### Orthogonal matrices

A real matrix $P$ is orthogonal if $$P^\top P = I,$$ which also implies $$P P^\top = I,$$ where $I$ is the identity matrix. Of course, if $P$ is orthogonal, then also $P^\top$ is orthogonal.

An important property of orthogonal matrices is the following.

Proposition Let $X$ be a $K \times 1$ standard multivariate normal random vector, i.e., $X \sim N(0, I)$. Let $P$ be a $K \times K$ orthogonal real matrix. Define $$Y = P X.$$ Then $Y$ also has a standard multivariate normal distribution, i.e., $Y \sim N(0, I)$.

Proof

The random vector $Y$ has a multivariate normal distribution because it is a linear transformation of another multivariate normal random vector (see the lecture entitled Linear combinations of normal random variables). $Y$ is standard normal because its expected value is $$\mathrm{E}[Y] = P\,\mathrm{E}[X] = 0$$ and its covariance matrix is $$\mathrm{Var}[Y] = P\,\mathrm{Var}[X]\,P^\top = P I P^\top = P P^\top = I,$$ where the last equality is an immediate consequence of the definition of orthogonal matrix.
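This invariance can be checked numerically. Below is a minimal sketch in NumPy (all names are illustrative): an orthogonal matrix is obtained from a QR decomposition, and the sample covariance of the transformed draws is compared to the identity.

```python
import numpy as np

rng = np.random.default_rng(0)
K = 4

# Build a random orthogonal matrix P from the QR decomposition of a Gaussian matrix.
P, _ = np.linalg.qr(rng.standard_normal((K, K)))

# P is orthogonal: P^T P = I.
assert np.allclose(P.T @ P, np.eye(K))

# Sample many standard normal vectors X and transform them: Y = P X.
X = rng.standard_normal((100_000, K))
Y = X @ P.T

# The sample covariance of Y should be close to the identity matrix.
print(np.round(np.cov(Y, rowvar=False), 2))
```

With 100,000 draws the sample covariance matches the identity to within Monte Carlo error.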

### Symmetric matrices

A real matrix $A$ is symmetric if $$A = A^\top,$$ i.e., $A$ equals its transpose.

Real symmetric matrices have the property that they can be decomposed as $$A = P \Lambda P^\top,$$ where $P$ is an orthogonal matrix and $\Lambda$ is a diagonal matrix (i.e., a matrix whose off-diagonal entries are zero). The diagonal elements of $\Lambda$, which are all real, are the eigenvalues of $A$, and the columns of $P$ are the eigenvectors of $A$.
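As a quick numerical sketch of this decomposition (NumPy's `eigh` routine returns exactly the eigenvalues and the orthonormal eigenvectors of a symmetric matrix; the matrix here is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)

# Build a random symmetric matrix by symmetrizing a Gaussian matrix.
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2

# Spectral decomposition: real eigenvalues and orthonormal eigenvectors.
eigenvalues, P = np.linalg.eigh(A)
Lam = np.diag(eigenvalues)

# A = P Lam P^T, with P orthogonal.
assert np.allclose(P @ Lam @ P.T, A)
assert np.allclose(P.T @ P, np.eye(5))
```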

### Idempotent matrices

A real matrix $A$ is idempotent if $$A A = A,$$ which implies $$A^n = A$$ for any integer $n \geq 1$.

### Symmetric idempotent matrices

If a matrix $A$ is both symmetric and idempotent, then its eigenvalues are either zero or one. In other words, the diagonal entries of the diagonal matrix $\Lambda$ in the decomposition $A = P \Lambda P^\top$ are either zero or one.

Proof

This can be easily seen as follows: $$P \Lambda P^\top = A = A A = P \Lambda P^\top P \Lambda P^\top = P \Lambda^2 P^\top,$$ which implies $$\Lambda = \Lambda^2.$$ But this is possible only if the diagonal entries of $\Lambda$ are either zero or one.
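A standard example of a symmetric idempotent matrix is an orthogonal-projection matrix, and its eigenvalues illustrate the result. A sketch (the projection matrix $H = Z(Z^\top Z)^{-1} Z^\top$ and all names are illustrative choices, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(2)

# An orthogonal-projection matrix H = Z (Z^T Z)^{-1} Z^T is symmetric and idempotent.
Z = rng.standard_normal((6, 2))
H = Z @ np.linalg.inv(Z.T @ Z) @ Z.T

assert np.allclose(H, H.T)       # symmetric
assert np.allclose(H @ H, H)     # idempotent

# Its eigenvalues are (numerically) either zero or one.
eigs = np.linalg.eigvalsh(H)
print(np.round(eigs, 10))
```

Here the projection is onto a 2-dimensional subspace of a 6-dimensional space, so four eigenvalues are zero and two are one.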

### Trace of a matrix

Let $A$ be a $K \times K$ real matrix and denote by $A_{ij}$ the $(i,j)$-th entry of $A$ (i.e., the entry at the intersection of the $i$-th row and the $j$-th column). The trace of $A$, denoted by $\mathrm{tr}(A)$, is $$\mathrm{tr}(A) = \sum_{i=1}^{K} A_{ii}.$$

In other words, the trace is equal to the sum of all the diagonal entries of $A$.

The trace of $A$ enjoys the following important property: $$\mathrm{tr}(A) = \sum_{i=1}^{K} \lambda_i,$$ where $\lambda_1, \ldots, \lambda_K$ are the eigenvalues of $A$.
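Both characterizations of the trace can be verified numerically on an arbitrary matrix (for a non-symmetric matrix the eigenvalues may be complex, but their sum is real):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4))

# Trace: sum of the diagonal entries.
trace = np.trace(A)
assert np.isclose(trace, sum(A[i, i] for i in range(4)))

# The trace also equals the sum of the eigenvalues.
assert np.isclose(trace, np.linalg.eigvals(A).sum().real)
```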

## Quadratic forms in standard multivariate normal random vectors

The following proposition shows that certain quadratic forms in standard normal random vectors have a Chi-square distribution.

Proposition Let $X$ be a $K \times 1$ standard multivariate normal random vector, i.e., $X \sim N(0, I)$. Let $A$ be a $K \times K$ symmetric and idempotent matrix. Let $r$ be the trace of $A$. Define $$Q = X^\top A X.$$ Then $Q$ has a Chi-square distribution with $r$ degrees of freedom.

Proof

Since $A$ is symmetric, it can be decomposed as $$A = P \Lambda P^\top,$$ where $P$ is orthogonal and $\Lambda$ is diagonal. The quadratic form can be written as $$Q = X^\top A X = X^\top P \Lambda P^\top X = Y^\top \Lambda Y,$$ where we have defined $$Y = P^\top X.$$ By the above theorem on orthogonal transformations of standard multivariate normal random vectors, the orthogonality of $P^\top$ implies that $Y \sim N(0, I)$. Since $\Lambda$ is diagonal, we can write the quadratic form as $$Q = \sum_{i=1}^{K} \lambda_i Y_i^2,$$ where $Y_i$ is the $i$-th component of $Y$ and $\lambda_i$ is the $i$-th diagonal entry of $\Lambda$. Since $A$ is symmetric and idempotent, the diagonal entries of $\Lambda$ are either zero or one. Denote by $S$ the set $$S = \{ i : \lambda_i = 1 \}$$ and by $r$ its cardinality, i.e., the number of diagonal entries of $\Lambda$ that are equal to $1$. Since $\lambda_i = 0$ for $i \notin S$, we can write $$Q = \sum_{i \in S} Y_i^2.$$ But the components of a standard normal random vector are mutually independent standard normal random variables. Therefore, $Q$ is the sum of the squares of $r$ independent standard normal random variables. Hence, it has a Chi-square distribution with $r$ degrees of freedom (see the lecture entitled Chi-square distribution for details). Finally, by the properties of idempotent matrices and of the trace of a matrix (see above), $r$ is not only the number of diagonal entries of $\Lambda$ that are equal to $1$, but also the sum of the eigenvalues of $A$. Since the trace of a matrix is equal to the sum of its eigenvalues, $r = \mathrm{tr}(A)$.
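The proposition can be checked by Monte Carlo simulation. A sketch (the symmetric idempotent matrix is taken to be a projection onto a 3-dimensional subspace, an illustrative choice): the simulated quadratic form should have mean $r$ and variance $2r$, the moments of a Chi-square distribution with $r$ degrees of freedom.

```python
import numpy as np

rng = np.random.default_rng(4)
K, n_draws = 6, 200_000

# A symmetric idempotent matrix: projection onto a 3-dimensional subspace.
Z = rng.standard_normal((K, 3))
A = Z @ np.linalg.inv(Z.T @ Z) @ Z.T
r = int(round(np.trace(A)))      # degrees of freedom, here 3

# Q = X^T A X for standard normal X, computed for every draw at once.
X = rng.standard_normal((n_draws, K))
Q = np.einsum('ij,jk,ik->i', X, A, X)

# A Chi-square(r) variable has mean r and variance 2r.
print(Q.mean(), Q.var())
```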

The proposition above can be used to derive the following extremely useful proposition.

Proposition Let $X$ be a $K \times 1$ multivariate normal random vector with mean $\mu$ and invertible covariance matrix $V$. Define $$Q = (X - \mu)^\top V^{-1} (X - \mu).$$ Then $Q$ has a Chi-square distribution with $K$ degrees of freedom.

Proof

Since $V$ is invertible, there exists an invertible matrix $\Sigma$ such that $$V = \Sigma \Sigma^\top.$$ Therefore, we have $$Q = (X - \mu)^\top (\Sigma \Sigma^\top)^{-1} (X - \mu) = (X - \mu)^\top (\Sigma^\top)^{-1} \Sigma^{-1} (X - \mu) = Z^\top Z,$$ where we have defined $$Z = \Sigma^{-1} (X - \mu).$$ Being a linear transformation of a multivariate normal random vector, the vector $Z$ has a multivariate normal distribution. Its mean is $$\mathrm{E}[Z] = \Sigma^{-1} \left( \mathrm{E}[X] - \mu \right) = \Sigma^{-1} (\mu - \mu) = 0$$ and its covariance matrix is $$\mathrm{Var}[Z] = \Sigma^{-1}\,\mathrm{Var}[X]\,(\Sigma^{-1})^\top = \Sigma^{-1} \Sigma \Sigma^\top (\Sigma^\top)^{-1} = I.$$ Thus, $Z$ has a standard multivariate normal distribution (mean $0$ and variance $I$) and $$Q = Z^\top Z = Z^\top I Z$$ is a quadratic form in a standard normal random vector. Since the identity matrix is symmetric and idempotent and its trace is $K$, $Q$ has a Chi-square distribution with $K$ degrees of freedom.
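A Monte Carlo sketch of this result (the mean vector and the factor $\Sigma$ below are arbitrary illustrative choices; draws from $N(\mu, V)$ are generated as $\mu + \Sigma Z$ with $Z$ standard normal):

```python
import numpy as np

rng = np.random.default_rng(5)
K, n_draws = 3, 200_000

# An arbitrary mean vector and invertible covariance matrix V = S S^T.
mu = np.array([1.0, -2.0, 0.5])
S = np.array([[2.0, 0.0, 0.0],
              [0.5, 1.0, 0.0],
              [0.3, 0.2, 1.5]])
V = S @ S.T

# Draw X ~ N(mu, V) via X = mu + S Z with Z standard normal.
Z = rng.standard_normal((n_draws, K))
X = mu + Z @ S.T

# Q = (X - mu)^T V^{-1} (X - mu) should be Chi-square with K degrees of freedom.
V_inv = np.linalg.inv(V)
D = X - mu
Q = np.einsum('ij,jk,ik->i', D, V_inv, D)
print(Q.mean())   # close to K, the mean of a Chi-square(K)
```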

## Independence of quadratic forms in standard multivariate normal random vectors

We start this section with a proposition on the independence between linear transformations.

Proposition Let $X$ be a $K \times 1$ standard multivariate normal random vector, i.e., $X \sim N(0, I)$. Let $A$ be an $M \times K$ matrix and $B$ an $L \times K$ matrix. Define $$Y = A X, \quad Z = B X.$$ Then $Y$ and $Z$ are two independent random vectors if and only if $A B^\top = 0$.

Proof

First of all, note that $Y$ and $Z$ are linear transformations of the same multivariate normal random vector $X$. Therefore, they are jointly normal (see the lecture entitled Linear combinations of normal random variables). Their cross-covariance is $$\mathrm{Cov}[Y, Z] = A\,\mathrm{Var}[X]\,B^\top = A I B^\top = A B^\top.$$ But, as we explained in the lecture entitled Multivariate normal distribution - Partitioned vectors, two jointly normal random vectors are independent if and only if their cross-covariance is equal to $0$. In our case, the cross-covariance is equal to zero if and only if $A B^\top = 0$, which proves the proposition.

The following proposition gives a necessary and sufficient condition for the independence of two quadratic forms in the same standard multivariate normal random vector.

Proposition Let $X$ be a $K \times 1$ standard multivariate normal random vector, i.e., $X \sim N(0, I)$. Let $A$ and $B$ be two $K \times K$ symmetric and idempotent matrices. Define $$Q_1 = X^\top A X, \quad Q_2 = X^\top B X.$$ Then $Q_1$ and $Q_2$ are two independent random variables if and only if $A B = 0$.

Proof

Since $A$ and $B$ are symmetric and idempotent, we can write $$Q_1 = X^\top A^\top A X = (A X)^\top (A X), \quad Q_2 = X^\top B^\top B X = (B X)^\top (B X),$$ from which it is apparent that $Q_1$ and $Q_2$ are independent as long as $A X$ and $B X$ are independent. But, by the above proposition on the independence between linear transformations of jointly normal random vectors, $A X$ and $B X$ are independent if and only if $A B^\top = 0$. Since $B$ is symmetric, this is the same as $A B = 0$.
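A natural pair of matrices satisfying the condition $A B = 0$ is a projection and its complement. A Monte Carlo sketch (the projection is an illustrative choice; independence is only spot-checked here through the sample correlation of the two quadratic forms):

```python
import numpy as np

rng = np.random.default_rng(6)
K, n_draws = 5, 200_000

# Two symmetric idempotent matrices with A B = 0: a projection and its complement.
Z = rng.standard_normal((K, 2))
A = Z @ np.linalg.inv(Z.T @ Z) @ Z.T
B = np.eye(K) - A
assert np.allclose(A @ B, 0)

X = rng.standard_normal((n_draws, K))
Q1 = np.einsum('ij,jk,ik->i', X, A, X)
Q2 = np.einsum('ij,jk,ik->i', X, B, X)

# Independence implies zero correlation between Q1 and Q2.
print(np.corrcoef(Q1, Q2)[0, 1])
```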

The following proposition gives a necessary and sufficient condition for the independence between a quadratic form and a linear transformation involving the same standard multivariate normal random vector.

Proposition Let $X$ be a $K \times 1$ standard multivariate normal random vector, i.e., $X \sim N(0, I)$. Let $b$ be a $K \times 1$ vector and $A$ a $K \times K$ symmetric and idempotent matrix. Define $$Y = b^\top X, \quad Q = X^\top A X.$$ Then $Y$ and $Q$ are independent if and only if $b^\top A = 0$.

Proof

Since $A$ is symmetric and idempotent, we can write $$Q = X^\top A^\top A X = (A X)^\top (A X),$$ from which it is apparent that $Y$ and $Q$ are independent as long as $b^\top X$ and $A X$ are independent. But, by the above proposition on the independence between linear transformations of jointly normal random vectors, $b^\top X$ and $A X$ are independent if and only if $b^\top A^\top = 0$. Since $A$ is symmetric, this is the same as $b^\top A = 0$.

## Examples

We discuss here some quadratic forms that are commonly found in statistics.

### Sample variance as a quadratic form

Let $X_1$, ..., $X_n$ be independent random variables, all having a normal distribution with mean $\mu$ and variance $\sigma^2$. Let their sample mean be defined as $$\bar{X}_n = \frac{1}{n} \sum_{i=1}^{n} X_i$$

and their adjusted sample variance be defined as $$s_n^2 = \frac{1}{n-1} \sum_{i=1}^{n} \left( X_i - \bar{X}_n \right)^2.$$

Define the following matrix: $$M = I - \frac{1}{n} \imath \imath^\top,$$ where $I$ is the $n$-dimensional identity matrix and $\imath$ is an $n \times 1$ vector of ones. In other words, $M$ has the following structure: $$M = \begin{bmatrix} 1 - \tfrac{1}{n} & -\tfrac{1}{n} & \cdots & -\tfrac{1}{n} \\ -\tfrac{1}{n} & 1 - \tfrac{1}{n} & \cdots & -\tfrac{1}{n} \\ \vdots & \vdots & \ddots & \vdots \\ -\tfrac{1}{n} & -\tfrac{1}{n} & \cdots & 1 - \tfrac{1}{n} \end{bmatrix}.$$

$M$ is a symmetric matrix. By computing the product $M M$, it can also be easily verified that $M$ is idempotent.

Denote by $X$ the $n \times 1$ random vector whose $i$-th entry is equal to $X_i$, and note that $X$ has a multivariate normal distribution with mean $\mu \imath$ and covariance matrix $\sigma^2 I$ (see the lecture entitled Multivariate normal distribution).

The matrix $M$ can be used to write the sample variance as $$s_n^2 = \frac{1}{n-1} X^\top M X.$$
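The properties of the centering matrix $M$ and this quadratic-form representation of the sample variance are easy to verify numerically. A minimal sketch (the sample values are arbitrary; `np.var(x, ddof=1)` is NumPy's adjusted sample variance):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10

# The centering matrix M = I - (1/n) * ones * ones^T.
ones = np.ones((n, 1))
M = np.eye(n) - ones @ ones.T / n

# M is symmetric, idempotent, and annihilates the vector of ones.
assert np.allclose(M, M.T)
assert np.allclose(M @ M, M)
assert np.allclose(M @ ones, 0)

# The quadratic form X^T M X / (n - 1) equals the adjusted sample variance.
x = rng.normal(loc=2.0, scale=3.0, size=n)
assert np.isclose(x @ M @ x / (n - 1), np.var(x, ddof=1))
```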

Now define a new random vector $$Z = \frac{1}{\sigma} \left( X - \mu \imath \right)$$ and note that $Z$ has a standard (mean $0$ and covariance $I$) multivariate normal distribution (see the lecture entitled Linear combinations of normal random variables).

The sample variance can be written as $$s_n^2 = \frac{1}{n-1} X^\top M X = \frac{1}{n-1} (\sigma Z + \mu \imath)^\top M (\sigma Z + \mu \imath) = \frac{1}{n-1} \left( \sigma^2 Z^\top M Z + \sigma \mu Z^\top M \imath + \sigma \mu \imath^\top M Z + \mu^2 \imath^\top M \imath \right).$$

The last three terms in the sum are equal to zero because $$M \imath = 0,$$ which can be verified by directly performing the multiplication of $M$ and $\imath$.

Therefore, the sample variance $$s_n^2 = \frac{\sigma^2}{n-1} Z^\top M Z$$ is proportional to a quadratic form in a standard normal random vector ($Z^\top M Z$), and the quadratic form is obtained from a symmetric and idempotent matrix ($M$). Thanks to the propositions above, we know that the quadratic form $Z^\top M Z$ has a Chi-square distribution with $r$ degrees of freedom, where $r$ is the trace of $M$. But the trace of $M$ is $$\mathrm{tr}(M) = \sum_{i=1}^{n} \left( 1 - \frac{1}{n} \right) = n - 1.$$

So, the quadratic form $Z^\top M Z$ has a Chi-square distribution with $n-1$ degrees of freedom. Multiplying a Chi-square random variable with $n-1$ degrees of freedom by $\frac{\sigma^2}{n-1}$, one obtains a Gamma random variable with parameters $n-1$ and $\sigma^2$ (see the lecture entitled Gamma distribution for more details).

So, summing up, the adjusted sample variance $s_n^2$ has a Gamma distribution with parameters $n-1$ and $\sigma^2$.

Furthermore, the adjusted sample variance $s_n^2$ is independent of the sample mean $\bar{X}_n$, which is proved as follows. The sample mean can be written as $$\bar{X}_n = \frac{1}{n} \imath^\top X = \frac{1}{n} \imath^\top (\sigma Z + \mu \imath) = \frac{\sigma}{n} \imath^\top Z + \mu$$ and the sample variance can be written as $$s_n^2 = \frac{\sigma^2}{n-1} Z^\top M Z.$$ If we use the above proposition (independence between a linear transformation and a quadratic form), verifying the independence of $\bar{X}_n$ and $s_n^2$ boils down to verifying that $$\imath^\top M = 0,$$ which can be easily checked by directly performing the multiplication of $\imath^\top$ and $M$.
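The independence of the sample mean and the adjusted sample variance can also be spot-checked by simulation. A sketch (many independent samples of size $n$ are drawn, and the sample correlation between the two statistics across samples should be close to zero; correlation is of course only a necessary symptom of independence):

```python
import numpy as np

rng = np.random.default_rng(8)
n, n_draws = 5, 200_000

# Many samples of size n from a normal distribution with mean 1 and variance 4.
X = rng.normal(loc=1.0, scale=2.0, size=(n_draws, n))
sample_mean = X.mean(axis=1)
sample_var = X.var(axis=1, ddof=1)

# Independence implies zero correlation between the sample mean and variance.
print(np.corrcoef(sample_mean, sample_var)[0, 1])
```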