# Commutation matrix

The commutation matrix (or vec-permutation matrix) is used to transform the vectorization of a matrix into the vectorization of its transpose and to commute the factors of a Kronecker product.

## Definition

Definition An $mn\times mn$ matrix $K_{m,n}$ is a commutation matrix if and only if
$$K_{m,n}\operatorname{vec}(A)=\operatorname{vec}\!\left(A^{\top}\right)$$
for any $m\times n$ matrix $A$.

A commutation matrix is also called a vec-permutation matrix because, as we will demonstrate, it is a permutation matrix.

## Example

As an example, let us consider the $2\times 2$ matrix
$$A=\begin{bmatrix} A_{11} & A_{12}\\ A_{21} & A_{22}\end{bmatrix}$$

The two vectorizations are
$$\operatorname{vec}(A)=\begin{bmatrix} A_{11}\\ A_{21}\\ A_{12}\\ A_{22}\end{bmatrix},\qquad \operatorname{vec}\!\left(A^{\top}\right)=\begin{bmatrix} A_{11}\\ A_{12}\\ A_{21}\\ A_{22}\end{bmatrix}$$

The commutation matrix is
$$K_{2,2}=\begin{bmatrix} 1 & 0 & 0 & 0\\ 0 & 0 & 1 & 0\\ 0 & 1 & 0 & 0\\ 0 & 0 & 0 & 1\end{bmatrix}$$

By carrying out the matrix multiplication, you can check that
$$K_{2,2}\operatorname{vec}(A)=\begin{bmatrix} A_{11}\\ A_{12}\\ A_{21}\\ A_{22}\end{bmatrix}=\operatorname{vec}\!\left(A^{\top}\right)$$
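The check above can also be carried out numerically. Here is a minimal sketch with NumPy, using an arbitrary choice of entries for $A$ (the helper `vec` is ours; it simply stacks the columns):

```python
import numpy as np

def vec(M):
    """Vectorization: stack the columns of M into a single vector."""
    return M.reshape(-1, order="F")  # column-major (Fortran) order

# An arbitrary 2x2 matrix.
A = np.array([[1, 2],
              [3, 4]])

# The commutation matrix K_{2,2} from the example.
K = np.array([[1, 0, 0, 0],
              [0, 0, 1, 0],
              [0, 1, 0, 0],
              [0, 0, 0, 1]])

# K transforms vec(A) into vec(A').
assert np.array_equal(K @ vec(A), vec(A.T))
```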

## Existence

A commutation matrix $K_{m,n}$ always exists, for any $m$ and $n$.

To prove its existence, note that $\operatorname{vec}(A)$ and $\operatorname{vec}\!\left(A^{\top}\right)$ have the same dimension $mn\times 1$ and contain the same entries, arranged in different orders.

In other words, $\operatorname{vec}\!\left(A^{\top}\right)$ is obtained by permuting the rows of $\operatorname{vec}(A)$.

But we know that row permutations can be performed by pre-multiplying by a permutation matrix.

Thus, the commutation matrix $K_{m,n}$ is a permutation matrix obtained by performing on the identity matrix the same row interchanges that transform $\operatorname{vec}(A)$ into $\operatorname{vec}\!\left(A^{\top}\right)$.
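This construction can be sketched in NumPy: the row permutation that turns $\operatorname{vec}(A)$ into $\operatorname{vec}(A^{\top})$ is applied to the rows of the identity matrix (the function name `commutation_matrix` is our own):

```python
import numpy as np

def commutation_matrix(m, n):
    """Build K_{m,n} by applying to the identity matrix the row
    permutation that maps vec(A) to vec(A')."""
    # perm[r] = position, inside vec(A), of the r-th entry of vec(A').
    perm = np.arange(m * n).reshape(n, m).T.reshape(-1)
    return np.eye(m * n, dtype=int)[perm]

# Sanity check on a random 3x4 matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
K = commutation_matrix(3, 4)
vec = lambda M: M.reshape(-1, order="F")
assert np.array_equal(K @ vec(A), vec(A.T))
```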

## Permutation matrix

The following properties of permutation matrices, which have been proved previously, apply:

1. each row of $K_{m,n}$ has one entry equal to $1$ and all the other entries equal to $0$;

2. each column of $K_{m,n}$ has one entry equal to $1$ and all the other entries equal to $0$;

3. the rows of $K_{m,n}$ form the standard basis of the space of $1\times mn$ vectors;

4. the columns of $K_{m,n}$ form the standard basis of the space of $mn\times 1$ vectors;

5. $K_{m,n}$ is full-rank;

6. $K_{m,n}$ is orthogonal (i.e., $K_{m,n}^{\top}K_{m,n}=K_{m,n}K_{m,n}^{\top}=I$).

## Orthogonality

By property 6 above (orthogonality) we have
$$K_{m,n}^{\top}K_{m,n}=K_{m,n}K_{m,n}^{\top}=I$$
which implies that
$$K_{m,n}^{-1}=K_{m,n}^{\top}$$
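As a quick numerical illustration (the builder below is a sketch of our own that applies the row permutation described in the existence argument to the identity matrix):

```python
import numpy as np

def commutation_matrix(m, n):
    # Row permutation that maps vec(A) to vec(A'), applied to the identity.
    perm = np.arange(m * n).reshape(n, m).T.reshape(-1)
    return np.eye(m * n, dtype=int)[perm]

K = commutation_matrix(3, 4)
I = np.eye(12, dtype=int)

# Orthogonality: the inverse of K equals its transpose.
assert np.array_equal(K.T @ K, I)
assert np.array_equal(K @ K.T, I)
```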

## Relation with the identity matrix

Let us provide a more precise characterization of the relation between the commutation matrix $K_{m,n}$ and the identity matrix $I$.

Note that the $((j-1)m+i)$-th entry of $\operatorname{vec}(A)$ is equal to:

• the entry of $A$ in position $(i,j)$;

• the $((i-1)n+j)$-th entry of $\operatorname{vec}\!\left(A^{\top}\right)$;

• the entry of $A^{\top}$ in position $(j,i)$.

Therefore:

1. row number $(i-1)n+j$ of $K_{m,n}$ has a $1$ in position $(j-1)m+i$ and $0$s elsewhere;

2. column number $(j-1)m+i$ of $K_{m,n}$ has a $1$ in position $(i-1)n+j$ and $0$s elsewhere.

In other words:

1. row number $(i-1)n+j$ of $K_{m,n}$ is equal to row number $(j-1)m+i$ of $I$;

2. column number $(j-1)m+i$ of $K_{m,n}$ is equal to column number $(i-1)n+j$ of $I$.
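The index rule just described translates directly into code; this sketch fills in the $1$s one row at a time (indices are shifted by one because NumPy is zero-based):

```python
import numpy as np

def commutation_matrix(m, n):
    """Build K_{m,n} entry by entry: row (i-1)n + j has its 1
    in position (j-1)m + i (one-based indices)."""
    K = np.zeros((m * n, m * n), dtype=int)
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            K[(i - 1) * n + j - 1, (j - 1) * m + i - 1] = 1
    return K

# Check the defining property on a random 4x3 matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
K = commutation_matrix(4, 3)
vec = lambda M: M.reshape(-1, order="F")
assert np.array_equal(K @ vec(A), vec(A.T))
```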

## Useful matrices and vectors

In order to prove some results about commutation matrices, we will use:

• $m\times n$ matrices, denoted by $E_{ij}$, that have all entries equal to zero, except the $(i,j)$-th, which is equal to $1$;

• $m\times 1$ vectors, denoted by $e_{i}$, that have all entries equal to zero, except the $i$-th, which is equal to $1$;

• $n\times 1$ vectors, denoted by $f_{j}$, that have all entries equal to zero, except the $j$-th, which is equal to $1$.

These matrices are such that
$$E_{ij}=e_{i}f_{j}^{\top}$$

## Explicit formula

We can now provide an explicit formula for the commutation matrix.

Proposition A commutation matrix satisfies
$$K_{m,n}=\sum_{i=1}^{m}\sum_{j=1}^{n}\left(E_{ij}\otimes E_{ij}^{\top}\right)$$
where $\otimes$ denotes the Kronecker product.

Proof

Let $A$ be any $m\times n$ matrix. Then,
$$A^{\top}=\sum_{i=1}^{m}\sum_{j=1}^{n}A_{ij}E_{ij}^{\top}=\sum_{i=1}^{m}\sum_{j=1}^{n}E_{ij}^{\top}AE_{ij}^{\top}$$
where the second equality holds because $E_{ij}^{\top}AE_{ij}^{\top}=A_{ij}E_{ij}^{\top}$. By taking the vectorization of both sides, we obtain
$$\operatorname{vec}\!\left(A^{\top}\right)\overset{(A)}{=}\sum_{i=1}^{m}\sum_{j=1}^{n}\operatorname{vec}\!\left(E_{ij}^{\top}AE_{ij}^{\top}\right)\overset{(B)}{=}\sum_{i=1}^{m}\sum_{j=1}^{n}\left(E_{ij}\otimes E_{ij}^{\top}\right)\operatorname{vec}(A)$$
where in steps $(A)$ and $(B)$ we have used two properties of the vec operator: its linearity and the fact that $\operatorname{vec}(BXC)=\left(C^{\top}\otimes B\right)\operatorname{vec}(X)$. Since this holds for any $m\times n$ matrix $A$, the stated formula follows.
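A numerical sketch of the explicit formula: build $K_{m,n}$ as the double sum of Kronecker products and check it against the defining property (the helper `commutation_matrix` is ours):

```python
import numpy as np

def commutation_matrix(m, n):
    """Explicit formula: K_{m,n} = sum_ij (E_ij kron E_ij')."""
    K = np.zeros((m * n, m * n), dtype=int)
    for i in range(m):
        for j in range(n):
            E = np.zeros((m, n), dtype=int)
            E[i, j] = 1  # E_ij has a single 1 in position (i, j)
            K += np.kron(E, E.T)
    return K

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 5))
K = commutation_matrix(2, 5)
vec = lambda M: M.reshape(-1, order="F")
assert np.array_equal(K @ vec(A), vec(A.T))
```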

## Characterization as a block matrix

From the explicit formula above, we can see that the commutation matrix $K_{m,n}$ is a block matrix having $m$ rows and $n$ columns of blocks.

Each block has dimension $n\times m$, and the $(i,j)$-th block is equal to $E_{ij}^{\top}$.

Example If $m=2$ and $n=3$, then
$$K_{2,3}=\begin{bmatrix} E_{11}^{\top} & E_{12}^{\top} & E_{13}^{\top}\\ E_{21}^{\top} & E_{22}^{\top} & E_{23}^{\top}\end{bmatrix}=\begin{bmatrix} 1 & 0 & 0 & 0 & 0 & 0\\ 0 & 0 & 1 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & 1 & 0\\ 0 & 1 & 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 1 & 0 & 0\\ 0 & 0 & 0 & 0 & 0 & 1\end{bmatrix}$$

## Other explicit formulae

The next proposition provides two other explicit formulae.

Proposition A commutation matrix satisfies
$$K_{m,n}=\sum_{j=1}^{n}\left(f_{j}^{\top}\otimes I_{m}\otimes f_{j}\right)=\sum_{i=1}^{m}\left(e_{i}\otimes I_{n}\otimes e_{i}^{\top}\right)$$
where $I_{m}$ denotes the $m\times m$ identity matrix and $I_{n}$ denotes the $n\times n$ identity matrix.

Proof

We have proved above that
$$K_{m,n}=\sum_{i=1}^{m}\sum_{j=1}^{n}\left(E_{ij}\otimes E_{ij}^{\top}\right)=\sum_{i=1}^{m}\sum_{j=1}^{n}\left(e_{i}f_{j}^{\top}\otimes f_{j}e_{i}^{\top}\right)$$
Since the Kronecker product is associative and distributive, and the product of a column by a row is the same as their Kronecker product (in any order), we have:
$$K_{m,n}=\sum_{j=1}^{n}\left(f_{j}^{\top}\otimes\left(\sum_{i=1}^{m}e_{i}e_{i}^{\top}\right)\otimes f_{j}\right)=\sum_{j=1}^{n}\left(f_{j}^{\top}\otimes I_{m}\otimes f_{j}\right)$$
Similarly,
$$K_{m,n}=\sum_{i=1}^{m}\left(e_{i}\otimes\left(\sum_{j=1}^{n}f_{j}f_{j}^{\top}\right)\otimes e_{i}^{\top}\right)=\sum_{i=1}^{m}\left(e_{i}\otimes I_{n}\otimes e_{i}^{\top}\right)$$
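Both formulae can be verified numerically; this sketch builds $K_{m,n}$ from the first of them and compares it with the defining property (here `f` plays the role of the vectors $f_j$):

```python
import numpy as np

m, n = 3, 4

# K_{m,n} = sum_j (f_j' kron I_m kron f_j), with f_j the j-th
# standard basis vector of length n.
K = sum(
    np.kron(np.kron(f.reshape(1, -1), np.eye(m, dtype=int)), f.reshape(-1, 1))
    for f in np.eye(n, dtype=int)
)

rng = np.random.default_rng(0)
A = rng.standard_normal((m, n))
vec = lambda M: M.reshape(-1, order="F")
assert np.array_equal(K @ vec(A), vec(A.T))
```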

## Special cases

Here are two special cases in which the commutation matrix has a simple form.

Proposition When $m=1$ or $n=1$, the commutation matrix is equal to the identity matrix:
$$K_{1,n}=I_{n},\qquad K_{m,1}=I_{m}$$

Proof

For any $n\times 1$ vector $a$, we have
$$K_{1,n}\,a=K_{1,n}\operatorname{vec}\!\left(a^{\top}\right)=\operatorname{vec}(a)=a$$
which implies $K_{1,n}=I_{n}$. By the same token, for any $m\times 1$ vector $b$, we have
$$K_{m,1}\,b=K_{m,1}\operatorname{vec}(b)=\operatorname{vec}\!\left(b^{\top}\right)=b$$
which implies $K_{m,1}=I_{m}$.
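A quick numerical confirmation, using a zero-based version of the entry-by-entry index rule (the helper `commutation_matrix` is ours):

```python
import numpy as np

def commutation_matrix(m, n):
    K = np.zeros((m * n, m * n), dtype=int)
    for i in range(m):
        for j in range(n):
            K[i * n + j, j * m + i] = 1
    return K

# When one of the two indices is 1, K reduces to the identity matrix.
assert np.array_equal(commutation_matrix(1, 5), np.eye(5, dtype=int))
assert np.array_equal(commutation_matrix(7, 1), np.eye(7, dtype=int))
```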

## Commutation properties

The commutation matrix takes its name from the fact that it can be used to commute the factors of a Kronecker product.

Proposition Let $A$ be an $m\times n$ matrix and $B$ a $p\times q$ matrix. Then,
$$\left(A\otimes B\right)K_{n,q}=K_{m,p}\left(B\otimes A\right)$$

Proof

Take any $n\times q$ matrix $C$. Then,
$$\left(A\otimes B\right)K_{n,q}\operatorname{vec}(C)=\left(A\otimes B\right)\operatorname{vec}\!\left(C^{\top}\right)\overset{(A)}{=}\operatorname{vec}\!\left(BC^{\top}A^{\top}\right)=K_{m,p}\operatorname{vec}\!\left(ACB^{\top}\right)\overset{(B)}{=}K_{m,p}\left(B\otimes A\right)\operatorname{vec}(C)$$
where in steps $(A)$ and $(B)$ we have used a property of the vec operator, namely $\operatorname{vec}(BXC)=\left(C^{\top}\otimes B\right)\operatorname{vec}(X)$. Hence,
$$\left(A\otimes B\right)K_{n,q}\operatorname{vec}(C)=K_{m,p}\left(B\otimes A\right)\operatorname{vec}(C)$$
for any $n\times q$ matrix $C$, which implies that
$$\left(A\otimes B\right)K_{n,q}=K_{m,p}\left(B\otimes A\right)$$
because, as $C$ varies, $\operatorname{vec}(C)$ spans the whole space of $nq\times 1$ vectors.

Proposition (general) Let $A$ be an $m\times n$ matrix and $B$ a $p\times q$ matrix. Then,
$$B\otimes A=K_{p,m}\left(A\otimes B\right)K_{n,q}$$

Proof

We have demonstrated above that
$$\left(A\otimes B\right)K_{n,q}=K_{m,p}\left(B\otimes A\right)$$
Now pre-multiply both sides of the equation by $K_{p,m}$, and use the fact that $K_{p,m}K_{m,p}=I$ (the transpose, and hence the inverse, of $K_{m,p}$ is $K_{p,m}$), to obtain the desired result.
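The general property lends itself to a direct numerical check on random matrices (the helper `commutation_matrix` is ours):

```python
import numpy as np

def commutation_matrix(m, n):
    K = np.zeros((m * n, m * n), dtype=int)
    for i in range(m):
        for j in range(n):
            K[i * n + j, j * m + i] = 1
    return K

rng = np.random.default_rng(0)
m, n, p, q = 2, 3, 4, 5
A = rng.standard_normal((m, n))
B = rng.standard_normal((p, q))

# B kron A = K_{p,m} (A kron B) K_{n,q}
lhs = np.kron(B, A)
rhs = commutation_matrix(p, m) @ np.kron(A, B) @ commutation_matrix(n, q)
assert np.allclose(lhs, rhs)
```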

Proposition Let $a$ be an $m\times 1$ vector and $b$ a $p\times 1$ vector. Then,
$$b\otimes a=K_{p,m}\left(a\otimes b\right)$$

Proof

By the previous proposition, we have
$$b\otimes a=K_{p,m}\left(a\otimes b\right)K_{1,1}$$
But $K_{1,1}=1$.

Proposition Let $A$ be an $m\times n$ matrix and $b$ a $p\times 1$ vector. Then,
$$b\otimes A=K_{p,m}\left(A\otimes b\right),\qquad A\otimes b=K_{m,p}\left(b\otimes A\right)$$

Proof

These are all special cases of the general proposition above
$$B\otimes A=K_{p,m}\left(A\otimes B\right)K_{n,q}$$
in which one of the two commutation matrices is equal to the identity matrix (because one of its indices is equal to $1$).

## Other properties

Commutation matrices enjoy several other useful properties that have not been presented in this lecture.

For more details, consult Abadir and Magnus (2005), Harville (2008), and Magnus and Neudecker (2019).

## Solved exercises

Below you can find some exercises with explained solutions.

### Exercise 1

Explicitly write the commutation matrix $K_{3,2}$.

Solution

We use the characterization as a block matrix: $K_{3,2}$ has $3$ rows and $2$ columns of blocks, and its $(i,j)$-th block is the $2\times 3$ matrix $E_{ij}^{\top}$, where $E_{ij}$ is the $3\times 2$ matrix whose only non-zero entry is a $1$ in position $(i,j)$. Therefore,
$$K_{3,2}=\begin{bmatrix} E_{11}^{\top} & E_{12}^{\top}\\ E_{21}^{\top} & E_{22}^{\top}\\ E_{31}^{\top} & E_{32}^{\top}\end{bmatrix}=\begin{bmatrix} 1 & 0 & 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 1 & 0 & 0\\ 0 & 1 & 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & 1 & 0\\ 0 & 0 & 1 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & 0 & 1\end{bmatrix}$$

### Exercise 2

Prove that, when the two indices coincide ($m=n$), the trace of the commutation matrix is
$$\operatorname{tr}\left(K_{n,n}\right)=n$$

Solution

The explicit formula for the vec-permutation matrix becomes
$$K_{n,n}=\sum_{i=1}^{n}\sum_{j=1}^{n}\left(E_{ij}\otimes E_{ij}^{\top}\right)$$
Since the trace is a linear operator and the trace of a Kronecker product equals the product of the traces, we have
$$\operatorname{tr}\left(K_{n,n}\right)=\sum_{i=1}^{n}\sum_{j=1}^{n}\operatorname{tr}\left(E_{ij}\right)\operatorname{tr}\!\left(E_{ij}^{\top}\right)$$
The matrices $E_{ij}$ and $E_{ij}^{\top}$ have a non-zero diagonal entry (which is unique and equal to $1$) only when $i=j$. Therefore,
$$\operatorname{tr}\left(K_{n,n}\right)=\sum_{i=1}^{n}\operatorname{tr}\left(E_{ii}\right)\operatorname{tr}\!\left(E_{ii}^{\top}\right)=\sum_{i=1}^{n}1=n$$
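The result is easy to confirm numerically for a few values of $n$ (the helper `commutation_matrix` is ours):

```python
import numpy as np

def commutation_matrix(m, n):
    K = np.zeros((m * n, m * n), dtype=int)
    for i in range(m):
        for j in range(n):
            K[i * n + j, j * m + i] = 1
    return K

# tr(K_{n,n}) = n for every n.
traces = [int(np.trace(commutation_matrix(n, n))) for n in range(1, 7)]
assert traces == [1, 2, 3, 4, 5, 6]
```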

## References

Abadir, K. M., and Magnus, J. R. (2005) Matrix Algebra, Cambridge University Press.

Harville, D. A. (2008) Matrix Algebra From a Statistician's Perspective, Springer.

Magnus, J. R., and Neudecker, H. (2019) Matrix Differential Calculus with Applications in Statistics and Econometrics, Wiley.