The commutation matrix (or vec-permutation matrix) is used to transform the vectorization of a matrix into the vectorization of its transpose and to commute the factors of a Kronecker product.
We start with a definition.
Definition An $mn\times mn$ matrix $K_{mn}$ is a commutation matrix if and only if
$$K_{mn}\operatorname{vec}(A)=\operatorname{vec}\!\left(A^{\top}\right)$$
for any $m\times n$ matrix $A$.
A commutation matrix is also called a vec-permutation matrix because, as we will demonstrate, it is a permutation matrix.
As an example, let us consider the $2\times 2$ matrix
$$A=\begin{bmatrix}A_{11}&A_{12}\\A_{21}&A_{22}\end{bmatrix}$$
The two vectorizations are
$$\operatorname{vec}(A)=\begin{bmatrix}A_{11}\\A_{21}\\A_{12}\\A_{22}\end{bmatrix}\qquad\text{and}\qquad\operatorname{vec}\!\left(A^{\top}\right)=\begin{bmatrix}A_{11}\\A_{12}\\A_{21}\\A_{22}\end{bmatrix}$$
The commutation matrix is
$$K_{22}=\begin{bmatrix}1&0&0&0\\0&0&1&0\\0&1&0&0\\0&0&0&1\end{bmatrix}$$
By carrying out the matrix multiplication, you can check that
$$K_{22}\operatorname{vec}(A)=\operatorname{vec}\!\left(A^{\top}\right)$$
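For readers who want to verify this numerically, here is a minimal NumPy sketch; the helper vec (defined as column-stacking) and the numeric entries of A are illustrative choices, not notation used in the rest of this lecture.

import numpy as np

def vec(M):
    # column-stacking vectorization (vec operator)
    return M.reshape(-1, 1, order="F")

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # numeric stand-in for the generic 2x2 matrix above

K22 = np.array([[1, 0, 0, 0],
                [0, 0, 1, 0],
                [0, 1, 0, 0],
                [0, 0, 0, 1]], dtype=float)

print(np.allclose(K22 @ vec(A), vec(A.T)))   # True: K vec(A) = vec(A')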
A commutation matrix $K_{mn}$ always exists, for any $m$ and $n$.
To prove its existence, note that $\operatorname{vec}(A)$ and $\operatorname{vec}\!\left(A^{\top}\right)$ have the same dimension ($mn\times 1$) and contain the same entries, arranged in different orders.
In other words, $\operatorname{vec}\!\left(A^{\top}\right)$ is obtained by permuting the rows of $\operatorname{vec}(A)$.
But we know that row permutations can be performed by pre-multiplying by a permutation matrix.
Thus, the commutation matrix $K_{mn}$ is a permutation matrix obtained by performing on the identity matrix the same row interchanges that transform $\operatorname{vec}(A)$ into $\operatorname{vec}\!\left(A^{\top}\right)$.
The following properties of permutation matrices, which have been proved previously, apply:
each row of $K_{mn}$ has one entry equal to $1$ and all the other entries equal to $0$;
each column of $K_{mn}$ has one entry equal to $1$ and all the other entries equal to $0$;
the rows of $K_{mn}$ form the standard basis of the space of $1\times mn$ vectors;
the columns of $K_{mn}$ form the standard basis of the space of $mn\times 1$ vectors;
$K_{mn}$ is full-rank;
$K_{mn}$ is orthogonal (i.e., $K_{mn}^{\top}=K_{mn}^{-1}$).
By property 6 above (orthogonality) we have
$$K_{mn}^{\top}K_{mn}=I$$
which implies that
$$K_{mn}^{-1}=K_{mn}^{\top}$$
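A minimal NumPy check of orthogonality and of the inverse-equals-transpose relation for the $2\times 2$ commutation matrix written out above:

import numpy as np

K22 = np.array([[1, 0, 0, 0],
                [0, 0, 1, 0],
                [0, 1, 0, 0],
                [0, 0, 0, 1]], dtype=float)

print(np.allclose(K22.T @ K22, np.eye(4)))      # K'K = I (orthogonality)
print(np.allclose(np.linalg.inv(K22), K22.T))   # K^{-1} = K'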
Let us provide a more precise characterization of the relation between the commutation matrix $K_{mn}$ and the $mn\times mn$ identity matrix $I$.
Note that the $\left[(j-1)m+i\right]$-th entry of the vectorization $\operatorname{vec}(A)$ of an $m\times n$ matrix $A$ is equal to:
the entry of $A$ in position $(i,j)$;
the $\left[(i-1)n+j\right]$-th entry of $\operatorname{vec}\!\left(A^{\top}\right)$;
the entry of $A^{\top}$ in position $(j,i)$.
Therefore:
row number $(i-1)n+j$ of $K_{mn}$ has a $1$ in position $(j-1)m+i$ and $0$s elsewhere;
column number $(j-1)m+i$ of $K_{mn}$ has a $1$ in position $(i-1)n+j$ and $0$s elsewhere.
In other words:
row number $(i-1)n+j$ of $K_{mn}$ is equal to row number $(j-1)m+i$ of $I$;
column number $(j-1)m+i$ of $K_{mn}$ is equal to column number $(i-1)n+j$ of $I$.
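This characterization translates directly into a construction of $K_{mn}$. The following minimal NumPy sketch (the function name commutation_matrix is just an illustrative choice) places a $1$ in row $(i-1)n+j$ and column $(j-1)m+i$, and then checks the defining property on a random matrix:

import numpy as np

def commutation_matrix(m, n):
    # row (i-1)*n + j of K_mn has a 1 in position (j-1)*m + i (1-based i, j)
    K = np.zeros((m * n, m * n))
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            K[(i - 1) * n + j - 1, (j - 1) * m + i - 1] = 1.0
    return K

rng = np.random.default_rng(0)
m, n = 3, 4
A = rng.normal(size=(m, n))
K = commutation_matrix(m, n)
vecA = A.reshape(-1, 1, order="F")       # column-stacking vectorization
vecAT = A.T.reshape(-1, 1, order="F")
print(np.allclose(K @ vecA, vecAT))      # True: K vec(A) = vec(A')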
In order to prove some results about commutation matrices, we will use:
$m\times n$ matrices, denoted by $E_{ij}$, that have all entries equal to zero, except the $(i,j)$-th, which is equal to $1$;
$m\times 1$ vectors, denoted by $e_{i}$, that have all entries equal to zero, except the $i$-th, which is equal to $1$;
$n\times 1$ vectors, denoted by $f_{j}$, that have all entries equal to zero, except the $j$-th, which is equal to $1$.
These matrices are such that
$$E_{ij}=e_{i}f_{j}^{\top}$$
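A one-line numerical check of the relation $E_{ij}=e_{i}f_{j}^{\top}$ (a minimal sketch with illustrative dimensions):

import numpy as np

m, n, i, j = 3, 2, 2, 1                      # example dimensions and 1-based indices
e_i = np.zeros((m, 1)); e_i[i - 1] = 1.0     # i-th standard basis vector of length m
f_j = np.zeros((n, 1)); f_j[j - 1] = 1.0     # j-th standard basis vector of length n
E_ij = np.zeros((m, n)); E_ij[i - 1, j - 1] = 1.0
print(np.allclose(E_ij, e_i @ f_j.T))        # True: E_ij = e_i f_j'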
We can now provide an explicit formula for the commutation matrix.
Proposition A commutation matrix $K_{mn}$ satisfies
$$K_{mn}=\sum_{i=1}^{m}\sum_{j=1}^{n}\left(E_{ij}\otimes E_{ij}^{\top}\right)$$
where $\otimes$ denotes the Kronecker product.
Proof. Let $A$ be any $m\times n$ matrix. Then,
$$A^{\top}=\sum_{i=1}^{m}\sum_{j=1}^{n}E_{ij}^{\top}AE_{ij}^{\top}$$
because $E_{ij}^{\top}AE_{ij}^{\top}=f_{j}\left(e_{i}^{\top}Af_{j}\right)e_{i}^{\top}=A_{ij}\,f_{j}e_{i}^{\top}$, and summing these matrices over $i$ and $j$ gives $A^{\top}$. By taking the vectorization of both sides, we obtain
$$\operatorname{vec}\!\left(A^{\top}\right)\overset{(A)}{=}\sum_{i=1}^{m}\sum_{j=1}^{n}\operatorname{vec}\!\left(E_{ij}^{\top}AE_{ij}^{\top}\right)\overset{(B)}{=}\left[\sum_{i=1}^{m}\sum_{j=1}^{n}\left(E_{ij}\otimes E_{ij}^{\top}\right)\right]\operatorname{vec}(A)$$
where in steps $(A)$ and $(B)$ we have used two properties of the vec operator: its linearity and the rule $\operatorname{vec}(BCD)=\left(D^{\top}\otimes B\right)\operatorname{vec}(C)$. Since this holds for every $A$, the matrix in square brackets must be the commutation matrix $K_{mn}$.
From the explicit formula above, we can see that the commutation matrix $K_{mn}$ is a block matrix having $m$ rows and $n$ columns of blocks.
Each block has dimension $n\times m$, and the $(i,j)$-th block is equal to $E_{ij}^{\top}$.
Example If $m=2$ and $n=3$, then
$$K_{23}=\begin{bmatrix}E_{11}^{\top}&E_{12}^{\top}&E_{13}^{\top}\\E_{21}^{\top}&E_{22}^{\top}&E_{23}^{\top}\end{bmatrix}=\begin{bmatrix}1&0&0&0&0&0\\0&0&1&0&0&0\\0&0&0&0&1&0\\0&1&0&0&0&0\\0&0&0&1&0&0\\0&0&0&0&0&1\end{bmatrix}$$
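The explicit formula can be checked numerically. In the following minimal sketch, the helper names E and commutation_via_kron are illustrative choices:

import numpy as np

def E(m, n, i, j):
    # m x n matrix with a single 1 in position (i, j) (1-based)
    M = np.zeros((m, n))
    M[i - 1, j - 1] = 1.0
    return M

def commutation_via_kron(m, n):
    # K_mn = sum over i, j of the Kronecker products E_ij kron E_ij'
    return sum(np.kron(E(m, n, i, j), E(m, n, i, j).T)
               for i in range(1, m + 1) for j in range(1, n + 1))

m, n = 2, 3
K = commutation_via_kron(m, n)
A = np.arange(1.0, m * n + 1).reshape(m, n)
print(K.astype(int))                                                   # prints the 6x6 commutation matrix
print(np.allclose(K @ A.flatten(order="F"), A.T.flatten(order="F")))   # True: K vec(A) = vec(A')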
The next proposition provides two other explicit formulae.
Proposition A commutation matrix $K_{mn}$ satisfies
$$K_{mn}=\sum_{i=1}^{m}\left(e_{i}\otimes I_{n}\otimes e_{i}^{\top}\right)=\sum_{j=1}^{n}\left(f_{j}^{\top}\otimes I_{m}\otimes f_{j}\right)$$
where $I_{n}$ denotes the $n\times n$ identity matrix and $I_{m}$ denotes the $m\times m$ identity matrix.
Proof. We have proved above that
$$K_{mn}=\sum_{i=1}^{m}\sum_{j=1}^{n}\left(E_{ij}\otimes E_{ij}^{\top}\right)$$
Since the Kronecker product is associative and distributive, and the product of a column by a row is the same as their Kronecker product (in any order), we have
$$E_{ij}\otimes E_{ij}^{\top}=\left(e_{i}\otimes f_{j}^{\top}\right)\otimes\left(f_{j}\otimes e_{i}^{\top}\right)=e_{i}\otimes\left(f_{j}f_{j}^{\top}\right)\otimes e_{i}^{\top}$$
so that
$$K_{mn}=\sum_{i=1}^{m}e_{i}\otimes\left(\sum_{j=1}^{n}f_{j}f_{j}^{\top}\right)\otimes e_{i}^{\top}=\sum_{i=1}^{m}\left(e_{i}\otimes I_{n}\otimes e_{i}^{\top}\right)$$
Similarly,
$$E_{ij}\otimes E_{ij}^{\top}=\left(f_{j}^{\top}\otimes e_{i}\right)\otimes\left(e_{i}^{\top}\otimes f_{j}\right)=f_{j}^{\top}\otimes\left(e_{i}e_{i}^{\top}\right)\otimes f_{j}$$
so that
$$K_{mn}=\sum_{j=1}^{n}f_{j}^{\top}\otimes\left(\sum_{i=1}^{m}e_{i}e_{i}^{\top}\right)\otimes f_{j}=\sum_{j=1}^{n}\left(f_{j}^{\top}\otimes I_{m}\otimes f_{j}\right)$$
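These two formulae can also be checked numerically. In the sketch below, basis(dim, k) is an illustrative helper returning the k-th standard basis column vector:

import numpy as np

def basis(dim, k):
    # k-th standard basis vector of the given dimension (1-based), as a column
    v = np.zeros((dim, 1))
    v[k - 1] = 1.0
    return v

def K_from_rows(m, n):
    # K_mn = sum_i  e_i kron I_n kron e_i'
    return sum(np.kron(np.kron(basis(m, i), np.eye(n)), basis(m, i).T)
               for i in range(1, m + 1))

def K_from_cols(m, n):
    # K_mn = sum_j  f_j' kron I_m kron f_j
    return sum(np.kron(np.kron(basis(n, j).T, np.eye(m)), basis(n, j))
               for j in range(1, n + 1))

m, n = 3, 2
A = np.random.default_rng(1).normal(size=(m, n))
print(np.allclose(K_from_rows(m, n), K_from_cols(m, n)))                              # the two formulae agree
print(np.allclose(K_from_rows(m, n) @ A.flatten(order="F"), A.T.flatten(order="F")))  # and give K_mn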
Here are two special cases in which the commutation matrix has a simple form.
Proposition When $m=1$ or $n=1$, the commutation matrix is equal to the identity matrix: $K_{1n}=I_{n}$ and $K_{m1}=I_{m}$.
Proof. For any $1\times n$ vector $a$, we have $\operatorname{vec}(a)=a^{\top}=\operatorname{vec}\!\left(a^{\top}\right)$, so that $K_{1n}\operatorname{vec}(a)=\operatorname{vec}(a)$ for every such $a$, which implies $K_{1n}=I_{n}$. By the same token, for any $m\times 1$ vector $b$, we have $\operatorname{vec}(b)=b=\operatorname{vec}\!\left(b^{\top}\right)$, which implies $K_{m1}=I_{m}$.
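A quick numerical confirmation (the helper commutation_matrix is an illustrative compact equivalent of the index-based construction shown earlier):

import numpy as np

def commutation_matrix(m, n):
    # permutes the rows of the identity so that K @ vec(A) = vec(A.T) for m x n A
    return np.eye(m * n)[[j * m + i for i in range(m) for j in range(n)], :]

print(np.allclose(commutation_matrix(1, 5), np.eye(5)))   # K_{1n} = I_n
print(np.allclose(commutation_matrix(5, 1), np.eye(5)))   # K_{m1} = I_m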
The commutation matrix takes its name from the fact that it can be used to commute the factors of a Kronecker product.
Proposition Let $A$ be an $m\times n$ matrix and $B$ a $p\times q$ matrix. Then,
$$K_{pm}\left(A\otimes B\right)=\left(B\otimes A\right)K_{qn}$$
Proof. Take any $q\times n$ matrix $C$. Then,
$$K_{pm}\left(A\otimes B\right)\operatorname{vec}(C)\overset{(A)}{=}K_{pm}\operatorname{vec}\!\left(BCA^{\top}\right)=\operatorname{vec}\!\left(AC^{\top}B^{\top}\right)\overset{(B)}{=}\left(B\otimes A\right)\operatorname{vec}\!\left(C^{\top}\right)=\left(B\otimes A\right)K_{qn}\operatorname{vec}(C)$$
where in steps $(A)$ and $(B)$ we have used the property $\operatorname{vec}(BCD)=\left(D^{\top}\otimes B\right)\operatorname{vec}(C)$ of the vec operator. Hence,
$$K_{pm}\left(A\otimes B\right)\operatorname{vec}(C)=\left(B\otimes A\right)K_{qn}\operatorname{vec}(C)$$
for any $q\times n$ matrix $C$, which implies that
$$K_{pm}\left(A\otimes B\right)=\left(B\otimes A\right)K_{qn}$$
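The proposition can be checked on randomly generated matrices; the sketch below assumes the same illustrative commutation_matrix helper as before:

import numpy as np

def commutation_matrix(m, n):
    # permutes the rows of the identity so that K @ vec(A) = vec(A.T) for m x n A
    return np.eye(m * n)[[j * m + i for i in range(m) for j in range(n)], :]

rng = np.random.default_rng(2)
m, n, p, q = 2, 3, 4, 5
A = rng.normal(size=(m, n))
B = rng.normal(size=(p, q))
lhs = commutation_matrix(p, m) @ np.kron(A, B)
rhs = np.kron(B, A) @ commutation_matrix(q, n)
print(np.allclose(lhs, rhs))   # True: K_pm (A kron B) = (B kron A) K_qn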
Proposition (general) Let $A$ be an $m\times n$ matrix and $B$ a $p\times q$ matrix. Then,
$$A\otimes B=K_{mp}\left(B\otimes A\right)K_{qn}$$
Proof. We have demonstrated above that
$$K_{pm}\left(A\otimes B\right)=\left(B\otimes A\right)K_{qn}$$
Now pre-multiply both sides of the equation by $K_{pm}^{-1}=K_{pm}^{\top}=K_{mp}$ to obtain the desired result (transposing a commutation matrix swaps its two indices, as can be seen from the characterization of its rows and columns given above).
Proposition Let $a$ be an $m\times 1$ vector and $b$ a $p\times 1$ vector. Then,
$$a\otimes b=K_{mp}\left(b\otimes a\right)$$
Proof. By the previous proposition, we have
$$a\otimes b=K_{mp}\left(b\otimes a\right)K_{11}$$
But $K_{11}=1$.
Proposition Let $A$ be an $m\times n$ matrix and $b$ a $p\times 1$ vector. Then,
$$A\otimes b=K_{mp}\left(b\otimes A\right)\qquad\text{and}\qquad b\otimes A=K_{pm}\left(A\otimes b\right)$$
Proof. These are all special cases of the general proposition above
$$A\otimes B=K_{mp}\left(B\otimes A\right)K_{qn}$$
in which one of the two commutation matrices is equal to the identity matrix (because one of the dimensions involved is equal to $1$).
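The special cases can be checked in the same way (again with the illustrative commutation_matrix helper):

import numpy as np

def commutation_matrix(m, n):
    # permutes the rows of the identity so that K @ vec(A) = vec(A.T) for m x n A
    return np.eye(m * n)[[j * m + i for i in range(m) for j in range(n)], :]

rng = np.random.default_rng(3)
m, n, p = 2, 3, 4
a = rng.normal(size=(m, 1))
b = rng.normal(size=(p, 1))
A = rng.normal(size=(m, n))
print(np.allclose(np.kron(a, b), commutation_matrix(m, p) @ np.kron(b, a)))   # a kron b = K_mp (b kron a)
print(np.allclose(np.kron(A, b), commutation_matrix(m, p) @ np.kron(b, A)))   # A kron b = K_mp (b kron A)
print(np.allclose(np.kron(b, A), commutation_matrix(p, m) @ np.kron(A, b)))   # b kron A = K_pm (A kron b)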
Commutation matrices enjoy several other useful properties that have not been presented in this lecture.
For more details, consult Abadir and Magnus (2005), Harville (2008), and Magnus and Neudecker (2019).
Below you can find some exercises with explained solutions.
Exercise 1 Explicitly write the commutation matrix $K_{32}$.
Solution We use the characterization of $K_{mn}$ as a block matrix whose $(i,j)$-th block is $E_{ij}^{\top}$:
$$K_{32}=\begin{bmatrix}E_{11}^{\top}&E_{12}^{\top}\\E_{21}^{\top}&E_{22}^{\top}\\E_{31}^{\top}&E_{32}^{\top}\end{bmatrix}=\begin{bmatrix}1&0&0&0&0&0\\0&0&0&1&0&0\\0&1&0&0&0&0\\0&0&0&0&1&0\\0&0&1&0&0&0\\0&0&0&0&0&1\end{bmatrix}$$
Exercise 2 Prove that, when the two indices coincide ($m=n$), the trace of the commutation matrix is
$$\operatorname{tr}\left(K_{nn}\right)=n$$
Solution The explicit formula for the vec-permutation matrix becomes
$$K_{nn}=\sum_{i=1}^{n}\sum_{j=1}^{n}\left(E_{ij}\otimes E_{ij}^{\top}\right)$$
Since the trace is a linear operator and the trace of a Kronecker product equals the product of the traces, we have
$$\operatorname{tr}\left(K_{nn}\right)=\sum_{i=1}^{n}\sum_{j=1}^{n}\operatorname{tr}\left(E_{ij}\right)\operatorname{tr}\left(E_{ij}^{\top}\right)$$
The matrices $E_{ij}$ and $E_{ij}^{\top}$ have a non-zero diagonal entry (which is unique and equal to $1$) only when $i=j$. Therefore,
$$\operatorname{tr}\left(K_{nn}\right)=\sum_{i=1}^{n}\operatorname{tr}\left(E_{ii}\right)\operatorname{tr}\left(E_{ii}^{\top}\right)=\sum_{i=1}^{n}1=n$$
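A numerical check of this result, using the illustrative commutation_matrix helper introduced earlier:

import numpy as np

def commutation_matrix(m, n):
    # permutes the rows of the identity so that K @ vec(A) = vec(A.T) for m x n A
    return np.eye(m * n)[[j * m + i for i in range(m) for j in range(n)], :]

for n in range(1, 6):
    print(n, np.trace(commutation_matrix(n, n)))   # the trace of K_nn equals n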
Abadir, K. M., and Magnus, J. R. (2005) Matrix Algebra, Cambridge University Press.
Harville, D. A. (2008) Matrix Algebra From a Statistician's Perspective, Springer.
Magnus, J. R., and Neudecker, H. (2019) Matrix Differential Calculus with Applications in Statistics and Econometrics, Wiley.
Please cite as:
Taboga, Marco (2021). "Commutation matrix", Lectures on matrix algebra. https://www.statlect.com/matrix-algebra/commutation-matrix.