
Jordan form

by Marco Taboga, PhD

A matrix is said to be in Jordan form if 1) its diagonal entries are equal to its eigenvalues; 2) its supradiagonal entries are either zeros or ones; 3) all its other entries are zeros.

We are going to prove that any matrix is similar to a matrix in Jordan form.

In order to understand this lecture, you should be familiar with the concepts introduced in the lectures on generalized eigenvectors and Jordan chains.


Supradiagonal entries

As we have already said, the only non-zero entries of a matrix in Jordan form are located on its main diagonal and on the supradiagonal. The latter is the set of entries that are located immediately above the main diagonal.

Example Define $$J=\begin{bmatrix}0 & 1 & 0\\ 0 & 0 & 1\\ 0 & 0 & 0\end{bmatrix}$$ All the entries of $J$ are zero, except those on the supradiagonal, which are equal to 1.

Jordan block

We start by defining the basic building blocks of a matrix in Jordan form, called Jordan blocks.

Definition A $d\times d$ matrix $J_{d,\lambda}$ is said to be a Jordan block of dimension $d$ and eigenvalue $\lambda$ if and only if its diagonal entries are all equal to $\lambda$, its supradiagonal entries are all equal to 1, and all its other entries are equal to 0.

Thus, a Jordan block is completely specified by its dimension and its eigenvalue.

Example If the dimension is $d=3$ and the eigenvalue is $\lambda=2$, then $$J_{3,2}=\begin{bmatrix}2 & 1 & 0\\ 0 & 2 & 1\\ 0 & 0 & 2\end{bmatrix}$$

A Jordan block of dimension $d=1$ is a scalar, that is, $J_{1,\lambda}=\lambda$.

Note that a Jordan block is upper triangular, and the diagonal entries of an upper triangular matrix are equal to its eigenvalues. This is the reason why $\lambda$ is called the eigenvalue of the Jordan block $J_{d,\lambda}$.
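A Jordan block is easy to build numerically. The sketch below (the helper name `jordan_block` is our own, not part of the lecture) constructs $J_{3,2}$ with NumPy and checks that its eigenvalues coincide with its diagonal entries:

```python
import numpy as np

def jordan_block(d, lam):
    """Return the d x d Jordan block with eigenvalue lam:
    lam on the diagonal, 1 on the supradiagonal, 0 elsewhere."""
    return lam * np.eye(d) + np.diag(np.ones(d - 1), k=1)

J = jordan_block(3, 2.0)
print(J)
# J is upper triangular, so its eigenvalues equal its diagonal entries.
# (J is defective, so numerical eigenvalues carry a tiny rounding error.)
print(np.linalg.eigvals(J))
```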

Jordan blocks and Jordan chains

We now present a first useful fact regarding Jordan blocks.

Proposition Let $A$ be a $K\times K$ matrix. Let $\lambda$ be an eigenvalue of $A$. Let $x$ be a generalized eigenvector of $A$ associated to the eigenvalue $\lambda$. Let $d$ be the smallest integer such that $$(A-\lambda I)^{d}x=0$$ Let $$P=\begin{bmatrix}(A-\lambda I)^{d-1}x & \cdots & (A-\lambda I)x & x\end{bmatrix}$$ be the $K\times d$ matrix whose columns are the vectors of the Jordan chain generated by $x$. Then, $$AP=PJ_{d,\lambda}$$ where $J_{d,\lambda}$ is a Jordan block of dimension $d$ and eigenvalue $\lambda$.

Proof

Denote the vectors of the Jordan chain by $$x_{j}=(A-\lambda I)^{d-j}x\quad\text{for }j=1,\ldots,d$$ so that $P=\begin{bmatrix}x_{1} & \cdots & x_{d}\end{bmatrix}$. Since $(A-\lambda I)x_{1}=(A-\lambda I)^{d}x=0$, we have $Ax_{1}=\lambda x_{1}$; moreover, $(A-\lambda I)x_{j}=x_{j-1}$, so that $Ax_{j}=\lambda x_{j}+x_{j-1}$ for $j=2,\ldots,d$. Therefore, $$AP=\begin{bmatrix}\lambda x_{1} & \lambda x_{2}+x_{1} & \cdots & \lambda x_{d}+x_{d-1}\end{bmatrix}=P\begin{bmatrix}\lambda e_{1} & \lambda e_{2}+e_{1} & \cdots & \lambda e_{d}+e_{d-1}\end{bmatrix}=PJ_{d,\lambda}$$ where $e_{1},\ldots,e_{d}$ are the vectors of the standard basis of the space of $d\times 1$ vectors and we have repeatedly used the rules for multiplying block matrices.
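The proposition can be checked numerically on a small example. The matrix $A$ below is a hypothetical nilpotent $3\times 3$ matrix (only eigenvalue $\lambda=0$), and $x=e_{3}$ generates a Jordan chain of length $d=3$:

```python
import numpy as np

# A hypothetical 3x3 nilpotent matrix: its only eigenvalue is lam = 0.
A = np.array([[0., 1., 1.],
              [0., 0., 1.],
              [0., 0., 0.]])
lam = 0.0
x = np.array([0., 0., 1.])  # generalized eigenvector: (A - lam I)^3 x = 0

N = A - lam * np.eye(3)
# Jordan chain generated by x, ordered (A - lam I)^2 x, (A - lam I) x, x.
P = np.column_stack([N @ N @ x, N @ x, x])

J = lam * np.eye(3) + np.diag(np.ones(2), k=1)  # Jordan block J_{3,0}
print(np.allclose(A @ P, P @ J))  # True: A P = P J_{3,lam}
```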

Powers of Jordan blocks with zero eigenvalue

Let $J_{d,0}$ be a Jordan block with zero eigenvalue. When we post-multiply a $d\times d$ matrix $M$ by $J_{d,0}$, we obtain a matrix whose:

- first column is zero;

- $j$-th column, for $j=2,\ldots,d$, is equal to the $(j-1)$-th column of $M$.

The next example should clarify the reason why this is the case.

Example Let $$M=\begin{bmatrix}M_{\bullet 1} & M_{\bullet 2} & M_{\bullet 3}\end{bmatrix}$$ and $$J_{3,0}=\begin{bmatrix}0 & 1 & 0\\ 0 & 0 & 1\\ 0 & 0 & 0\end{bmatrix}$$ As usual, we can see the columns of $MJ_{3,0}$ as linear combinations of the columns of $M$ with coefficients taken from $J_{3,0}$. Thus, the first column of the product $MJ_{3,0}$ is $$0\cdot M_{\bullet 1}+0\cdot M_{\bullet 2}+0\cdot M_{\bullet 3}=0$$ The second one is $$1\cdot M_{\bullet 1}+0\cdot M_{\bullet 2}+0\cdot M_{\bullet 3}=M_{\bullet 1}$$ and the third one is $$0\cdot M_{\bullet 1}+1\cdot M_{\bullet 2}+0\cdot M_{\bullet 3}=M_{\bullet 2}$$ As a consequence, $$MJ_{3,0}=\begin{bmatrix}0 & M_{\bullet 1} & M_{\bullet 2}\end{bmatrix}$$

Now, write $$J_{d,0}=\begin{bmatrix}0 & e_{1} & e_{2} & \cdots & e_{d-1}\end{bmatrix}$$ where, as before, $e_{1},\ldots,e_{d-1}$ are vectors of the standard basis of the space of $d\times 1$ vectors.

Then, we can use the result just illustrated to derive the powers of the Jordan block: $$J_{d,0}^{2}=\begin{bmatrix}0 & 0 & e_{1} & \cdots & e_{d-2}\end{bmatrix},\quad\ldots,\quad J_{d,0}^{d-1}=\begin{bmatrix}0 & \cdots & 0 & e_{1}\end{bmatrix},\quad J_{d,0}^{d}=0$$

The last equation ($J_{d,0}^{d}=0$) will be repeatedly used below.
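The column-shifting behaviour and the identity $J_{d,0}^{d}=0$ can be verified with NumPy; the matrix $M$ below is an arbitrary example of ours:

```python
import numpy as np

d = 4
J = np.diag(np.ones(d - 1), k=1)  # Jordan block J_{4,0}

# Post-multiplying M by J_{d,0} shifts the columns of M one place to the
# right, so J^k shifts k places and J^d is the zero matrix.
M = np.arange(16.0).reshape(d, d)
shifted = M @ J
print(np.allclose(shifted[:, 0], 0))           # first column is zero
print(np.allclose(shifted[:, 1:], M[:, :-1]))  # remaining columns shifted
print(np.allclose(np.linalg.matrix_power(J, d), 0))  # J^d = 0
```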

Direct sums of matrices

In what follows we are going to use the direct sum notation for matrices, which we have not yet used in these lecture notes.

If $A$ and $B$ are two matrices, then $A\oplus B$ will denote the block-diagonal matrix having $A$ and $B$ as its diagonal blocks: $$A\oplus B=\begin{bmatrix}A & 0\\ 0 & B\end{bmatrix}$$

Note that although the notation is the same, the concept of direct sum of matrices is distinct from that of direct sum of subspaces.

Example Consider the Jordan blocks $J_{2,\lambda_{1}}$ and $J_{2,\lambda_{2}}$. Then, their direct sum is $$J_{2,\lambda_{1}}\oplus J_{2,\lambda_{2}}=\begin{bmatrix}\lambda_{1} & 1 & 0 & 0\\ 0 & \lambda_{1} & 0 & 0\\ 0 & 0 & \lambda_{2} & 1\\ 0 & 0 & 0 & \lambda_{2}\end{bmatrix}$$
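A minimal sketch of the direct sum in NumPy (the function name `direct_sum` is our own; SciPy users can use `scipy.linalg.block_diag` instead):

```python
import numpy as np

def direct_sum(A, B):
    """Block-diagonal direct sum of two matrices."""
    m, n = A.shape
    p, q = B.shape
    out = np.zeros((m + p, n + q))
    out[:m, :n] = A   # top-left block
    out[m:, n:] = B   # bottom-right block
    return out

J1 = np.array([[1., 1.], [0., 1.]])  # J_{2,1}
J2 = np.array([[5., 1.], [0., 5.]])  # J_{2,5}
print(direct_sum(J1, J2))
```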

Definition of Jordan form

We now provide a simple definition of Jordan form.

Definition A matrix $J$ is said to be in Jordan form if and only if it can be written as a direct sum of Jordan blocks: $$J=J_{d_{1},\lambda_{1}}\oplus J_{d_{2},\lambda_{2}}\oplus\cdots\oplus J_{d_{m},\lambda_{m}}$$ where $J_{d_{j},\lambda_{j}}$ is a Jordan block for $j=1,\ldots,m$.

Here is an example.

Example Let [eq22], [eq23], [eq24] be Jordan blocks with [eq25]Then,[eq26]is a matrix in Jordan form.

Every matrix is similar to a matrix in Jordan form

Thanks to the results presented in the previous sections, we can easily derive the main result in this lecture.

Proposition Let $A$ be a $K\times K$ matrix. Then, there exists a $K\times K$ invertible matrix $P$ such that $$P^{-1}AP=J$$ where $J$ is a matrix in Jordan form.

Proof

Let $\lambda_{1},\ldots,\lambda_{m}$ be the distinct eigenvalues of $A$. For each eigenvalue $\lambda_{j}$, choose a basis $B_{j}$ of Jordan chains for the generalized eigenspace associated to $\lambda_{j}$. The existence of a basis $B_{j}$ of Jordan chains for each generalized eigenspace has been proved in the lecture on Jordan chains. Note that each basis can contain more than one chain. Denote by $n(j)$ the number of Jordan chains in $B_{j}$ and by $d_{j,1},\ldots,d_{j,n(j)}$ their lengths. Denote by $P_{j,k}$ the matrix whose columns are the vectors of the $k$-th Jordan chain in $B_{j}$. Define the block matrices $$P_{j}=\begin{bmatrix}P_{j,1} & \cdots & P_{j,n(j)}\end{bmatrix}$$ Then, $$AP_{j}=\begin{bmatrix}AP_{j,1} & \cdots & AP_{j,n(j)}\end{bmatrix}\overset{(a)}{=}\begin{bmatrix}P_{j,1}J_{d_{j,1},\lambda_{j}} & \cdots & P_{j,n(j)}J_{d_{j,n(j)},\lambda_{j}}\end{bmatrix}=P_{j}\left(J_{d_{j,1},\lambda_{j}}\oplus\cdots\oplus J_{d_{j,n(j)},\lambda_{j}}\right)$$ where in step $(a)$ we have used the result about Jordan blocks and Jordan chains derived previously. We can now define $$P=\begin{bmatrix}P_{1} & \cdots & P_{m}\end{bmatrix}$$ We have $$AP=PJ$$ where $$J=\bigoplus_{j=1}^{m}\left(J_{d_{j,1},\lambda_{j}}\oplus\cdots\oplus J_{d_{j,n(j)},\lambda_{j}}\right)$$ is a matrix in Jordan form, being the direct sum of Jordan blocks. Since the generalized eigenspaces form a direct sum, the union of their bases is a linearly independent set. Therefore, the columns of $P$ are linearly independent and $P$ is invertible. Hence, we can rewrite the equation $AP=PJ$ that we have just derived as $$P^{-1}AP=J$$

Note that the columns of the change-of-basis matrix $P$ that was built in the proof are generalized eigenvectors of $A$ forming a basis for the space of $K\times 1$ vectors.

The matrix $J$ in Jordan form, being a direct sum of upper triangular matrices, is itself an upper triangular matrix. As such, its diagonal elements are equal to its eigenvalues.

In turn, since A and $J$ are similar, they have the same eigenvalues. Hence, after performing a similarity transformation that transforms A into a matrix $J$ in Jordan form, we can read the eigenvalues of A on the main diagonal of $J$.
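The decomposition can also be computed symbolically. SymPy's `Matrix.jordan_form` returns a pair $(P,J)$ with $A=PJP^{-1}$; the $4\times 4$ matrix below is just an illustrative choice of ours:

```python
from sympy import Matrix

# An illustrative 4x4 matrix (our own choice, not from the lecture).
A = Matrix([[5, 4, 2, 1],
            [0, 1, -1, -1],
            [-1, -1, 3, 0],
            [1, 1, -1, 2]])
# jordan_form returns (P, J) with A = P * J * P**-1 and J in Jordan form.
P, J = A.jordan_form()
print(J)                      # eigenvalues of A appear on the diagonal of J
print(A == P * J * P.inv())   # True
```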

Change-of-basis matrix

In the previous proposition we have shown that a matrix $P$ of generalized eigenvectors can be used as a change-of-basis matrix to transform A into a similar matrix $J$ in Jordan form.

Something more is true: all the change-of-basis matrices that transform A into a matrix $J$ in Jordan form are matrices of generalized eigenvectors.

Proposition Let $A$ and $P$ be $K\times K$ matrices. Let $P$ be invertible. Let $$J=P^{-1}AP$$ If $J$ is in Jordan form, then the columns of $P$ are generalized eigenvectors of $A$.

Proof

By the definition of $J$, we have $$AP=PJ$$ Let $P_{\bullet j}$ be the $j$-th column of $P$. Then, $$AP_{\bullet j}=PJ_{\bullet j}$$ Two cases are possible: 1) $J_{\bullet j}=\lambda e_{j}$, where $\lambda$ is an eigenvalue of $J$ and $e_{j}$ is the $j$-th vector of the standard basis; 2) $J_{\bullet j}=\lambda e_{j}+e_{j-1}$. In the first case, $$AP_{\bullet j}=\lambda Pe_{j}=\lambda P_{\bullet j}$$ which implies that $P_{\bullet j}$ is an eigenvector of $A$. In the second case, $$AP_{\bullet j}=\lambda P_{\bullet j}+P_{\bullet j-1}$$ or $$(A-\lambda I)P_{\bullet j}=P_{\bullet j-1}$$ If $P_{\bullet j-1}$ is an eigenvector, then $$(A-\lambda I)^{2}P_{\bullet j}=(A-\lambda I)P_{\bullet j-1}=0$$ and $P_{\bullet j}$ is a generalized eigenvector. If not, then we are in case 2). By recursively applying the same reasoning, we conclude that $P_{\bullet j}$ is a generalized eigenvector.

Note that, trivially, $$I^{-1}JI=J$$

Therefore, the columns of the identity matrix $I$, which are the vectors of the standard basis, are generalized eigenvectors of a matrix in Jordan form.
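This can be seen concretely with a small numeric check; the matrix $J$ below, a direct sum of $J_{2,3}$ and $J_{1,5}$, is our own example:

```python
import numpy as np

# J in Jordan form: direct sum of J_{2,3} and J_{1,5}.
J = np.array([[3., 1., 0.],
              [0., 3., 0.],
              [0., 0., 5.]])
e1, e2, e3 = np.eye(3)  # standard basis vectors

# Each standard basis vector is a generalized eigenvector of J:
print(np.allclose((J - 3 * np.eye(3)) @ e1, 0))  # e1 is an eigenvector
print(np.allclose(np.linalg.matrix_power(J - 3 * np.eye(3), 2) @ e2, 0))
print(np.allclose((J - 5 * np.eye(3)) @ e3, 0))  # e3 is an eigenvector
```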

Minimal polynomial

The minimal polynomial of a matrix in Jordan form is easily derived as follows.

Proposition Let $J$ be a matrix in Jordan form whose distinct eigenvalues are $\lambda_{1},\ldots,\lambda_{m}$. For each $\lambda_{j}$, let $\nu_{j}$ be the dimension of the largest Jordan block of $J$ having eigenvalue $\lambda_{j}$. Then, the minimal polynomial of $J$ is $$p(\lambda)=(\lambda-\lambda_{1})^{\nu_{1}}(\lambda-\lambda_{2})^{\nu_{2}}\cdots(\lambda-\lambda_{m})^{\nu_{m}}$$

Proof

We first show that $p$ is an annihilating polynomial for $J$. As before, suppose that $$J=\bigoplus_{j=1}^{m}\left(J_{d_{j,1},\lambda_{j}}\oplus\cdots\oplus J_{d_{j,n(j)},\lambda_{j}}\right)$$ so that $$\nu_{l}=\max\left(d_{l,1},\ldots,d_{l,n(l)}\right)$$ for $l=1,\ldots,m$. Moreover, we have $$J_{d,\lambda_{l}}-\lambda_{l}I=J_{d,0}$$ and $$\left(J_{d,\lambda_{l}}-\lambda_{l}I\right)^{\nu_{l}}=J_{d,0}^{\nu_{l}}$$ In fact, each block $$\left(J_{d_{l,k},\lambda_{l}}-\lambda_{l}I\right)^{\nu_{l}}=J_{d_{l,k},0}^{\nu_{l}}$$ is equal to zero by the previously derived result on the powers of Jordan blocks with zero eigenvalue (because $\nu_{l}\geq d_{l,k}$). Thus, all the diagonal blocks of $\left(J-\lambda_{l}I\right)^{\nu_{l}}$ corresponding to the Jordan blocks of $J$ with eigenvalue $\lambda_{l}$ are equal to zero. As a consequence, each diagonal block of the matrix $$p(J)=\left(J-\lambda_{1}I\right)^{\nu_{1}}\cdots\left(J-\lambda_{m}I\right)^{\nu_{m}}$$ is the product of diagonal blocks, at least one of which is zero. Hence, $$p(J)=0$$ The polynomial $p$ is not only annihilating but also monic. Suppose that $p$ is not the minimal polynomial. Then, there exists an annihilating polynomial $q$ that has lower degree than $p$ and divides $p$. Suppose that $$q(\lambda)=(\lambda-\lambda_{1})^{\mu_{1}}\cdots(\lambda-\lambda_{m})^{\mu_{m}}$$ where $\mu_{j}\leq\nu_{j}$ for $j=1,\ldots,m$ and there exists an index $l$ such that $\mu_{l}<\nu_{l}$. Without loss of generality, we can assume that $l=m$ (otherwise, we can change the order of the factors). Thus, there is a diagonal block of $\left(J-\lambda_{m}I\right)^{\mu_{m}}$, equal to $J_{\nu_{m},0}^{\mu_{m}}$, that is different from zero (because, as explained above, we need to raise a $\nu_{m}\times\nu_{m}$ Jordan block with eigenvalue zero at least to the $\nu_{m}$-th power to get the zero matrix). All the blocks of $\left(J-\lambda_{1}I\right)^{\mu_{1}}$, ..., $\left(J-\lambda_{m-1}I\right)^{\mu_{m-1}}$ corresponding to said non-zero block have non-zero entries on their diagonals. Therefore $$q(J)\neq 0$$ which contradicts the hypothesis that $q$ is an annihilating polynomial. Hence, $p$ is the minimal polynomial.

Since similar matrices have the same minimal polynomial, we can derive the minimal polynomial of a matrix A by first finding a matrix $J$ in Jordan form that is similar to A and then using the above proposition to find the minimal polynomial.
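The proposition, and the failure of lower-degree candidates, can be verified numerically on an example of ours with blocks $J_{2,3}$, $J_{1,3}$ and $J_{2,5}$, so that the minimal polynomial should be $(\lambda-3)^{2}(\lambda-5)^{2}$:

```python
import numpy as np

def jordan_block(d, lam):
    return lam * np.eye(d) + np.diag(np.ones(d - 1), k=1)

# J with blocks J_{2,3}, J_{1,3}, J_{2,5}: the largest blocks for
# eigenvalues 3 and 5 both have dimension 2.
blocks = [jordan_block(2, 3.0), jordan_block(1, 3.0), jordan_block(2, 5.0)]
n = 5
J = np.zeros((n, n))
pos = 0
for B in blocks:  # assemble the direct sum along the diagonal
    d = B.shape[0]
    J[pos:pos + d, pos:pos + d] = B
    pos += d

I = np.eye(n)
mp = np.linalg.matrix_power
p_of_J = mp(J - 3 * I, 2) @ mp(J - 5 * I, 2)
q_of_J = (J - 3 * I) @ mp(J - 5 * I, 2)  # lower-degree candidate
print(np.allclose(p_of_J, 0))  # True: p annihilates J
print(np.allclose(q_of_J, 0))  # False: dropping one factor fails
```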

Solved exercises

Below you can find some exercises with explained solutions.

Exercise 1

Find the minimal polynomial of[eq67]

Solution

The Jordan blocks of $J$ are[eq68]Hence, the minimal polynomial of $J$ is[eq69]

Exercise 2

Find the minimal polynomial of[eq70]

Solution

The Jordan blocks of $J$ are[eq71]Therefore, the minimal polynomial of $J$ is[eq72]

How to cite

Please cite as:

Taboga, Marco (2021). "Jordan form", Lectures on matrix algebra. https://www.statlect.com/matrix-algebra/Jordan-form.
