
Jordan chain

by Marco Taboga, PhD

A Jordan chain is a set of generalized eigenvectors that are obtained by repeatedly applying a nilpotent operator to the same vector.

In order to understand this lecture, you should be familiar with the concepts introduced in the lectures on cyclic subspaces and generalized eigenvectors.


Definition

Here is a formal definition.

Definition Let A be a $K\times K$ matrix. Let $\lambda$ be an eigenvalue of A. Let x be a generalized eigenvector of A associated to the eigenvalue $\lambda$. Let k be the smallest integer such that
$$(A-\lambda I)^{k}x=0$$
Then, the set of vectors
$$x,\ (A-\lambda I)x,\ (A-\lambda I)^{2}x,\ \ldots,\ (A-\lambda I)^{k-1}x$$
is called the Jordan chain generated by x.
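To make the definition concrete, here is a minimal NumPy sketch that builds a Jordan chain by repeatedly applying $A-\lambda I$ to a generalized eigenvector. The matrix, the eigenvalue $\lambda =3$, and the starting vector are hypothetical choices (a single $3\times 3$ Jordan block), used only for illustration.

```python
import numpy as np

# Hypothetical example: a 3x3 matrix whose only eigenvalue is 3
# (a single Jordan block).
A = np.array([[3.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 3.0]])
lam = 3.0
x = np.array([0.0, 0.0, 1.0])   # a generalized eigenvector of rank 3 (k = 3)

def jordan_chain(A, lam, x, tol=1e-12):
    """Return [x, (A - lam*I)x, ..., (A - lam*I)^(k-1) x], where k is the
    smallest integer with (A - lam*I)^k x = 0."""
    N = A - lam * np.eye(A.shape[0])
    chain, v = [], x
    while np.linalg.norm(v) > tol:   # stops after at most K steps when x is a generalized eigenvector
        chain.append(v)
        v = N @ v
    return chain

chain = jordan_chain(A, lam, x)
# Here the chain has length k = 3: [e3, e2, e1].
```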

Jordan chains enjoy a number of useful properties that are presented in the propositions below. We invite the reader to try and prove those propositions as an exercise before reading the proofs that are provided.

All the vectors of a chain are generalized eigenvectors

A Jordan chain is a set of generalized eigenvectors.

Proposition Let A be a $K\times K$ matrix. Let x be a generalized eigenvector of A associated to the eigenvalue $\lambda$. Then, all the vectors of the Jordan chain generated by x are generalized eigenvectors of A.

Proof

Let k be the smallest integer such that
$$(A-\lambda I)^{k}x=0$$
We can equivalently write
$$(A-\lambda I)^{k-j}\left[ (A-\lambda I)^{j}x\right] =0$$
for $j=1,\ldots ,k-1$. Moreover,
$$(A-\lambda I)^{j}x\neq 0$$
by the definition of k, for $j=1,\ldots ,k-1$. Therefore, the vectors $(A-\lambda I)^{j}x$, which are the vectors of the Jordan chain, satisfy the definition of generalized eigenvectors of A.

The last vector of a chain is an eigenvector

A Jordan chain has another elementary property.

Proposition Let A be a $K\times K$ matrix. Let x be a generalized eigenvector of A associated to the eigenvalue $\lambda$. Then, the last vector of the Jordan chain generated by x (i.e., $(A-\lambda I)^{k-1}x$, where k is the length of the chain) is an eigenvector of A.

Proof

Let k be the smallest integer such that
$$(A-\lambda I)^{k}x=0$$
Then,
$$(A-\lambda I)\left[ (A-\lambda I)^{k-1}x\right] =0$$
and
$$(A-\lambda I)^{k-1}x\neq 0$$
Thus, $(A-\lambda I)^{k-1}x$ is an eigenvector of A.
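Continuing with the same hypothetical $3\times 3$ Jordan-block example used above, the following sketch checks numerically that the last vector of the chain is non-zero and satisfies the eigenvector equation.

```python
import numpy as np

# Same hypothetical 3x3 Jordan block as in the earlier sketch.
A = np.array([[3.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 3.0]])
lam = 3.0
N = A - lam * np.eye(3)
x = np.array([0.0, 0.0, 1.0])            # generalized eigenvector with k = 3

last = np.linalg.matrix_power(N, 2) @ x  # (A - lam*I)^(k-1) x
assert np.linalg.norm(last) > 0          # the last vector is non-zero ...
assert np.allclose(A @ last, lam * last) # ... and A last = lam * last: an ordinary eigenvector
```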

Using eigenvectors to form a Jordan chain

In light of the last proposition, we can also see a Jordan chain as generated by 1) starting from an eigenvector of A (which occupies the rightmost position in the chain) and 2) iteratively solving systems of equations so as to move leftward in the chain.

Proposition Let A be a $K\times K$ matrix and $\lambda$ one of its eigenvalues. Let $y_{0}$ be an eigenvector associated to $\lambda$. Let k be the largest integer such that all the systems
$$(A-\lambda I)y_{j}=y_{j-1}$$
have a solution for $j=1,\ldots ,k-1$. Then,
$$y_{k-1},\ y_{k-2},\ \ldots,\ y_{1},\ y_{0}$$
is the Jordan chain generated by $y_{k-1}$.

Proof

Note that all the vectors of the chain are non-zero because the solution $y_{j}$ of
$$(A-\lambda I)y_{j}=y_{j-1}$$
cannot be zero when $y_{j-1}$ is non-zero, and the starting value $y_{0}$ is non-zero because it is an eigenvector. The $j$-th vector of the chain satisfies
$$(A-\lambda I)^{j}y_{k-1}=y_{k-1-j}$$
We have
$$(A-\lambda I)^{k}y_{k-1}=(A-\lambda I)y_{0}=0$$
because $y_{0}$ is an eigenvector of A associated to $\lambda$. Therefore, $y_{k-1}$ is a generalized eigenvector of A. Moreover, k is the smallest integer such that
$$(A-\lambda I)^{k}y_{k-1}=0$$
because the vectors $y_{0},y_{1},\ldots ,y_{k-2}$, which are the values taken by $(A-\lambda I)^{j}y_{k-1}$ for $j=1,\ldots ,k-1$, are non-zero. Thus, $y_{k-1},\ y_{k-2},\ \ldots,\ y_{0}$ is the Jordan chain generated by $y_{k-1}$.
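The backward construction can also be sketched numerically. The sketch below assumes the same hypothetical Jordan-block matrix used above and relies on a least-squares solver to test whether each system $(A-\lambda I)y_{j}=y_{j-1}$ has a solution; when it does not, the chain is complete.

```python
import numpy as np

# Backward generation sketch (hypothetical 3x3 Jordan block, eigenvalue 3).
A = np.array([[3.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 3.0]])
lam = 3.0
N = A - lam * np.eye(3)

y = [np.array([1.0, 0.0, 0.0])]                  # y_0: an ordinary eigenvector of A
for _ in range(A.shape[0]):                      # a chain can never be longer than K
    sol, *_ = np.linalg.lstsq(N, y[-1], rcond=None)
    if np.linalg.norm(N @ sol - y[-1]) > 1e-10:  # the system has no solution: stop
        break
    y.append(sol)

chain = y[::-1]   # [y_{k-1}, ..., y_1, y_0]: the Jordan chain generated by y_{k-1}
```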

A Jordan chain obtained in this manner, by starting from an eigenvector and working backwards, is called a backwardly-generated Jordan chain.

Clearly, a backwardly-generated chain is the longest chain ending with the eigenvector used to generate it.

Nilpotent operator

In what follows, it will be useful to consider the linear operator
$$f:N\left( (A-\lambda I)^{K}\right) \rightarrow N\left( (A-\lambda I)^{K}\right)$$
defined by
$$f(x)=(A-\lambda I)x$$
for any $x\in N\left( (A-\lambda I)^{K}\right)$.

Note that the operator is well-defined (in particular, $N\left( (A-\lambda I)^{K}\right)$ can be taken to be the codomain of $f$) because of the invariance property discussed in the lecture on generalized eigenvectors.

Note that $N\left( (A-\lambda I)^{K}\right)$ is the generalized eigenspace associated to $\lambda$.

Importantly, $f$ is a nilpotent operator because
$$f^{K}(x)=(A-\lambda I)^{K}x=0$$
for any $x\in N\left( (A-\lambda I)^{K}\right)$, by the very definition of $N\left( (A-\lambda I)^{K}\right)$.
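A small numerical check of these two properties (invariance of the generalized eigenspace and nilpotency of $f$ on it) is sketched below; the $4\times 4$ matrix, with a $2\times 2$ Jordan block for $\lambda =3$ and two additional simple eigenvalues, is purely illustrative.

```python
import numpy as np

# Hypothetical 4x4 matrix: a 2x2 Jordan block for lambda = 3 plus two
# simple eigenvalues 5 and 7 (these numbers are illustrative only).
A = np.array([[3.0, 1.0, 0.0, 0.0],
              [0.0, 3.0, 0.0, 0.0],
              [0.0, 0.0, 5.0, 0.0],
              [0.0, 0.0, 0.0, 7.0]])
lam, K = 3.0, 4
N = A - lam * np.eye(K)

# A basis of the generalized eigenspace of lambda = 3: here the first two
# standard basis vectors.
B = np.eye(K)[:, :2]

# Invariance: f maps the generalized eigenspace into itself ...
assert np.allclose((N @ B)[2:, :], 0)
# ... and f is nilpotent on it: (A - lam*I)^K annihilates every vector of the
# generalized eigenspace (in this example, the power 2 already suffices).
assert np.allclose(np.linalg.matrix_power(N, K) @ B, 0)
```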

Jordan chains as cycles

Having defined the nilpotent operator $f$, we can view a Jordan chain
$$x,\ (A-\lambda I)x,\ \ldots,\ (A-\lambda I)^{k-1}x$$
as a cycle
$$x,\ f(x),\ f^{2}(x),\ \ldots,\ f^{k-1}(x)$$
and we can use the previously introduced theory of cycles to derive further important properties of Jordan chains.

Linear independence

We now present the first straightforward applications of the theory of cycles to Jordan chains.

Proposition A Jordan chain is a set of linearly independent vectors.

Proof

A Jordan chain is a cycle generated by applying increasing powers of a nilpotent operator to a non-zero vector, and the vectors of such a cycle are linearly independent.
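For instance, in the hypothetical $3\times 3$ Jordan-block example used above, stacking the chain vectors into a matrix and computing its rank confirms their linear independence.

```python
import numpy as np

# Hypothetical check: the chain built from the earlier 3x3 Jordan-block
# example consists of linearly independent vectors.
A = np.array([[3.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 3.0]])
N = A - 3.0 * np.eye(3)
x = np.array([0.0, 0.0, 1.0])

chain = np.column_stack([x, N @ x, N @ N @ x])
assert np.linalg.matrix_rank(chain) == chain.shape[1]   # 3 independent vectors
```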

Proposition Let A be a $K\times K$ matrix. Let $\lambda$ be an eigenvalue of A. Let $x_{1},\ldots ,x_{m}$ be generalized eigenvectors associated to $\lambda$. Let $e_{1},\ldots ,e_{m}$ be the final vectors of the Jordan chains $J_{1},\ldots ,J_{m}$ generated by $x_{1},\ldots ,x_{m}$; these final vectors are ordinary eigenvectors of A (as demonstrated above). If the set
$$e_{1},\ldots ,e_{m}$$
is linearly independent, then also the set
$$J_{1}\cup J_{2}\cup \ldots \cup J_{m}$$
is linearly independent.

Proof

This is just a re-statement of the analogous proposition for cycles.

Basis for the generalized eigenspace

The next proposition shows that Jordan chains can be used to form a basis for the generalized eigenspace corresponding to a given eigenvalue.

Proposition Let A be a $K\times K$ matrix. Let $\lambda$ be an eigenvalue of A. Then, there exist generalized eigenvectors $x_{1},\ldots ,x_{m}$ associated to $\lambda$ such that
$$J_{1}\cup J_{2}\cup \ldots \cup J_{m}$$
is a basis for the generalized eigenspace $N\left( (A-\lambda I)^{K}\right)$, where $J_{j}$ denotes the Jordan chain generated by $x_{j}$.

Proof

Let $f$ be the nilpotent operator defined above. From the theory of cycles, we know that there is a non-overlapping union of cycles that forms a basis for the domain of $f$. But the domain of $f$ is the generalized eigenspace and $f$-cycles are Jordan chains. Hence the stated result.
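The following sketch illustrates the proposition on a hypothetical $3\times 3$ matrix with two Jordan chains for the single eigenvalue $3$: the union of the two chains has full rank and is therefore a basis for the generalized eigenspace (which, in this example, is the whole space).

```python
import numpy as np

# Hypothetical matrix: two Jordan blocks (sizes 2 and 1) for the single
# eigenvalue 3, so the generalized eigenspace is the whole of R^3.
A = np.array([[3.0, 1.0, 0.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 3.0]])
N = A - 3.0 * np.eye(3)

# Chain generated by x1 = e2 (length 2) and chain generated by x2 = e3 (length 1).
x1 = np.array([0.0, 1.0, 0.0])
x2 = np.array([0.0, 0.0, 1.0])
chain1 = [x1, N @ x1]
chain2 = [x2]

B = np.column_stack(chain1 + chain2)
# The union of the two chains is a basis for the generalized eigenspace.
assert np.linalg.matrix_rank(B) == 3
```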

Length of the chains

From previous lectures, we know that
$$N\left( (A-\lambda I)^{K}\right) =N\left( (A-\lambda I)^{\nu }\right)$$
where $\nu \leq K$ and $\nu$ is the exponent corresponding to the eigenvalue $\lambda$ in the minimal polynomial of A.

We also know that the highest-ranking generalized eigenvector associated to $\lambda$ has rank $\nu$. In other words, there is no vector x satisfying
$$(A-\lambda I)^{k}x=0,\qquad (A-\lambda I)^{k-1}x\neq 0$$
for $k>\nu$, but there is at least one vector x satisfying the two conditions above for $k=\nu$, which generates the Jordan chain
$$x,\ (A-\lambda I)x,\ \ldots,\ (A-\lambda I)^{\nu -1}x$$

Hence, $\nu$ is the length of the longest Jordan chain formed by the generalized eigenvectors associated to $\lambda$ (beyond being the index of nilpotency of $f$ and the index of the matrix $A-\lambda I$).
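Numerically, $\nu$ can be found as the smallest power at which the rank of $(A-\lambda I)^{k}$ stops decreasing (equivalently, at which the null space stops growing); the sketch below does this for the same illustrative $4\times 4$ matrix used in the nilpotent-operator sketch above.

```python
import numpy as np

# Hypothetical 4x4 example: one 2x2 Jordan block for lambda = 3, plus the
# simple eigenvalues 5 and 7.
A = np.array([[3.0, 1.0, 0.0, 0.0],
              [0.0, 3.0, 0.0, 0.0],
              [0.0, 0.0, 5.0, 0.0],
              [0.0, 0.0, 0.0, 7.0]])
lam = 3.0
N = A - lam * np.eye(4)

rank = np.linalg.matrix_rank
nu = 1
while rank(np.linalg.matrix_power(N, nu)) > rank(np.linalg.matrix_power(N, nu + 1)):
    nu += 1
print(nu)   # 2: the longest Jordan chain associated to lambda = 3 has length 2
```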

Multiple chains

As proved in previous lectures, the dimension of the generalized eigenspace
$$N\left( (A-\lambda I)^{\nu }\right)$$
is equal to the algebraic multiplicity of $\lambda$, denoted by $\mu$ in what follows.

The integer $\mu$ is the exponent corresponding to $\lambda$ in the characteristic polynomial. We know that $\nu$ can be smaller than $\mu$. Therefore, a single Jordan chain (whose maximum length is $\nu$) may not be enough to span the whole generalized eigenspace. This is the reason why the previous proposition guarantees that a basis of the generalized eigenspace can be formed by merging multiple Jordan chains.

We urge the reader to carefully think about all the "characteristics" of a generalized eigenspace:

- the algebraic multiplicity $\mu$ of $\lambda$, which is the exponent of the factor $(z-\lambda)$ in the characteristic polynomial and the dimension of the generalized eigenspace;

- the exponent $\nu$ of the factor $(z-\lambda)$ in the minimal polynomial, which is the length of the longest Jordan chain;

- the geometric multiplicity of $\lambda$, which is the maximum number of linearly independent eigenvectors and, therefore, the number of chains in a basis formed by Jordan chains (because distinct chains in such a basis end with linearly independent eigenvectors).

When we know some of these characteristics, we can often deduce other interesting facts about the generalized eigenspace, as the following example shows.

Example Let A be a $4\times 4$ matrix with characteristic polynomial
$$(z-3)^{3}(z-\lambda_{2})$$
and minimal polynomial
$$(z-3)^{2}(z-\lambda_{2})$$
where $\lambda_{2}\neq 3$ is the other eigenvalue of A. Thus, A has an eigenvalue $\lambda =3$ with algebraic multiplicity $\mu =3$. We know that the geometric multiplicity of $\lambda$ must be less than $3$ because otherwise the exponent of the linear factor $\left( z-3\right)$ in the minimal polynomial would be 1 instead of $\nu =2$ (as explained in the lecture on the Primary Decomposition Theorem). In other words, $\lambda$ is defective. Can its geometric multiplicity be 1? If it were, any couple of eigenvectors would be linearly dependent. Therefore, any basis for $N\left( (A-3I)^{2}\right)$ would include only one eigenvector. As a consequence, it would be formed by a single Jordan chain (because two or more independent chains terminate with linearly independent eigenvectors). But any basis of $N\left( (A-3I)^{2}\right)$ is formed by $\mu =3$ linearly independent generalized eigenvectors. Therefore, the length of the chain would need to be equal to $3$. This is impossible because the maximum length of a chain is $\nu =2$. As a consequence, the geometric multiplicity of $\lambda$ must be equal to $2$.
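A hypothetical matrix with exactly these characteristics can be used to confirm the deduction numerically; the second eigenvalue is set to $5$ purely for illustration (the example above does not specify it).

```python
import numpy as np

# Hypothetical matrix matching the example's characteristics: a 2x2 Jordan
# block and a 1x1 block for lambda = 3 (mu = 3, nu = 2), plus one simple
# eigenvalue, taken to be 5 only for concreteness.
A = np.array([[3.0, 1.0, 0.0, 0.0],
              [0.0, 3.0, 0.0, 0.0],
              [0.0, 0.0, 3.0, 0.0],
              [0.0, 0.0, 0.0, 5.0]])
N = A - 3.0 * np.eye(4)

geometric = 4 - np.linalg.matrix_rank(N)                                   # dimension of the eigenspace
nu = 1 if np.linalg.matrix_rank(N) == np.linalg.matrix_rank(N @ N) else 2  # index of A - 3I in this example
mu = 4 - np.linalg.matrix_rank(np.linalg.matrix_power(N, nu))              # dimension of the generalized eigenspace
print(geometric, nu, mu)   # 2 2 3, as deduced in the example
```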

Primary decomposition and Jordan chains

It is now time to revisit once again the Primary Decomposition Theorem.

Let $S$ be the space of all $K\times 1$ vectors and A a $K\times K$ matrix.

According to the Primary Decomposition Theorem, the vector space $S$ can be written as a direct sum of generalized eigenspaces:
$$S=N\left( (A-\lambda_{1}I)^{\nu_{1}}\right) \oplus N\left( (A-\lambda_{2}I)^{\nu_{2}}\right) \oplus \ldots \oplus N\left( (A-\lambda_{m}I)^{\nu_{m}}\right)$$
where $\lambda_{1},\ldots ,\lambda_{m}$ are the distinct eigenvalues of A and $\nu_{1},\ldots ,\nu_{m}$ are the strictly positive integers that appear as exponents in the minimal polynomial.

Thus, a basis for $S$ can be formed as a union of bases for the generalized eigenspaces. In turn, each of the generalized eigenspaces has a basis formed by a union of Jordan chains (as proved above). By putting these two facts together, we obtain the following proposition.

Proposition Let $S$ be the space of all $K\times 1$ vectors. Let A be a $K\times K$ matrix. Then, there exists a basis for $S$ formed by Jordan chains generated by the generalized eigenvectors of A.

It is important to note that the basis in this proposition comprises at least one Jordan chain for each eigenvalue, but more than one chain may be necessary to span some generalized eigenspaces, as discussed previously. So, the total number of chains in the basis may exceed the number of eigenvalues.
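As a final numerical illustration, the sketch below verifies the dimension count behind this proposition for the same hypothetical $4\times 4$ matrix used in the previous sketch: the dimensions of the generalized eigenspaces add up to $K$, so the union of Jordan-chain bases for the generalized eigenspaces has the right number of vectors to be a basis for $S$.

```python
import numpy as np

# Hypothetical check of the dimension count, using the same illustrative
# 4x4 matrix as above (eigenvalues 3 and 5).
A = np.array([[3.0, 1.0, 0.0, 0.0],
              [0.0, 3.0, 0.0, 0.0],
              [0.0, 0.0, 3.0, 0.0],
              [0.0, 0.0, 0.0, 5.0]])
K = A.shape[0]

dims = []
for lam in [3.0, 5.0]:   # distinct eigenvalues of this illustrative matrix
    # Since nu <= K, the null space of (A - lam*I)^K is the generalized eigenspace.
    P = np.linalg.matrix_power(A - lam * np.eye(K), K)
    dims.append(K - np.linalg.matrix_rank(P))
assert sum(dims) == K   # the generalized eigenspaces together span the whole space
```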

How to cite

Please cite as:

Taboga, Marco (2021). "Jordan chain", Lectures on matrix algebra. https://www.statlect.com/matrix-algebra/Jordan-chain.
