
Invariant subspace

by Marco Taboga, PhD

A subspace is said to be invariant under a linear operator if its elements are transformed by the linear operator into elements belonging to the subspace itself.

The kernel of an operator, its range, and the eigenspace associated with an eigenvalue of a matrix are prominent examples of invariant subspaces.

The search for invariant subspaces is one of the most important themes in linear algebra. The reason is simple: as we will see below, the matrix representation of an operator with respect to a basis is greatly simplified (i.e., it becomes block-triangular or block-diagonal) if some of the vectors of the basis span an invariant subspace.


Definition

Remember that, given a vector space $S$, a linear operator is a function $f:S\rightarrow S$ that preserves linear combinations, that is,$$f\left( \alpha_{1}s_{1}+\alpha_{2}s_{2}\right) =\alpha_{1}f\left( s_{1}\right) +\alpha_{2}f\left( s_{2}\right)$$for any couple of vectors $s_{1},s_{2}\in S$ and any couple of scalars $\alpha_{1},\alpha_{2}$.

Definition Let $S$ be a vector space and $f:S\rightarrow S$ a linear operator. Let $U$ be a subspace of $S$. We say that $U$ is invariant under $f$ if and only if $f\left( u\right) \in U$ for any $u\in U$.

In other words, if $U$ is invariant under $f$, then the restriction of $f$ to $U$, denoted by $f_{|U}$, is a linear operator on $U$ (i.e., $f_{|U}:U\rightarrow U$).

Example Let $S$ be the space of $2\times 1$ vectors. Let $U$ be the subspace spanned by the vector[eq6]In other words, all the vectors of $U$ have the form[eq7]where $\alpha$ is a scalar. Suppose that a linear operator $f:S\rightarrow S$ is such that[eq8]Then, whenever $u\in U$, we have $f\left( u\right) \in U$. Therefore, $U$ is an invariant subspace under $f$.
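The following minimal sketch illustrates the idea numerically. The matrix and the spanning vector are hypothetical choices made here for illustration (they are not the ones used in the example above): the span of $(1,0)$ is invariant because the operator maps it into a multiple of itself.

```python
import numpy as np

# Hypothetical illustration: U = span{(1, 0)} is invariant under f(s) = A s
# because the second entry of the first column of A is zero, so A maps any
# multiple of (1, 0) into another multiple of (1, 0).
A = np.array([[2.0, 5.0],
              [0.0, 3.0]])
u0 = np.array([1.0, 0.0])          # spanning vector of U

u = 7.0 * u0                       # an arbitrary element of U
image = A @ u                      # f(u)

print(image)                       # [14.  0.] -> still a multiple of u0
print(np.isclose(image[1], 0.0))   # True: f(u) belongs to U
```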

The kernel of an operator is invariant

The kernel of a linear operator $f:S\rightarrow S$ is the subspace$$\mathrm{null}\,f=\left\{ s\in S:f\left( s\right) =0\right\}$$

Since $0\in \mathrm{null}\,f$ and all the elements of $\mathrm{null}\,f$ are mapped into $0$ by the operator $f$, the kernel $\mathrm{null}\,f$ is invariant under $f$.
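As a minimal numerical sketch (with a hypothetical matrix), we can compute a basis of the kernel via the singular value decomposition and check that the operator maps kernel vectors to the zero vector, which itself belongs to the kernel:

```python
import numpy as np

# Hypothetical illustration: a basis of null(f) obtained from the SVD of A,
# and a check that f maps kernel vectors to 0, which lies in the kernel.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [0.0, 0.0, 1.0]])

_, s, Vt = np.linalg.svd(A)
null_basis = Vt[s < 1e-10]      # rows of Vt associated with zero singular values

for v in null_basis:
    print(np.allclose(A @ v, 0.0))   # True: f(v) = 0 and 0 is in null(f)
```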

The range of an operator is invariant

The range of a linear operator $f:S\rightarrow S$ is the subspace$$\mathrm{range}\,f=\left\{ s\in S:s=f\left( r\right) \text{ for some }r\in S\right\}$$

Since $\mathrm{range}\,f\subseteq S$, any $s\in \mathrm{range}\,f$ is mapped by $f$ into $\mathrm{range}\,f$. Therefore, the range is invariant.
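A minimal numerical sketch (again with a hypothetical matrix): any element of the range has the form $f(r)=Ar$, and its image under $f$ is again of that form, so it stays in the range.

```python
import numpy as np

# Hypothetical illustration: s = A r belongs to range(f); its image
# A s = A (A r) is again of the form A r', hence also belongs to range(f).
A = np.array([[1.0, 0.0],
              [2.0, 0.0]])
r = np.array([3.0, -1.0])

s = A @ r            # an element of range(f)
image = A @ s        # f(s), which equals A r' with r' = s

# confirm that the image lies in the column space of A (zero residual)
coeffs, *_ = np.linalg.lstsq(A, image, rcond=None)
print(np.allclose(A @ coeffs, image))   # True
```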

The eigenspace of an eigenvalue is invariant

Let $S$ be the space of $K\times 1$ vectors. Let $A$ be a $K\times K$ matrix. We can use the matrix $A$ to define a linear operator $f:S\rightarrow S$ as follows:$$f\left( s\right) =As$$

Suppose $\lambda$ is an eigenvalue of $A$ and $U$ is the subspace of $S$ containing all the eigenvectors associated with $\lambda$ (the so-called eigenspace).

By the definition of eigenvector, we have$$f\left( u\right) =Au=\lambda u$$for any $u\in U$. Since $U$ is a subspace, $\lambda u\in U$. Therefore, the eigenspace $U$ is invariant under $f$.
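A minimal numerical sketch (hypothetical matrix and eigenvalue): a vector in the eigenspace is mapped to $\lambda$ times itself, so it remains in the eigenspace.

```python
import numpy as np

# Hypothetical illustration: u is an eigenvector of A associated with
# lambda = 3, so f(u) = A u = 3 u stays in the eigenspace of lambda.
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])
lam = 3.0
u = np.array([1.0, 0.0])

print(np.allclose(A @ u, lam * u))   # True: f(u) = lambda * u
```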

Block-triangular matrices

There is a tight link between invariant subspaces and block-triangular matrices.

In order to understand this link, we need to revise some facts about linear operators.

Let $S$ be a finite-dimensional vector space and $B=\left\{ b_{1},\ldots ,b_{K}\right\}$ a basis for $S$.

If $s\in S$ can be written as a linear combination of the basis as$$s=\alpha_{1}b_{1}+\alpha_{2}b_{2}+\cdots +\alpha_{K}b_{K}$$then its coordinate vector with respect to $B$ is$$\left[ s\right]_{B}=\begin{bmatrix}\alpha_{1}\\ \alpha_{2}\\ \vdots \\ \alpha_{K}\end{bmatrix}$$

Remember that any operator $f:S\rightarrow S$ has an associated matrix, called the matrix of the operator with respect to $B$ and denoted by $\left[ f\right]_{B}$, such that, for any $s\in S$, we have$$\left[ f\left( s\right) \right]_{B}=\left[ f\right]_{B}\left[ s\right]_{B}$$where $\left[ s\right]_{B}$ and $\left[ f\left( s\right) \right]_{B}$ are respectively the coordinate vectors of $s$ and $f\left( s\right)$ with respect to $B$.

We have previously proved that the matrix of the operator has the following structure:$$\left[ f\right]_{B}=\begin{bmatrix}\left[ f\left( b_{1}\right) \right]_{B} & \left[ f\left( b_{2}\right) \right]_{B} & \cdots & \left[ f\left( b_{K}\right) \right]_{B}\end{bmatrix}$$that is, its $k$-th column is the coordinate vector of $f\left( b_{k}\right)$ with respect to $B$.
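When $S$ is the space of $K\times 1$ vectors and $f(s)=As$, the matrix of the operator can be computed numerically: if the columns of an invertible matrix $M$ are the basis vectors, then $[f]_{B}=M^{-1}AM$ and its $k$-th column contains the coordinates of $f(b_{k})$. A minimal sketch with a hypothetical matrix and basis:

```python
import numpy as np

# Hypothetical A and basis: the k-th column of [f]_B is the coordinate
# vector, with respect to B, of the image of the k-th basis vector.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
M = np.array([[1.0, 1.0],          # columns of M are the basis vectors b_1, b_2
              [1.0, -1.0]])

fB = np.linalg.inv(M) @ A @ M      # matrix of the operator with respect to B

# check the column-by-column characterization
for k in range(M.shape[1]):
    coords = np.linalg.solve(M, A @ M[:, k])   # coordinates of f(b_k) in basis B
    print(np.allclose(fB[:, k], coords))       # True
```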

We are now ready to state the main proposition in this lecture.

Proposition Let $S$ be a finite-dimensional vector space and $f:S\rightarrow S$ a linear operator. Let $U$ be a subspace of $S$ and $B_{U}=\left\{ b_{1},\ldots ,b_{N}\right\}$ a basis for $U$. Complete $B_{U}$ so as to form a basis $B=\left\{ b_{1},\ldots ,b_{N},b_{N+1},\ldots ,b_{K}\right\}$ for $S$. The subspace $U$ is invariant under $f$ if and only if $\left[ f\right]_{B}$ has the block-triangular structure$$\left[ f\right]_{B}=\begin{bmatrix}\varphi_{11} & \varphi_{12}\\ 0 & \varphi_{22}\end{bmatrix}$$where the block $\varphi_{11}$ is $N\times N$, $\varphi_{12}$ is $N\times \left( K-N\right)$, $\varphi_{22}$ is $\left( K-N\right) \times \left( K-N\right)$ and $0$ denotes a $\left( K-N\right) \times N$ block of zeros.

Proof

We first prove the "only if part", starting from the hypothesis that $U$ is invariant. Denote by $f_{kl}$ the $left( k,l
ight) $-th entry of [eq32]. Since $U$ is invariant, then, for $kleq N$, [eq33] belongs to $U$ and, as a consequence, it can be written as a linear combination of [eq34] (and [eq35] enter with zero coefficient in the linear combination). Therefore, for $kleq N$, the coordinate vector of [eq36] is[eq37]As a consequence, when $U$ is invariant, the matrix of the operator is[eq38]We now prove the "if part", starting from the hypothesis that [eq39] has the assumed block-triangular structure. Any vector $uin U$ has a coordinate vector of the form[eq40]where $u_{1}$ is $N	imes 1$ and 0 is [eq41]. Then,[eq42]Therefore, [eq43]. Since this is true for any $uin U$, $U$ is an invariant subspace.

We can also write$$\varphi_{11}=\left[ f_{|U}\right]_{B_{U}}$$where $\left[ f_{|U}\right]_{B_{U}}$ is the matrix of the restriction of $f$ to $U$ with respect to the basis $B_{U}$.
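The proposition can be checked numerically. In the following minimal sketch (hypothetical matrix and subspace), the span of $(1,1)$ is invariant; completing its basis with $(0,1)$ and changing basis produces a block-triangular matrix.

```python
import numpy as np

# Hypothetical illustration: U = span{(1, 1)} is invariant under A because
# A (1, 1)^T = (3, 3)^T.  Completing the basis of U with (0, 1) and changing
# basis makes the lower-left block (here a single entry) equal to zero.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
b1 = np.array([1.0, 1.0])      # basis of the invariant subspace U
b2 = np.array([0.0, 1.0])      # completes b1 to a basis of R^2

V = np.column_stack([b1, b2])
fB = np.linalg.inv(V) @ A @ V

print(fB)                          # [[3. 1.]
                                   #  [0. 1.]]
print(np.isclose(fB[1, 0], 0.0))   # True: block-triangular structure
```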

Direct sums of invariant subspaces

Remember that $S$ is said to be the sum of $m$ subspaces $U_{1},\ldots ,U_{m}$, in which case we write$$S=U_{1}+U_{2}+\cdots +U_{m}$$if and only if any $s\in S$ can be written as$$s=u_{1}+u_{2}+\cdots +u_{m}$$where $u_{j}\in U_{j}$ for $j=1,\ldots ,m$.

A sum of subspaces like the one just shown is said to be a direct sum, and it is denoted by$$S=U_{1}\oplus U_{2}\oplus \cdots \oplus U_{m}$$if and only if $u_{1},\ldots ,u_{m}$ are linearly independent whenever $u_{j}\in U_{j}$ and $u_{j}\neq 0$ for $j=1,\ldots ,m$.

Direct sums of invariant subspaces have the following important property.

Proposition Let $S$ be a linear space. Let $U_{1}$ and $U_{2}$ be subspaces of $S$ such that$$S=U_{1}\oplus U_{2}$$Let $B_{1}=\left\{ b_{1},\ldots ,b_{N}\right\}$ and $B_{2}=\left\{ b_{N+1},\ldots ,b_{K}\right\}$ be bases for $U_{1}$ and $U_{2}$ respectively (as a consequence, $B=B_{1}\cup B_{2}$ is a basis for $S$). Let $f:S\rightarrow S$ be a linear operator. Then, $U_{1}$ and $U_{2}$ are both invariant under $f$ if and only if $\left[ f\right]_{B}$ has the block-diagonal structure$$\left[ f\right]_{B}=\begin{bmatrix}\varphi_{11} & 0\\ 0 & \varphi_{22}\end{bmatrix}$$where the blocks $\varphi_{11}$ and $\varphi_{22}$ are $N\times N$ and $\left( K-N\right) \times \left( K-N\right)$ respectively.

Proof

We first prove the "only if" part, starting from the hypothesis that $U_{1}$ and $U_{2}$ are both invariant under $f$ . By the properties of direct sums, any vector $sin S$ has a unique representation[eq57]where $u_{1}in U_{1}$ and $u_{2}in U_{2}$. Moreover, $u_{j}$ has a unique representation in terms of the basis $B_{j}$ (for $j=1,2$). Therefore, any $sin S$ can be written as a linear combination of the vectors of $B$. In other words, $B$ is a basis for $S$. The first $N$ columns of [eq39] are[eq59]Since $U_{1}$ is invariant under $f$, $b_{k}in U_{1}$ implies that [eq60]. Therefore, for $kleq N$, [eq61] can be written as a linear combination of the vectors of $B_{1}$ and the first $N$ columns of [eq62] are[eq63]Similarly, we can demonstrate that the remaining columns of [eq64] are[eq65]Thus,[eq66]which is a block-diagonal matrix with the structure described in the proposition. We now prove the "if" part, starting from the hypothesis that [eq67] is block diagonal. Since [eq68] is block-upper triangular, $U_{1}$ is invariant by the proposition above on block-upper triangular matrices. Moreover, any vector $uin U_{2}$ has a coordinate vector of the form[eq69]where $u_{2}$ is [eq70] and 0 is $N	imes 1$. Then,[eq71]Therefore, [eq72]. Since this is true for any $uin U_{2}$, $U_{2}$ is an invariant subspace.

More than two subspaces

The previous proposition can be extended, by applying it recursively, to the case in which$$S=U_{1}\oplus U_{2}\oplus \cdots \oplus U_{m}$$and all the subspaces $U_{j}$ are invariant.

Proposition Let $S$ be a linear space. Let $U_{1}$, $U_{2}$, ..., $U_{m}$ be subspaces of $S$, with bases $B_{1}$, ..., $B_{m}$, and such that$$S=U_{1}\oplus U_{2}\oplus \cdots \oplus U_{m}$$so that, as a consequence,$$B=B_{1}\cup B_{2}\cup \cdots \cup B_{m}$$is a basis for $S$. Let $f:S\rightarrow S$ be a linear operator. Then, all the subspaces $U_{j}$ (for $j=1,\ldots ,m$) are invariant under $f$ if and only if $\left[ f\right]_{B}$ has the block-diagonal structure$$\left[ f\right]_{B}=\begin{bmatrix}\varphi_{11} & 0 & \cdots & 0\\ 0 & \varphi_{22} & \cdots & 0\\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \varphi_{mm}\end{bmatrix}$$

Practical implications

What are the practical implications of everything that we have shown so far? In particular, what happens when we are dealing with linear operators defined by matrices? We provide some answers in this section.

Let $A$ be a $K\times K$ matrix. Let $S$ be the space of all $K\times 1$ column vectors.

We consider the linear operator $f:S\rightarrow S$ defined by the matrix $A$, that is,$$f\left( s\right) =As$$for any $s\in S$.

Suppose that we have been able to find two invariant subspaces $U_{1}$ and $U_{2}$ such that$$S=U_{1}\oplus U_{2}$$

In other words,$$S=U_{1}+U_{2}$$and $u_{1}$ is linearly independent of $u_{2}$ whenever $u_{1}\in U_{1}$, $u_{2}\in U_{2}$ and the two vectors are non-zero.

We can choose bases $B_{1}=\left\{ b_{1},\ldots ,b_{N}\right\}$ and $B_{2}=\left\{ b_{N+1},\ldots ,b_{K}\right\}$ for $U_{1}$ and $U_{2}$ respectively, and we know that $B=B_{1}\cup B_{2}$ is a basis for $S$.

Define the following matrices by adjoining the vectors of the bases:$$V_{1}=\begin{bmatrix}b_{1} & \cdots & b_{N}\end{bmatrix},\quad V_{2}=\begin{bmatrix}b_{N+1} & \cdots & b_{K}\end{bmatrix},\quad V=\begin{bmatrix}V_{1} & V_{2}\end{bmatrix}$$

Note that$$AV_{1}=\begin{bmatrix}Ab_{1} & \cdots & Ab_{N}\end{bmatrix}=\begin{bmatrix}V_{1}h_{1} & \cdots & V_{1}h_{N}\end{bmatrix}$$where $h_{1},\ldots ,h_{N}$ are $N\times 1$ vectors that are guaranteed to exist because $Ab_{j}\in U_{1}$ for $j\leq N$ by the invariance of $U_{1}$, which means that $Ab_{j}$ can be written as a linear combination of the basis of $U_{1}$ (i.e., $Ab_{j}=V_{1}h_{j}$). In order to match the notation used in the propositions above, we define$$\varphi_{11}=\begin{bmatrix}h_{1} & \cdots & h_{N}\end{bmatrix}$$so that$$AV_{1}=V_{1}\varphi_{11}$$

Similarly, we can find a matrix $\varphi_{22}$ such that$$AV_{2}=V_{2}\varphi_{22}$$

As a consequence, we have$$AV=\begin{bmatrix}AV_{1} & AV_{2}\end{bmatrix}=\begin{bmatrix}V_{1}\varphi_{11} & V_{2}\varphi_{22}\end{bmatrix}=\begin{bmatrix}V_{1} & V_{2}\end{bmatrix}\begin{bmatrix}\varphi_{11} & 0\\ 0 & \varphi_{22}\end{bmatrix}=V\begin{bmatrix}\varphi_{11} & 0\\ 0 & \varphi_{22}\end{bmatrix}$$or$$V^{-1}AV=\begin{bmatrix}\varphi_{11} & 0\\ 0 & \varphi_{22}\end{bmatrix}$$where $V$ is invertible because its columns are vectors of a basis, which are by definition linearly independent.

Recall the definition of matrix similarity. The last equation means that $A$ is similar to the block-diagonal matrix$$\varphi =\begin{bmatrix}\varphi_{11} & 0\\ 0 & \varphi_{22}\end{bmatrix}$$and that the change-of-basis matrix used in the similarity transformation is $V$.
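The following minimal sketch (with a hypothetical $3\times 3$ matrix and hypothetical invariant subspaces) illustrates the whole construction: the bases of two complementary invariant subspaces are adjoined into $V$, and the similarity transformation $V^{-1}AV$ produces a block-diagonal matrix.

```python
import numpy as np

# Hypothetical illustration: U1 = span{(1,1,0), (1,0,0)} (the x-y plane) and
# U2 = span{(0,0,1)} are both invariant under A, and S = U1 (+) U2.
# Adjoining their bases gives V, and V^{-1} A V is block-diagonal with a
# 2x2 block and a 1x1 block.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])

V1 = np.column_stack([[1.0, 1.0, 0.0], [1.0, 0.0, 0.0]])  # basis of U1
V2 = np.array([[0.0], [0.0], [1.0]])                       # basis of U2
V = np.hstack([V1, V2])                                    # change-of-basis matrix

phi = np.linalg.inv(V) @ A @ V
print(np.round(phi, 10))
# block-diagonal (up to rounding): [[3, 1, 0], [0, 1, 0], [0, 0, 5]]
```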

Summary of the workflow

Thus, the process of similarity transformation of a matrix $A$ into a block-diagonal matrix $\varphi$ (generalized here to the case of more than two invariant subspaces) works as follows:

1. find $m$ invariant subspaces $U_{1},\ldots ,U_{m}$ such that $S=U_{1}\oplus U_{2}\oplus \cdots \oplus U_{m}$;

2. choose a basis $B_{j}$ for each invariant subspace $U_{j}$;

3. adjoin the vectors of each basis to form the matrices $V_{1},\ldots ,V_{m}$, and adjoin these matrices to form the change-of-basis matrix $V=\begin{bmatrix}V_{1} & \cdots & V_{m}\end{bmatrix}$;

4. compute the block-diagonal matrix $\varphi =V^{-1}AV$.

This is one of the most important workflows in linear algebra! We encourage the reader to understand it thoroughly and to memorize it.

Eigenvalues and eigenvectors again

We have explained above that the eigenspace associated with an eigenvalue of $A$ is an invariant subspace.

Denote by $\lambda_{1},\ldots ,\lambda_{m}$ the distinct eigenvalues of $A$ and by $U_{1},\ldots ,U_{m}$ their respective eigenspaces.

As explained in the lecture on the linear independence of eigenvectors, when $A$ is not defective, we can form a matrix$$V=\begin{bmatrix}V_{1} & V_{2} & \cdots & V_{m}\end{bmatrix}$$where the columns of $V_{j}$ are a basis for $U_{j}$ and all the columns of $V$ together are a basis for the space $S$ of all $K\times 1$ column vectors. As a consequence,$$S=U_{1}\oplus U_{2}\oplus \cdots \oplus U_{m}$$

Thus, we can use the matrix of eigenvectors $V$ to perform a similarity transformation and obtain the block-diagonal matrix$$\varphi =V^{-1}AV$$

Actually, in the lecture on matrix diagonalization, we have proved that $\varphi$ is a diagonal matrix having the eigenvalues of $A$ on its main diagonal.
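A minimal numerical sketch (hypothetical non-defective matrix), using NumPy's eigendecomposition to build the matrix of eigenvectors $V$ and recover the diagonal matrix of eigenvalues:

```python
import numpy as np

# Hypothetical illustration: the columns of V returned by np.linalg.eig are
# eigenvectors (bases of the eigenspaces), so V^{-1} A V is diagonal with
# the eigenvalues of A on its main diagonal.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, V = np.linalg.eig(A)
phi = np.linalg.inv(V) @ A @ V

print(np.round(phi, 10))                        # diagonal matrix
print(np.allclose(np.diag(phi), eigenvalues))   # True
```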

Solved exercises

Below you can find some exercises with explained solutions.

Exercise 1

Define the matrix [eq102]

Verify that [eq103] is an invariant subspace under the linear transformation defined by $A$.

Solution

Any vector $u\in U$ takes the form[eq104]where $\alpha$ is a scalar. Then,[eq105]As a consequence, $Au\in U$ and $U$ is an invariant subspace.

Exercise 2

Let $S$ be the space of $3\times 1$ column vectors. Define the matrix [eq106]

By simply inspecting $A$, can you find two subspaces $U_{1}$ and $U_{2}$ such that$$S=U_{1}\oplus U_{2}$$and $U_{1}$ and $U_{2}$ are invariant under the linear transformation defined by $A$?

Solution

Note that $A$ is a block-diagonal matrix:[eq108]where[eq109]Therefore, two complementary invariant subspaces are[eq110]

How to cite

Please cite as:

Taboga, Marco (2021). "Invariant subspace", Lectures on matrix algebra. https://www.statlect.com/matrix-algebra/invariant-subspace.
