
Properties of eigenvalues and eigenvectors

by Marco Taboga, PhD

This lecture discusses some of the properties of the eigenvalues and eigenvectors of a square matrix.


Left eigenvectors

The first property concerns the eigenvalues of the transpose of a matrix.

Proposition Let A be a $K\times K$ square matrix. A scalar $\lambda$ is an eigenvalue of A if and only if it is an eigenvalue of $A^{\intercal}$.

Proof

Remember that a scalar $\lambda$ is an eigenvalue of A if and only if it solves the characteristic equation
$$\det(\lambda I - A) = 0$$
where $\det$ denotes the determinant. We know that transposition does not change the determinant. Thus,
$$\det(\lambda I - A^{\intercal}) = \det\left((\lambda I - A)^{\intercal}\right) = \det(\lambda I - A)$$
Therefore, $\lambda$ is an eigenvalue of A if and only if
$$\det(\lambda I - A^{\intercal}) = 0$$
which is verified if and only if $\lambda$ is also an eigenvalue of $A^{\intercal}$.

Even though A and $A^{\intercal}$ have the same eigenvalues, they do not necessarily have the same eigenvectors.

If $y$ is an eigenvector of the transpose, it satisfies
$$A^{\intercal} y = \lambda y$$

By transposing both sides of the equation, we get
$$y^{\intercal} A = \lambda y^{\intercal}$$

The row vector $y^{\intercal}$ is called a left eigenvector of A.
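A quick numerical check of this property (a sketch using NumPy; the matrix below is an arbitrary illustrative example, not taken from the lecture):

```python
import numpy as np

# Arbitrary non-symmetric example matrix (illustrative only).
A = np.array([[2.0, 1.0],
              [0.5, 3.0]])

# A and its transpose have the same eigenvalues...
eig_A = np.sort(np.linalg.eigvals(A))
eig_At = np.sort(np.linalg.eigvals(A.T))
print(np.allclose(eig_A, eig_At))  # True

# ...but generally different eigenvectors. An eigenvector y of A.T
# yields a left eigenvector of A: y^T A = lambda * y^T.
lam, Y = np.linalg.eig(A.T)
y = Y[:, 0]
print(np.allclose(y @ A, lam[0] * y))  # True
```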

Eigenvalues of a triangular matrix

The diagonal elements of a triangular matrix are equal to its eigenvalues.

Proposition Let A be a $K\times K$ triangular matrix. Then, each of the diagonal entries of A is an eigenvalue of A.

Proof

Let $\lambda$ be a scalar. Then
$$\lambda I - A$$
is triangular because adding a scalar multiple of the identity matrix to $-A$ only affects the diagonal entries of $-A$. In particular, if $A_{kk}$ is a diagonal entry of A, then $\lambda - A_{kk}$ is a diagonal entry of $\lambda I - A$. Since the determinant of a triangular matrix is equal to the product of its diagonal entries, we have
$$\det(\lambda I - A) = \prod_{k=1}^{K}\left(\lambda - A_{kk}\right)$$
Since the eigenvalues of A satisfy the characteristic equation
$$\det(\lambda I - A) = 0$$
we have that $\lambda$ is an eigenvalue of A if one of the factors $\left(\lambda - A_{kk}\right)$ of the above product is equal to zero, that is, if $\lambda = A_{kk}$ for some k.
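The proposition can be verified numerically (a sketch with an arbitrary upper triangular example matrix):

```python
import numpy as np

# An upper triangular example matrix (illustrative values).
T = np.array([[4.0, 2.0, 7.0],
              [0.0, -1.0, 3.0],
              [0.0, 0.0, 5.0]])

# The eigenvalues are exactly the diagonal entries.
print(np.sort(np.linalg.eigvals(T)))  # [-1.  4.  5.]
print(np.sort(np.diag(T)))            # [-1.  4.  5.]
```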

Zero eigenvalues and invertibility

Eigenvalues allow us to tell whether a matrix is invertible.

Proposition Let A be a $K\times K$ matrix. Then A is invertible if and only if it has no zero eigenvalues.

Proof

We know that $\lambda$ is an eigenvalue of A if and only if it satisfies the characteristic equation
$$\det(\lambda I - A) = 0$$
Therefore, 0 is not an eigenvalue of A if and only if
$$\det(-A) = (-1)^{K}\det(A) \neq 0$$
which happens if and only if A is invertible (see the section on the determinant of a singular matrix).
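As an illustration (a sketch with an arbitrary singular example matrix whose second row is twice the first):

```python
import numpy as np

# A singular example matrix: the second row is twice the first.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

eigs = np.linalg.eigvals(A)
print(np.isclose(eigs, 0).any())        # True: 0 is an eigenvalue
print(np.isclose(np.linalg.det(A), 0))  # True: A is not invertible
```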

Eigenvalues and eigenvectors of the inverse matrix

The eigenvalues of the inverse are easy to compute.

Proposition Let A be a $K\times K$ invertible matrix. Then $\lambda$ is an eigenvalue of A corresponding to an eigenvector x if and only if $\lambda^{-1}$ is an eigenvalue of $A^{-1}$ corresponding to the same eigenvector x.

Proof

A scalar $\lambda$ is an eigenvalue of A corresponding to an eigenvector x if and only if
$$Ax = \lambda x$$
Since A is invertible, $\lambda \neq 0$ and we can multiply both sides of the equation by $\lambda^{-1}A^{-1}$, so as to obtain
$$\lambda^{-1}A^{-1}Ax = \lambda^{-1}A^{-1}\lambda x$$
or
$$A^{-1}x = \lambda^{-1}x$$
which is true if and only if $\lambda^{-1}$ is an eigenvalue of $A^{-1}$ associated with the eigenvector x.
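A numerical sketch of this fact, using an arbitrary invertible example matrix:

```python
import numpy as np

# An invertible example matrix (illustrative values).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

lam, X = np.linalg.eig(A)
Ainv = np.linalg.inv(A)

# Each eigenpair (lam, x) of A gives the eigenpair (1/lam, x) of the inverse.
for k in range(2):
    x = X[:, k]
    print(np.allclose(Ainv @ x, x / lam[k]))  # True
```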

Conjugate pairs

An interesting fact is that complex eigenvalues of real matrices always come in conjugate pairs.

Proposition Let A be a $K\times K$ matrix having real entries. A complex number $\lambda$ is an eigenvalue of A corresponding to the eigenvector x if and only if its complex conjugate $\overline{\lambda}$ is an eigenvalue corresponding to the conjugate vector $\overline{x}$.

Proof

A scalar $\lambda$ is an eigenvalue of A corresponding to an eigenvector x if and only if
$$Ax = \lambda x$$
By taking the complex conjugate of both sides of the equation, we obtain
$$\overline{A}\,\overline{x} = \overline{\lambda}\,\overline{x}$$
Since A is real, it is equal to its complex conjugate. Therefore,
$$A\overline{x} = \overline{\lambda}\,\overline{x}$$
that is, $\overline{\lambda}$ is an eigenvalue of A corresponding to the eigenvector $\overline{x}$.
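This can be checked numerically (a sketch using a rotation-like real matrix, an arbitrary example whose eigenvalues are the conjugate pair $\pm i$):

```python
import numpy as np

# A real matrix with complex eigenvalues (90-degree rotation).
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

lam, X = np.linalg.eig(A)

# The spectrum is closed under conjugation: {i, -i}.
print(np.allclose(np.sort_complex(lam), np.sort_complex(np.conj(lam))))  # True

# The eigenvector for conj(lambda) is the conjugate of the eigenvector.
x = X[:, 0]
print(np.allclose(A @ np.conj(x), np.conj(lam[0]) * np.conj(x)))  # True
```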

Scalar multiples

If we multiply a matrix by a scalar, then all its eigenvalues are multiplied by the same scalar.

Proposition Let A be a $K\times K$ matrix and $\alpha \neq 0$ a scalar. If $\lambda$ is an eigenvalue of A corresponding to the eigenvector x, then $\alpha\lambda$ is an eigenvalue of $\alpha A$ corresponding to the same eigenvector x.

Proof

A scalar $\lambda$ is an eigenvalue of A corresponding to an eigenvector x if and only if
$$Ax = \lambda x$$
If we multiply both sides of the equation by the scalar $\alpha$, we get
$$\alpha Ax = \alpha\lambda x$$
which is true if and only if $\alpha\lambda$ is an eigenvalue of $\alpha A$ corresponding to the eigenvector x.
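A numerical sketch, with an arbitrary example matrix and scalar:

```python
import numpy as np

# Illustrative example matrix and scalar (arbitrary values).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
alpha = -2.5

lam, X = np.linalg.eig(A)
x = X[:, 0]

# If A x = lam x, then (alpha A) x = (alpha lam) x.
print(np.allclose((alpha * A) @ x, alpha * lam[0] * x))  # True
```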

Matrix powers

Let n be a natural number. The n-th power of a square matrix A is
$$A^{n} = \underbrace{A \cdot A \cdot \ldots \cdot A}_{n \text{ times}}$$

In other words, the n-th power is obtained by performing n matrix multiplications of A by itself.

It is easy to derive the eigenvalues of $A^{n}$ from those of A.

Proposition Let A be a $K\times K$ matrix. If $\lambda$ is an eigenvalue of A corresponding to the eigenvector x, then $\lambda^{n}$ is an eigenvalue of $A^{n}$ corresponding to the same eigenvector x.

Proof

A scalar $\lambda$ is an eigenvalue of A corresponding to an eigenvector x if and only if
$$Ax = \lambda x$$
If we pre-multiply both sides of the equation by A, we get
$$A^{2}x = \lambda Ax = \lambda^{2}x$$
If we again pre-multiply both sides by A, we obtain
$$A^{3}x = \lambda^{2}Ax = \lambda^{3}x$$
We can proceed in this manner until we get
$$A^{n}x = \lambda^{n}x$$
which is true if and only if $\lambda^{n}$ is an eigenvalue of $A^{n}$ corresponding to the eigenvector x.
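A numerical sketch of the power property, with an arbitrary example matrix and exponent:

```python
import numpy as np

# Illustrative example matrix and exponent (arbitrary values).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
n = 4

lam, X = np.linalg.eig(A)
x = X[:, 0]

# A^n x = lam^n x
An = np.linalg.matrix_power(A, n)
print(np.allclose(An @ x, lam[0] ** n * x))  # True
```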

All the eigenvalues of a Hermitian matrix are real

Remember that a matrix A is said to be Hermitian if and only if it equals its conjugate transpose:
$$A = A^{\ast}$$

Hermitian matrices have the following nice property.

Proposition Let A be a $K\times K$ matrix. If A is Hermitian, then all its eigenvalues are real (i.e., their imaginary parts are zero).

Proof

Arbitrarily choose an eigenvalue $\lambda$ and one of its associated eigenvectors x. By the definition of eigenvector, $x \neq 0$. Note that
$$x^{\ast}Ax = \lambda x^{\ast}x = \lambda \left\Vert x\right\Vert^{2}$$
where $\left\Vert x\right\Vert$ denotes the norm of x. If we take the conjugate transpose of both sides of the equation just derived, we obtain
$$x^{\ast}A^{\ast}x = \overline{\lambda}\left\Vert x\right\Vert^{2}$$
where we have used the fact that the norm is a real number and, as a consequence, complex conjugation leaves it unaffected. Moreover, we can replace $A^{\ast}$ in the last equation with A because A is Hermitian. Thus, we have
$$x^{\ast}Ax = \overline{\lambda}\left\Vert x\right\Vert^{2}$$
and
$$\lambda\left\Vert x\right\Vert^{2} = \overline{\lambda}\left\Vert x\right\Vert^{2}$$
Since $x \neq 0$, this implies $\lambda = \overline{\lambda}$. But $\lambda = \overline{\lambda}$ implies that $\lambda$ has zero imaginary part.
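A numerical sketch, using an arbitrary Hermitian example matrix with complex off-diagonal entries:

```python
import numpy as np

# A Hermitian example matrix: equal to its conjugate transpose.
A = np.array([[2.0, 1.0 - 1.0j],
              [1.0 + 1.0j, 3.0]])
print(np.allclose(A, A.conj().T))  # True: A is Hermitian

lam = np.linalg.eigvals(A)
# All eigenvalues are real (imaginary parts are numerically zero).
print(np.allclose(lam.imag, 0.0))  # True
```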

All the eigenvalues of a symmetric real matrix are real

If a real matrix A is symmetric (i.e., $A = A^{\intercal}$), then it is also Hermitian (i.e., $A = A^{\ast}$) because complex conjugation leaves real numbers unaffected. Therefore, by the previous proposition, all the eigenvalues of a real symmetric matrix are real.

The trace is equal to the sum of eigenvalues

Remember that the trace of a matrix is the sum of its diagonal entries.

Proposition Let A be a $K\times K$ matrix and $\lambda_{1},\ldots,\lambda_{K}$ its eigenvalues. Then,
$$\mathrm{tr}(A) = \sum_{k=1}^{K}\lambda_{k}$$

Proof

To make this proof as simple as possible, we use the concepts of similarity and Schur decomposition, which we have not yet introduced. You might want to skip this proof now and read it after studying these two concepts. By the Schur decomposition, A is unitarily similar to an upper triangular matrix $T$. When two matrices are similar, they have the same trace and the same eigenvalues. Moreover, because $T$ is triangular, its diagonal entries are its eigenvalues. Therefore,
$$\mathrm{tr}(A) = \mathrm{tr}(T) = \sum_{k=1}^{K}\lambda_{k}$$
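A numerical sketch of the trace identity, with an arbitrary example matrix:

```python
import numpy as np

# Illustrative example matrix (arbitrary values).
A = np.array([[1.0, 2.0, 0.0],
              [0.5, 3.0, 1.0],
              [0.0, 1.0, -2.0]])

lam = np.linalg.eigvals(A)
# trace(A) equals the sum of the eigenvalues.
print(np.isclose(np.trace(A), lam.sum()))  # True
```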

The determinant is equal to the product of eigenvalues

The next important result links the determinant of a matrix to its eigenvalues.

Proposition Let A be a $K\times K$ matrix and $\lambda_{1},\ldots,\lambda_{K}$ its eigenvalues. Then,
$$\det(A) = \prod_{k=1}^{K}\lambda_{k}$$

Proof

As in the previous proof, we use the concepts of similarity and Schur decomposition. By the Schur decomposition, A is unitarily similar to an upper triangular matrix $T$. Two similar matrices have the same determinant and the same eigenvalues. Moreover, because $T$ is triangular, its diagonal entries are its eigenvalues and its determinant is equal to the product of its diagonal entries. Therefore,
$$\det(A) = \det(T) = \prod_{k=1}^{K}\lambda_{k}$$
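A numerical sketch of the determinant identity, with an arbitrary example matrix:

```python
import numpy as np

# Illustrative example matrix (arbitrary values).
A = np.array([[1.0, 2.0, 0.0],
              [0.5, 3.0, 1.0],
              [0.0, 1.0, -2.0]])

lam = np.linalg.eigvals(A)
# det(A) equals the product of the eigenvalues.
print(np.isclose(np.linalg.det(A), np.prod(lam)))  # True
```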

Solved exercises

Below you can find some exercises with explained solutions.

Exercise 1

Define[eq42]

Find the eigenvalues of [eq43]

Solution

Since A is triangular, its eigenvalues are equal to its diagonal entries. Therefore, the eigenvalues of A are[eq44]Transposition does not change the eigenvalues and multiplication by $2$ doubles them. Thus, the eigenvalues of [eq45] are[eq46]Those of the inverse [eq47] are [eq48]and those of [eq49] are[eq50]

How to cite

Please cite as:

Taboga, Marco (2017). "Properties of eigenvalues and eigenvectors", Lectures on matrix algebra. https://www.statlect.com/matrix-algebra/properties-of-eigenvalues-and-eigenvectors.
