
Similar matrix

by Marco Taboga, PhD

Two square matrices are said to be similar if they represent the same linear operator under different bases. Two similar matrices have the same rank, trace, determinant and eigenvalues.

Definition

We start with a definition.

Definition A $K\times K$ matrix $A$ is said to be similar to another $K\times K$ matrix $B$ if and only if there exists an invertible $K\times K$ matrix $P$ such that $$B=P^{-1}AP$$

The transformation of $A$ into $P^{-1}AP$ is called a similarity transformation.

The matrix $P$ is called the change-of-basis matrix.
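As a numerical illustration (a sketch using NumPy; the matrices $A$ and $P$ are made up for the example), we can form $B=P^{-1}AP$ explicitly:

```python
import numpy as np

# Hypothetical example: A has eigenvalues 2 and 3, and the columns of P
# are corresponding eigenvectors, so the similarity transformation
# B = P^{-1} A P produces a diagonal matrix.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])  # invertible: det(P) = 1

B = np.linalg.inv(P) @ A @ P  # the similarity transformation
print(B)
```

Here $B$ turns out to be the diagonal matrix with the eigenvalues of $A$ on its main diagonal, a special case (diagonalization) of the similarity transformations discussed below.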

Relation to change of basis

In order to understand the relation between similar matrices and changes of basis, let us review the main facts from the lecture on the change of basis.

Let $S$ be a finite-dimensional vector space and $B=\{b_{1},\ldots,b_{K}\}$ a basis for $S$.

Any vector $s\in S$ can be represented as a linear combination of the basis $$s=\alpha_{1}b_{1}+\ldots+\alpha_{K}b_{K}$$ where $\alpha_{1},\ldots,\alpha_{K}$ are scalar coefficients.

The coefficients of the linear combination form the so-called coordinate vector of $s$ with respect to $B$, denoted by $[s]_{B}$: $$[s]_{B}=\begin{bmatrix}\alpha_{1}\\ \vdots\\ \alpha_{K}\end{bmatrix}$$

If we use a different basis $C=\{c_{1},\ldots,c_{K}\}$, then the coordinates of any vector $s$ with respect to $C$ satisfy $$[s]_{B}=S_{C\rightarrow B}[s]_{C}$$ where the $K\times K$ matrix $S_{C\rightarrow B}$ is called the change-of-basis matrix and allows us to convert coordinates with respect to $C$ into coordinates with respect to $B$.

Remember that a linear operator $f:S\rightarrow S$ can always be represented by a $K\times K$ matrix $[f]_{B}$ such that, for any $s\in S$, $$[f(s)]_{B}=[f]_{B}[s]_{B}$$

In other words, if we pre-multiply the coordinates of $s$ with respect to $B$ by $[f]_{B}$, we get the coordinates of $f(s)$ as a result.

We have shown that the matrices of the linear operator under different bases are related to each other by the change-of-basis formula $$[f]_{C}=S_{C\rightarrow B}^{-1}[f]_{B}S_{C\rightarrow B}$$

Thus, $[f]_{C}$ is similar to $[f]_{B}$. This result explains the characterization of similarity we have given in the introduction above: two similar matrices represent the same linear operator under different bases.

Equivalence relation

Similarity defines an equivalence relation between square matrices.

Proposition Matrix similarity is an equivalence relation, that is, given three $K\times K$ matrices $A$, $B$ and $C$, the following properties hold:

  1. Reflexivity: $A$ is similar to itself;

  2. Symmetry: if $A$ is similar to $B$, then $B$ is similar to $A$;

  3. Transitivity: if $A$ is similar to $B$ and $B$ is similar to $C$, then $A$ is similar to $C$.

Proof

Similarity is reflexive because $$A=I^{-1}AI$$ where the identity matrix $I$ is the change-of-basis matrix. Symmetry holds because the equation $$B=P^{-1}AP$$ implies $$A=PBP^{-1}=\left(P^{-1}\right)^{-1}BP^{-1}$$ where $P^{-1}$ is the change-of-basis matrix. Transitivity holds because $$B=P_{1}^{-1}AP_{1}\quad\text{and}\quad C=P_{2}^{-1}BP_{2}$$ imply $$C=P_{2}^{-1}P_{1}^{-1}AP_{1}P_{2}=\left(P_{1}P_{2}\right)^{-1}A\left(P_{1}P_{2}\right)$$ where $P_{1}P_{2}$ is the change-of-basis matrix.
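The symmetry and transitivity arguments can be checked numerically (a sketch with randomly generated matrices, which are invertible with probability one; the seed and dimensions are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
P1 = rng.standard_normal((3, 3))  # generic random matrices are invertible
P2 = rng.standard_normal((3, 3))

B = np.linalg.inv(P1) @ A @ P1  # A is similar to B via P1
C = np.linalg.inv(P2) @ B @ P2  # B is similar to C via P2

# Symmetry: recover A from B using P1^{-1} as the change-of-basis matrix
Q = np.linalg.inv(P1)
A_back = np.linalg.inv(Q) @ B @ Q

# Transitivity: C is similar to A directly via the product P1 P2
P12 = P1 @ P2
C_direct = np.linalg.inv(P12) @ A @ P12
```

Both checks succeed up to floating-point tolerance, mirroring the algebraic proof.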

Same rank

The next proposition shows a first important property of similarity.

Proposition If two matrices are similar, then they have the same rank.

Proof

Let $A$ and $B$ be similar, so that $B=P^{-1}AP$, with $P$ invertible (hence full-rank). As proved in the lecture on matrix product and rank, $$\operatorname{rank}\left(AP\right)=\operatorname{rank}\left(A\right)$$ because $P$ is full-rank and $$\operatorname{rank}\left(B\right)=\operatorname{rank}\left(P^{-1}\left(AP\right)\right)=\operatorname{rank}\left(AP\right)$$ because $P^{-1}$ is full-rank. Therefore, $A$ and $B$ have the same rank.
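A quick numerical check of rank preservation (NumPy sketch with a deliberately rank-deficient matrix $A$ chosen for the example):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])  # rank 1: the second row is twice the first
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])  # invertible change-of-basis matrix
B = np.linalg.inv(P) @ A @ P

# similarity preserves rank
rank_A = np.linalg.matrix_rank(A)
rank_B = np.linalg.matrix_rank(B)
```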

Same trace

Trace is preserved by similarity transformations.

Proposition If two matrices are similar, then they have the same trace.

Proof

Let $B=P^{-1}AP$. Then, by an elementary property of the trace ($\operatorname{tr}\left(MN\right)=\operatorname{tr}\left(NM\right)$), we have that $$\operatorname{tr}\left(B\right)=\operatorname{tr}\left(P^{-1}AP\right)=\operatorname{tr}\left(APP^{-1}\right)=\operatorname{tr}\left(A\right)$$
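Trace preservation can also be verified numerically (a sketch with arbitrary random matrices):

```python
import numpy as np

rng = np.random.default_rng(42)
A = rng.standard_normal((4, 4))
P = rng.standard_normal((4, 4))  # generic, hence invertible
B = np.linalg.inv(P) @ A @ P

# tr(P^{-1} A P) = tr(A P P^{-1}) = tr(A) by the cyclic property
trace_A = np.trace(A)
trace_B = np.trace(B)
```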

Same determinant

The next property concerns the determinant.

Proposition If two matrices are similar, then they have the same determinant.

Proof

Let $B=P^{-1}AP$. We have $$\det\left(B\right)=\det\left(P^{-1}AP\right)=\det\left(P^{-1}\right)\det\left(A\right)\det\left(P\right)=\det\left(P\right)^{-1}\det\left(A\right)\det\left(P\right)=\det\left(A\right)$$ where we have used two properties of the determinant: 1) the determinant of a product of two or more matrices is equal to the product of their determinants; 2) $\det\left(P^{-1}\right)=\det\left(P\right)^{-1}$.
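As with the trace, the determinant identity is easy to confirm numerically (random-matrix sketch):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
P = rng.standard_normal((3, 3))  # generic, hence invertible
B = np.linalg.inv(P) @ A @ P

# det(B) = det(P^{-1}) det(A) det(P) = det(A)
det_A = np.linalg.det(A)
det_B = np.linalg.det(B)
```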

Same eigenvalues

This is probably the most important property, as well as the reason why similarity transformations are so important in the theory of eigenvalues and eigenvectors.

Proposition If two matrices are similar, then they have the same eigenvalues, with the same algebraic and geometric multiplicities.

Proof

Let $A$ and $B$ be similar, so that $B=P^{-1}AP$. Any eigenvalue $\lambda$ of $A$ solves the characteristic equation $$\det\left(A-\lambda I\right)=0$$ while the eigenvalues of $B$ solve the equation $$\det\left(B-\lambda I\right)=\det\left(P^{-1}AP-\lambda I\right)=\det\left(P^{-1}\left(A-\lambda I\right)P\right)=\det\left(P^{-1}\right)\det\left(A-\lambda I\right)\det\left(P\right)=\det\left(A-\lambda I\right)=0$$ where we have used two properties of the determinant: 1) the determinant of a product of two or more matrices is equal to the product of their determinants; 2) $\det\left(P^{-1}\right)=\det\left(P\right)^{-1}$. Thus, $\lambda$ solves the characteristic equation of $A$ if and only if it solves the characteristic equation of $B$. Stated differently, $\lambda$ is an eigenvalue of $A$ if and only if it is an eigenvalue of $B$. Moreover, since $A$ and $B$ have the same characteristic equation, their eigenvalues have the same algebraic multiplicities. We still need to prove that the eigenvalues of $A$ and $B$ have the same geometric multiplicities. Note that

$$PB=PP^{-1}AP=AP \tag{1}$$ For a given eigenvalue $\lambda$, choose an eigenvector of $B$ associated to $\lambda$ and denote it by $x$. Then, $$Bx=\lambda x$$ If we post-multiply equation (1) by $x$, we get $$PBx=APx$$ or $$A\left(Px\right)=\lambda\left(Px\right)$$ In other words, $Px$ is an eigenvector of $A$ associated to $\lambda$ if and only if $x$ is an eigenvector of $B$ associated to $\lambda$. Suppose that $\lambda$, as an eigenvalue of $B$, has geometric multiplicity equal to $L$. Choose a basis $z_{1},\ldots,z_{L}$ for the eigenspace of $B$ associated to $\lambda$ (i.e., any eigenvector of $B$ associated to $\lambda$ can be written as a linear combination of $z_{1},\ldots,z_{L}$). Let $Z$ be the $K\times L$ matrix obtained by adjoining the vectors of the basis: $$Z=\begin{bmatrix}z_{1}&\cdots&z_{L}\end{bmatrix}$$ Thus, the eigenvectors of $B$ associated to $\lambda$ satisfy the equation $$x=Zh$$ where $h$ is the $L\times 1$ vector of coefficients of the linear combination. If we pre-multiply both sides of the equation by $P$, we get $$Px=PZh$$ Thus, the eigenvectors $Px$ of $A$ associated to $\lambda$ are all the vectors that can be written as linear combinations of the columns of $PZ$. But $PZ$ has the same rank as $Z$ because $P$ is full-rank. Therefore, it has rank $L$. So the geometric multiplicity of $\lambda$, as an eigenvalue of $A$, is $L$, the same as it has as an eigenvalue of $B$.

Note from the previous proof that if $$B=P^{-1}AP$$ then $\lambda$ is an eigenvalue of $A$ if and only if it is an eigenvalue of $B$, while the eigenvectors differ: $x$ is an eigenvector of $B$ associated to $\lambda$ if and only if $Px$ is an eigenvector of $A$ associated to $\lambda$.
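Both facts, equal spectra and the eigenvector mapping $x\mapsto Px$, can be illustrated numerically (a sketch reusing the small example from the definition section):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])  # invertible
B = np.linalg.inv(P) @ A @ P

# similar matrices share the same spectrum
eig_A = np.sort(np.linalg.eigvals(A))
eig_B = np.sort(np.linalg.eigvals(B))

# if x is an eigenvector of B for lambda, then P x is an eigenvector of A
lam, X = np.linalg.eig(B)
x = X[:, 0]          # eigenvector of B for eigenvalue lam[0]
lhs = A @ (P @ x)    # A (P x)
rhs = lam[0] * (P @ x)  # lambda (P x)
```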

Unitarily similar

In linear algebra we often use the term "unitarily similar".

Definition Two $K\times K$ matrices $A$ and $B$ are said to be unitarily similar if and only if there exists a $K\times K$ unitary matrix $P$ such that $$B=P^{-1}AP$$

Thus, two matrices are unitarily similar if they are similar and their change-of-basis matrix is unitary.

Since the inverse of a unitary matrix $P$ is equal to its conjugate transpose $P^{\ast}$, the similarity transformation can be written as $$B=P^{\ast}AP$$

When all the entries of the unitary matrix $P$ are real, then the matrix is orthogonal, so that $P^{-1}=P^{\top}$, and the similarity transformation becomes $$B=P^{\top}AP$$
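In the real case, a rotation matrix is a convenient example of an orthogonal change-of-basis matrix (a sketch; the angle and the matrix $A$ are arbitrary):

```python
import numpy as np

# A real unitary (i.e., orthogonal) change-of-basis matrix: a 2D rotation
theta = np.pi / 4
P = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# for an orthogonal P, the inverse is just the transpose
B = P.T @ A @ P
```

Orthogonal similarity avoids explicit matrix inversion, which is one reason unitary transformations are preferred in numerical algorithms.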

Similar matrix powers

The following proposition illustrates a simple but very useful property of similarity.

Proposition If two matrices $A$ and $B$ are similar, then their $n$-th powers $A^{n}$ and $B^{n}$ are similar.

Proof

Let $B=P^{-1}AP$. We have $$B^{n}=\left(P^{-1}AP\right)^{n}=\underset{n\text{ times}}{\underbrace{P^{-1}AP\,P^{-1}AP\cdots P^{-1}AP}}=P^{-1}A^{n}P$$

The proof also shows that the change-of-basis matrix employed in the similarity transformation of $A$ into $B$ is the same as the one used in the similarity transformation of $A^{n}$ into $B^{n}$.
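The telescoping cancellation of the inner $PP^{-1}$ factors can be verified directly (NumPy sketch):

```python
import numpy as np
from numpy.linalg import inv, matrix_power

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])  # invertible
B = inv(P) @ A @ P

# the inner P P^{-1} factors cancel, so B^n = P^{-1} A^n P
n = 5
Bn = matrix_power(B, n)
Bn_direct = inv(P) @ matrix_power(A, n) @ P
```

This identity is what makes similarity (and diagonalization in particular) so useful for computing high matrix powers cheaply.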

How to cite

Please cite as:

Taboga, Marco (2017). "Similar matrix", Lectures on matrix algebra. https://www.statlect.com/matrix-algebra/similar-matrix.
