
Inverse of a matrix

by Marco Taboga, PhD

The concept of the inverse of a matrix is a multidimensional generalization of the concept of the reciprocal of a number.


Definition

Let us start with a definition of inverse.

Definition Let A be a $K\times K$ matrix. Its inverse, if it exists, is the $K\times K$ matrix $A^{-1}$ that satisfies $$AA^{-1}=I$$ where I is the $K\times K$ identity matrix. If $A^{-1}$ exists, then we say that A is invertible.

When $K=1$, then $I=1$ and $$AA^{-1}=1\quad \Longleftrightarrow \quad A^{-1}=\frac{1}{A}$$ which makes clear that the definition above generalizes the notion of the reciprocal of a number.

Example Consider the matrix $$A=\begin{bmatrix}2 & 1\\ 1 & 1\end{bmatrix}$$ Then, we can verify that $$A^{-1}=\begin{bmatrix}1 & -1\\ -1 & 2\end{bmatrix}$$ by carrying out the multiplication between the two matrices: $$AA^{-1}=\begin{bmatrix}2 & 1\\ 1 & 1\end{bmatrix}\begin{bmatrix}1 & -1\\ -1 & 2\end{bmatrix}=\begin{bmatrix}2-1 & -2+2\\ 1-1 & -1+2\end{bmatrix}=\begin{bmatrix}1 & 0\\ 0 & 1\end{bmatrix}=I$$
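This kind of verification is easy to reproduce numerically. The sketch below (all names are illustrative; exact rational arithmetic is used to avoid rounding issues) multiplies an invertible 2×2 matrix by a candidate inverse and checks that the product is the identity:

```python
# Minimal sketch: verify a candidate inverse by multiplication.
# The matrices A and A_inv are an illustrative 2x2 pair.
from fractions import Fraction as F

def matmul(X, Y):
    """Multiply two square matrices given as lists of rows."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A     = [[F(2), F(1)], [F(1), F(1)]]
A_inv = [[F(1), F(-1)], [F(-1), F(2)]]
I2    = [[F(1), F(0)], [F(0), F(1)]]

print(matmul(A, A_inv) == I2)  # True: A_inv is indeed the inverse of A
```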

Existence of the inverse

Under what conditions is a square matrix invertible? The next proposition answers this question.

Proposition A $K\times K$ matrix A is invertible if and only if it is full-rank.

Proof

Let us first prove the "if" part (full-rank implies invertibility). Denote by $e_{1},\ldots ,e_{K}$ the K columns of the $K\times K$ identity matrix I. If A is full-rank, then its K columns are linearly independent. This implies that any K-dimensional vector can be written as a linear combination of the columns of A (see the lecture on standard bases for a proof). Therefore, for $k=1,\ldots ,K$, we can write $e_{k}$ as a linear combination of the columns $A_{\bullet 1},\ldots ,A_{\bullet K}$ of A: $$e_{k}=C_{1k}A_{\bullet 1}+\ldots +C_{Kk}A_{\bullet K}$$ where $C_{1k},\ldots ,C_{Kk}$ are the coefficients of the linear combination. By using the results presented in the lecture on matrix multiplication and linear combinations, these coefficients can be stacked to form a $K\times 1$ vector $$C_{\bullet k}=\begin{bmatrix}C_{1k}\\ \vdots \\ C_{Kk}\end{bmatrix}$$ such that $$e_{k}=AC_{\bullet k}$$ Moreover, the column vectors $C_{\bullet k}$ can be placed side by side to form a $K\times K$ matrix $$C=\begin{bmatrix}C_{\bullet 1} & \ldots & C_{\bullet K}\end{bmatrix}$$ such that $$AC=I$$ Thus, A is invertible and $$A^{-1}=C$$ We are now going to prove the "only if" part (invertibility implies full-rank). If an inverse $A^{-1}$ exists, then $$AA^{-1}=I$$ Post-multiplying both sides of the equation by any $K\times 1$ vector $v$, we get $$AA^{-1}v=v$$ or $$A\left(A^{-1}v\right)=v$$ Thus, any vector $v$ can be written as a linear combination of the columns of A, with coefficients taken from $A^{-1}v$. In other words, the columns of A span the space of all $K\times 1$ vectors. If they were not linearly independent, then we would be able to eliminate some of them and obtain a set of linearly independent vectors that 1) is a basis of the space $S$ of all $K\times 1$ vectors; 2) has cardinality less than K. But this is not possible because any basis of $S$ has cardinality equal to K (see the lecture on standard bases). Therefore, the columns of A must be linearly independent, which means that A has full rank.
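The "if" part of the proof is constructive: the inverse is assembled column by column, where the k-th column solves the system whose right-hand side is the k-th standard basis vector. A minimal sketch of this construction, using Gaussian elimination as the solver (all names are illustrative):

```python
# Sketch of the construction in the proof: build the inverse column by
# column by solving A c = e_k for each standard basis vector e_k.
from fractions import Fraction as F

def solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def inverse(A):
    """Place the solutions of A c = e_k side by side to form A^{-1}."""
    n = len(A)
    cols = [solve(A, [F(int(i == k)) for i in range(n)]) for k in range(n)]
    return [[cols[k][i] for k in range(n)] for i in range(n)]

A = [[F(2), F(1)], [F(1), F(1)]]
print(inverse(A))  # each column solves A c = e_k
```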

Singular matrix

A matrix that is not invertible is called a singular matrix. By the proposition above, a singular matrix is a matrix that does not have full rank. For this reason, a singular matrix is also sometimes called rank-deficient.
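For a 2×2 matrix, rank deficiency is easy to detect: the matrix is singular exactly when one column is a multiple of the other, which for $\begin{bmatrix}a & b\\ c & d\end{bmatrix}$ is equivalent to $ad-bc=0$ (the standard determinant criterion, not derived in this lecture). A quick illustrative check:

```python
# Illustrative 2x2 singularity test: ad - bc = 0 exactly when the
# columns are linearly dependent, i.e. the matrix is rank-deficient.
def is_singular_2x2(M):
    (a, b), (c, d) = M
    return a * d - b * c == 0

print(is_singular_2x2([[1, 2], [2, 4]]))  # True: second column = 2 * first
print(is_singular_2x2([[2, 1], [1, 1]]))  # False: full rank, hence invertible
```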

Uniqueness of the inverse

Proposition If the inverse of a $K\times K$ matrix exists, then it is unique.

Proof

In the proof that a matrix A is invertible if and only if it is full-rank, we have shown that the inverse can be constructed column by column, by finding the vectors $C_{\bullet k}$ that solve $$AC_{\bullet k}=e_{k}$$ that is, by writing the vectors of the canonical basis as linear combinations of the columns of A. Since the representation of a vector in terms of a basis is unique, the vectors $C_{\bullet k}$ are unique. But the latter are the columns of $A^{-1}$. Therefore, $A^{-1}$ is also unique.

Right and left inverse

An important fact is that $A^{-1}$ gives the identity matrix not only when it is pre-multiplied, but also when it is post-multiplied by A.

Proposition Let A be a $K\times K$ matrix. If its inverse $A^{-1}$ exists, it satisfies not only the condition $$AA^{-1}=I$$ but also the condition $$A^{-1}A=I$$ where I is the $K\times K$ identity matrix.

Proof

Post-multiply both sides of the equation $$AA^{-1}=I$$ by A, and obtain $$AA^{-1}A=A$$ or $$A\left(A^{-1}A\right)=A\quad (1)$$ But we also have that $$AI=A\quad (2)$$ Now, it might seem intuitive that equations (1) and (2) imply that $$A^{-1}A=I$$ Nonetheless, it needs to be proved. The proof is as follows. Equation (1) says that the columns of A, on the right-hand side of the equation, can be seen as linear combinations of the columns of A itself, on the left-hand side, with coefficients taken from $A^{-1}A$. Equation (2) says that the columns of A, on the right-hand side of the equation, can be seen as linear combinations of the columns of A itself, on the left-hand side, with coefficients taken from I. Since A is full-rank, its columns are a basis of the space of all $K\times 1$ vectors, and by the uniqueness of the representation in terms of a basis (see the lecture entitled Basis of a linear space), the coefficients of the linear combinations must be the same, that is, $$A^{-1}A=I$$
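Both conditions can be checked numerically on a small example: the same matrix works as a right inverse and as a left inverse. A minimal sketch with an illustrative 2×2 pair:

```python
# Check that A_inv inverts A from both sides: A @ A_inv = A_inv @ A = I.
from fractions import Fraction as F

def matmul(X, Y):
    """Multiply two square matrices given as lists of rows."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A     = [[F(2), F(1)], [F(1), F(1)]]
A_inv = [[F(1), F(-1)], [F(-1), F(2)]]
I2    = [[F(1), F(0)], [F(0), F(1)]]

print(matmul(A, A_inv) == I2)  # True: right inverse
print(matmul(A_inv, A) == I2)  # True: left inverse
```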

Inverse of a product

The following proposition holds.

Proposition Let A and $B$ be two $K\times K$ matrices. Then, the product $AB$ is invertible if and only if A and $B$ are invertible. Furthermore, $$\left(AB\right)^{-1}=B^{-1}A^{-1}$$

Proof

The two matrices A and $B$ are invertible if and only if they are full-rank (see above). If A and $B$ are full-rank, then $AB$ is full-rank (see the lecture on matrix products and rank). On the contrary, if at least one of the two matrices is not full-rank, then the rank of their product is less than K because $$\operatorname{rank}\left(AB\right)\leq \min \left\{ \operatorname{rank}\left(A\right),\operatorname{rank}\left(B\right)\right\}$$ In other words, the product is full-rank only if A and $B$ are full-rank. Furthermore, it can be easily checked that $B^{-1}A^{-1}$ satisfies the definition of inverse of $AB$: $$\left(AB\right)\left(B^{-1}A^{-1}\right)=A\left(BB^{-1}\right)A^{-1}=AIA^{-1}=AA^{-1}=I$$
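The product rule can be illustrated on a small example: given the known inverses of two 2×2 matrices (illustrative values), multiplying $AB$ by $B^{-1}A^{-1}$ yields the identity.

```python
# Check the product rule (AB)^{-1} = B^{-1} A^{-1} on illustrative matrices.
from fractions import Fraction as F

def matmul(X, Y):
    """Multiply two square matrices given as lists of rows."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A, A_inv = [[F(2), F(1)], [F(1), F(1)]], [[F(1), F(-1)], [F(-1), F(2)]]
B, B_inv = [[F(1), F(2)], [F(0), F(1)]], [[F(1), F(-2)], [F(0), F(1)]]
I2 = [[F(1), F(0)], [F(0), F(1)]]

AB = matmul(A, B)
print(matmul(AB, matmul(B_inv, A_inv)) == I2)  # True: B_inv A_inv inverts AB
```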

Inverse of the transpose

The next proposition shows how to compute the inverse of the transpose of a matrix.

Proposition Let A be a $K\times K$ matrix and $A^{\top }$ its transpose. If A is invertible, then $A^{\top }$ is invertible and $$\left(A^{\top }\right)^{-1}=\left(A^{-1}\right)^{\top }$$

Proof

We have that $$A^{-1}A=I$$ By transposing both sides of the equation, we obtain $$\left(A^{-1}A\right)^{\top }=I$$ because the identity matrix is equal to its transpose. By using the formula for the transposition of a product, we get $$A^{\top }\left(A^{-1}\right)^{\top }=I$$ So, $\left(A^{-1}\right)^{\top }$ satisfies the definition of inverse of $A^{\top }$.
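The transpose rule can also be checked numerically: transposing a matrix and its known inverse (an illustrative non-symmetric 2×2 pair below) and multiplying them gives the identity.

```python
# Check the transpose rule: transpose(A_inv) inverts transpose(A).
from fractions import Fraction as F

def matmul(X, Y):
    """Multiply two square matrices given as lists of rows."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transpose(X):
    """Swap rows and columns."""
    return [list(row) for row in zip(*X)]

A     = [[F(1), F(2)], [F(0), F(1)]]
A_inv = [[F(1), F(-2)], [F(0), F(1)]]
I2    = [[F(1), F(0)], [F(0), F(1)]]

print(matmul(transpose(A), transpose(A_inv)) == I2)  # True
```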

Solved exercises

Below you can find some exercises with explained solutions.

Exercise 1

Define $$A=\begin{bmatrix}1 & 2\\ 0 & 1\end{bmatrix}$$

Verify that $$A^{-1}=\begin{bmatrix}1 & -2\\ 0 & 1\end{bmatrix}$$

Solution

We need to carry out the multiplication between the two matrices: $$AA^{-1}=\begin{bmatrix}1 & 2\\ 0 & 1\end{bmatrix}\begin{bmatrix}1 & -2\\ 0 & 1\end{bmatrix}=\begin{bmatrix}1 & -2+2\\ 0 & 1\end{bmatrix}=\begin{bmatrix}1 & 0\\ 0 & 1\end{bmatrix}$$ Their product is equal to the identity matrix, so $A^{-1}$ is indeed the inverse of A.

Exercise 2

Let A, $B$ and $C$ be $K\times K$ full-rank matrices. Express the inverse $$\left(ABC\right)^{-1}$$ in terms of the inverses of A, $B$ and $C$.

Solution

We have to repeatedly apply the formula for the inverse of a product: $$\left(ABC\right)^{-1}=C^{-1}\left(AB\right)^{-1}=C^{-1}B^{-1}A^{-1}$$

Exercise 3

Show that $$\left(A^{-1}\right)^{-1}=A$$

Solution

By definition, the inverse of $A^{-1}$ needs to satisfy $$A^{-1}\left(A^{-1}\right)^{-1}=I$$ But $$A^{-1}A=I$$ As a consequence, by the uniqueness of the inverse, $$\left(A^{-1}\right)^{-1}=A$$

How to cite

Please cite as:

Taboga, Marco (2021). "Inverse of a matrix", Lectures on matrix algebra. https://www.statlect.com/matrix-algebra/inverse-matrix.
