
LU decomposition


A square matrix is said to have an LU decomposition (or LU factorization) if it can be written as the product of a lower triangular (L) and an upper triangular (U) matrix.

Not all square matrices have an LU decomposition, and it may be necessary to permute the rows of a matrix before obtaining its LU factorization.


Definition

We begin with a definition.

Definition A $K\times K$ matrix $A$ is said to have an LU decomposition if and only if there exist a lower triangular $K\times K$ matrix $L$ and an upper triangular $K\times K$ matrix $U$ such that $$A=LU$$

Some simple examples of LU decompositions follow.

Example The $2\times 2$ matrix $$A=\begin{bmatrix} 4 & 3\\ 6 & 3 \end{bmatrix}$$ has the following LU factorization: $$A=\begin{bmatrix} 1 & 0\\ 3/2 & 1 \end{bmatrix}\begin{bmatrix} 4 & 3\\ 0 & -3/2 \end{bmatrix}$$

Example The $3\times 3$ matrix $$A=\begin{bmatrix} 2 & 1 & 1\\ 4 & 3 & 3\\ 8 & 7 & 9 \end{bmatrix}$$ can be written as $$A=\begin{bmatrix} 1 & 0 & 0\\ 2 & 1 & 0\\ 4 & 3 & 1 \end{bmatrix}\begin{bmatrix} 2 & 1 & 1\\ 0 & 1 & 1\\ 0 & 0 & 2 \end{bmatrix}$$
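Such factorizations can be checked numerically. The following sketch (assuming NumPy is available; the matrix is chosen here for illustration) multiplies a lower and an upper triangular factor and verifies that the product recovers the original matrix:

```python
import numpy as np

# A small matrix with a known LU factorization (illustrative values).
A = np.array([[4., 3.],
              [6., 3.]])

L = np.array([[1.,  0.],
              [1.5, 1.]])   # lower triangular

U = np.array([[4.,  3.],
              [0., -1.5]])  # upper triangular

# Multiplying the factors recovers A.
print(np.allclose(L @ U, A))  # True
```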

Existence and uniqueness

We begin with a negative result about existence.

Proposition Not all square matrices have an LU factorization.

Proof

It is sufficient to provide a single counter-example. Take the $2\times 2$ invertible matrix $$A=\begin{bmatrix} 0 & 1\\ 1 & 0 \end{bmatrix}$$ Suppose $A$ has an LU factorization $$A=LU$$ with factors $$L=\begin{bmatrix} L_{11} & 0\\ L_{21} & L_{22} \end{bmatrix}$$ and $$U=\begin{bmatrix} U_{11} & U_{12}\\ 0 & U_{22} \end{bmatrix}$$ Compute the product $$LU=\begin{bmatrix} L_{11}U_{11} & L_{11}U_{12}\\ L_{21}U_{11} & L_{21}U_{12}+L_{22}U_{22} \end{bmatrix}$$ Now, $A=LU$ implies $$L_{11}U_{11}=0$$ which in turn implies that at least one of $L_{11}$ and $U_{11}$ must be zero. As a consequence, at least one of $L$ and $U$ is not invertible (because triangular matrices are invertible only if their diagonal entries are non-zero). This contradicts the fact that $A$ is invertible, which requires that $L$ and $U$ both be invertible (see the proposition about the invertibility of products). Thus, we have proved by contradiction that $A$ cannot have an LU decomposition.

The second negative result concerns uniqueness.

Proposition If a matrix has an LU decomposition, then the decomposition is not unique.

Proof

Suppose a $K\times K$ matrix $A$ has an LU decomposition $$A=LU$$ Take any $K\times K$ diagonal matrix $D$ whose diagonal entries are all non-zero. Then $D$ is invertible, its inverse $D^{-1}$ is also diagonal, and we can write $$A=LU=LDD^{-1}U=\left(LD\right)\left(D^{-1}U\right)$$ A diagonal matrix is lower triangular, and the product of two lower triangular matrices is lower triangular. Therefore $LD$ is lower triangular. The inverse $D^{-1}$, being diagonal, is upper triangular. Since the product of two upper triangular matrices is upper triangular, $D^{-1}U$ is upper triangular. Thus, by varying the matrix $D$, we obtain infinitely many factorizations of $A$ into a lower triangular matrix $LD$ and an upper triangular matrix $D^{-1}U$.
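The rescaling argument in this proof is easy to reproduce numerically. The sketch below (assuming NumPy; the factors are illustrative values) builds a second, different LU factorization $(LD)(D^{-1}U)$ from a given one:

```python
import numpy as np

# Start from one LU factorization of A (illustrative values).
L = np.array([[1.,  0.], [1.5, 1.]])
U = np.array([[4.,  3.], [0., -1.5]])
A = L @ U

# Rescale with an invertible diagonal matrix D: (L D)(D^{-1} U) = L U.
D = np.diag([2., 5.])
L2 = L @ D                  # still lower triangular
U2 = np.linalg.inv(D) @ U   # still upper triangular

print(np.allclose(L2 @ U2, A))  # True: a second LU factorization of A
```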

Sufficient conditions for existence

We now provide some sufficient conditions for the existence of an LU decomposition of a matrix.

Proposition Let $A$ be a $K\times K$ matrix. If $A$ can be reduced to row echelon form by Gaussian elimination without ever interchanging two rows, then $A$ has an LU decomposition.

Proof

Gaussian elimination allows us to reduce $A$ to a matrix $U$ in row echelon form by means of elementary row operations. If $n$ elementary operations are performed, then we can write $$E_{n}\cdots E_{2}E_{1}A=U$$ where $E_{1},\ldots ,E_{n}$ are elementary matrices. The matrix $U$ is upper triangular because a square matrix in row echelon form is upper triangular. Since no row interchanges are necessary, each matrix $E_{i}$ is used to add a multiple of one row to a row below it. By writing down any instance of such a matrix, it can be seen that all the matrices $E_{i}$ are lower triangular and all the entries on their main diagonals are non-zero, so that they are invertible. Since products and inverses of lower triangular matrices are lower triangular, we can write $$A=E_{1}^{-1}E_{2}^{-1}\cdots E_{n}^{-1}U=LU$$ where the matrix $$L=E_{1}^{-1}E_{2}^{-1}\cdots E_{n}^{-1}$$ is lower triangular.
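The procedure described in the proof (Gaussian elimination that records the multipliers of the row operations) can be sketched as follows. This is a minimal, non-robust implementation assuming NumPy; it fails if a zero pivot is encountered, which is exactly the case where row interchanges become necessary:

```python
import numpy as np

def lu_no_pivot(A):
    """LU factorization by Gaussian elimination without row interchanges.
    Assumes no zero pivot is ever encountered (a sketch, not robust code)."""
    U = A.astype(float)          # working copy, reduced to row echelon form
    K = U.shape[0]
    L = np.eye(K)                # collects the multipliers below the diagonal
    for j in range(K - 1):
        for i in range(j + 1, K):
            L[i, j] = U[i, j] / U[j, j]   # multiplier of the row operation
            U[i, :] -= L[i, j] * U[j, :]  # add -L[i,j] times row j to row i
    return L, U

A = np.array([[2., 1., 1.],
              [4., 3., 3.],
              [8., 7., 9.]])
L, U = lu_no_pivot(A)
# L is unit lower triangular, U is upper triangular, and L @ U equals A
print(np.allclose(L @ U, A))  # True
```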

PA = LU: how to always get an LU decomposition with permutations

We have proved that not all square matrices have an LU factorization. However, we can always permute the rows of a matrix (i.e., repeatedly interchange them) so as to get an LU factorization, as illustrated by the following proposition.

Proposition Let $A$ be a $K\times K$ matrix. Then, there exists a permutation matrix $P$ such that $PA$ has an LU decomposition: $$PA=LU$$ where $L$ is a lower triangular $K\times K$ matrix and $U$ is an upper triangular $K\times K$ matrix.

Proof

By performing Gaussian elimination, we can reduce $A$ to a matrix $U$ in row echelon form. Moreover, the elementary operations performed during Gaussian elimination can be done by using elementary matrices. If $n$ elementary operations are performed in order to reduce $A$ to row echelon form, then we can write $$E_{n}\cdots E_{2}E_{1}A=U$$ where $E_{1},\ldots ,E_{n}$ are elementary matrices. The first thing to note is that a square matrix in row echelon form is upper triangular. Therefore, $U$ is upper triangular. An elementary matrix $E_{i}$ used in Gaussian elimination can be either 1) a permutation matrix $P_{i}$ used to interchange two rows or 2) a matrix $L_{i}$ used to add a multiple of one row to a row below it. It is immediate to verify that all the matrices $L_{i}$ are lower triangular and all the entries on their main diagonals are non-zero, so that they are invertible. Suppose that the first permutation matrix we encounter is in position $j$, so that we have $$E_{n}\cdots E_{j+1}P_{j}L_{j-1}\cdots L_{1}A=U$$ Remember that a permutation matrix is orthogonal, that is, its inverse is equal to its transpose. Therefore, $$P_{j}^{-1}=P_{j}^{\top}$$ and we can write $$E_{n}\cdots E_{j+1}\left(P_{j}L_{j-1}P_{j}^{\top}\right)\left(P_{j}L_{j-2}P_{j}^{\top}\right)\cdots \left(P_{j}L_{1}P_{j}^{\top}\right)P_{j}A=U$$ or $$E_{n}\cdots E_{j+1}\widetilde{L}_{j-1}\widetilde{L}_{j-2}\cdots \widetilde{L}_{1}P_{j}A=U$$ where $$\widetilde{L}_{i}=P_{j}L_{i}P_{j}^{\top}$$ for $i=1,\ldots ,j-1$. As we explained in the lecture on elementary matrices, a matrix $L_{i}$ used to add $\alpha$ times the $r$-th row to the $s$-th row can be written as a rank-one update to the identity matrix: $$L_{i}=I+M$$ where $M$ is a matrix whose entries are all zeros except $M_{sr}=\alpha$. In our case, that is, in the case of the Gaussian elimination algorithm, we have that $r<s$. Moreover, $$\widetilde{L}_{i}=P_{j}\left(I+M\right)P_{j}^{\top}=I+P_{j}MP_{j}^{\top}$$ We need to analyze the permutations performed on the rows and columns of $M$ by $P_{j}$ and $P_{j}^{\top}$. If the multiplication $P_{j}M$ interchanges the $t$-th and $u$-th rows of $M$, then we have that $r<t<u$ because in the Gaussian elimination algorithm the permutation performed by $P_{j}$ comes after the row addition performed by $L_{i}$. If $s\neq t$ and $s\neq u$, the row interchange does not affect the non-zero element of $M$. On the contrary, if $s=t$ or $s=u$, the non-zero element is moved to a different row (either the $u$-th or the $t$-th), but it remains on the $r$-th column. It also remains below the main diagonal because $r<t<u$. The interchange of the $t$-th and $u$-th columns of $P_{j}M$ performed by $P_{j}^{\top}$ does not affect the non-zero entry because the latter is in the $r$-th column and $r<t<u$. Thus, after the transformation $M\mapsto P_{j}MP_{j}^{\top}$, the only non-zero element of $M$ remains on the same column; it can change row, but it stays below the main diagonal. Therefore, $\widetilde{L}_{i}$ is lower triangular for $i=1,\ldots ,j-1$. Furthermore, $\widetilde{L}_{i}$ is invertible because it is a product of invertible matrices. We can now proceed in the same manner: each time one of the remaining matrices $E_{j+1},\ldots ,E_{n}$ is a permutation matrix, we use it to permute all the lower triangular matrices to its right (which remain lower triangular). In the end, we get $$\left(\prod_{i\in J_{1}}\widetilde{L}_{i}\right)\left(\prod_{i\in J_{2}}P_{i}\right)A=U$$ where $J_{1}$ is the set of indices of the triangular elementary matrices and $J_{2}$ is the set of indices of the permutation matrices. We can write $$PA=LU$$ where $$P=\prod_{i\in J_{2}}P_{i}$$ and $$L=\left(\prod_{i\in J_{1}}\widetilde{L}_{i}\right)^{-1}$$ We have that $P$ is a permutation matrix because the product of permutation matrices is a permutation matrix; moreover, $L$ is lower triangular because products and inverses of lower triangular matrices are lower triangular.

Reading the proof of the proposition above is highly recommended because it is a constructive proof: it shows how the LU decomposition can be computed by using the Gaussian elimination algorithm.
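A quick numerical check of the $PA=LU$ form is possible with SciPy (assuming `scipy.linalg.lu` is available), using a $2\times 2$ matrix with a zero in the top-left pivot position, so that it has no LU factorization without a row interchange. Note that SciPy's convention returns $p$, $l$, $u$ with $A=plu$, so $P=p^{\top}$ gives the $PA=LU$ form used here:

```python
import numpy as np
from scipy.linalg import lu

# A matrix with a zero pivot in the (1,1) position: no plain LU factorization.
A = np.array([[0., 1.],
              [1., 0.]])

p, l, u = lu(A)   # SciPy's convention: A = p @ l @ u
P = p.T           # permutation matrices are orthogonal, so P A = l u

print(np.allclose(P @ A, l @ u))  # True: PA = LU
```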

How to make the LU and PA=LU decompositions unique

While we have shown how to guarantee the existence of the LU factorization, the problem of uniqueness remains to be addressed. We have shown above that the LU decomposition is not unique. However, by adding a constraint on one of the two triangular matrices, we can also achieve uniqueness. The constraint we impose is that one of the two triangular matrices be unit triangular (i.e., a triangular matrix whose diagonal entries are all equal to 1).

Proposition Let $A$ be a $K\times K$ matrix and $P$ a $K\times K$ permutation matrix such that $PA$ has an LU decomposition. Then, if $A$ is invertible, there exist a unique $K\times K$ lower triangular matrix $L$ having all diagonal entries equal to 1 and a unique $K\times K$ upper triangular matrix $U$ such that $$PA=LU$$

Proof

The existence of an LU factorization $$PA=LU$$ has already been proved above. If the matrix $L$ is not unit triangular, then we can write $$PA=LU=\left(LD^{-1}\right)\left(DU\right)$$ where $D$ is the diagonal matrix whose diagonal is equal to the diagonal of $L$. Note that the matrix $D^{-1}$ is well defined because $A$ is invertible, which guarantees that $L$ is invertible and the elements on its main diagonal are non-zero (more on this point below). Then, $$L_{1}=LD^{-1}$$ is unit lower triangular and $$U_{1}=DU$$ is upper triangular. Thus, there exists a factorization $$PA=L_{1}U_{1}$$ satisfying the properties stated in the proposition. Suppose there is another such factorization $$PA=L_{2}U_{2}$$ Then, $$L_{1}U_{1}=L_{2}U_{2}$$ Note that: 1) $L_{1}$ and $L_{2}$ are invertible because all their diagonal entries are non-zero; 2) $P$ is invertible because any permutation matrix is; 3) $A$ is invertible by assumption. Therefore, $$PA=L_{1}U_{1}=L_{2}U_{2}$$ is invertible, and so are all four factors. Thus, we have $$L_{2}^{-1}L_{1}=U_{2}U_{1}^{-1}$$ We know that the inverse of a lower (upper) triangular matrix is lower (upper) triangular, and the product of two lower (upper) triangular matrices is lower (upper) triangular. Therefore, $L_{2}^{-1}L_{1}$ is lower triangular and $U_{2}U_{1}^{-1}$ is upper triangular. The two matrices are equal, which is possible only if they are diagonal. Both $L_{1}$ and $L_{2}^{-1}$ are unit triangular because the inverse of a unit triangular matrix is unit triangular (as proved in the lecture on triangular matrices, the diagonal entries of $L_{2}^{-1}$ are the reciprocals of the corresponding diagonal entries of $L_{2}$). Therefore, the product $L_{2}^{-1}L_{1}$ is not only diagonal, but has all its diagonal entries equal to 1. In other words, $$L_{2}^{-1}L_{1}=I$$ which implies $$L_{1}=L_{2}$$ Furthermore, $$U_{2}U_{1}^{-1}=I$$ which implies $$U_{1}=U_{2}$$ Thus, $L_{1}$ and $U_{1}$ are unique.

Note that the proposition above applies also to matrices that do not need to be permuted to have an LU factorization (i.e., when $P=I$).
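The normalization used in the proof can be sketched numerically: given a factorization with invertible $L$, set $D$ to the diagonal of $L$ and form $L_{1}=LD^{-1}$ and $U_{1}=DU$ (assuming NumPy; the values are illustrative):

```python
import numpy as np

# An LU factorization whose lower factor is not unit triangular.
L = np.array([[2., 0.], [3., 5.]])
U = np.array([[1., 4.], [0., 6.]])
A = L @ U

# Normalize: L = (L D^{-1}) D with D = diag(L).
D = np.diag(np.diag(L))
L1 = L @ np.linalg.inv(D)   # unit lower triangular
U1 = D @ U                  # still upper triangular

print(np.allclose(L1 @ U1, A))       # True: same matrix, normalized factors
print(np.allclose(np.diag(L1), 1.))  # True: diagonal of L1 is all ones
```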
