A square matrix is said to have an LU decomposition (or LU factorization) if it can be written as the product of a lower triangular (L) and an upper triangular (U) matrix.
Not all square matrices have an LU decomposition, and it may be necessary to permute the rows of a matrix before obtaining its LU factorization.
We begin with a definition.
Definition
A $K\times K$ matrix $A$ is said to have an LU decomposition if and only if there exist a lower triangular $K\times K$ matrix $L$ and an upper triangular $K\times K$ matrix $U$ such that
$$A = LU.$$
Some simple examples of LU decompositions follow.
Example
The $2\times 2$ matrix
$$A = \begin{bmatrix} 2 & 4 \\ 1 & 5 \end{bmatrix}$$
has the following LU factorization:
$$A = \begin{bmatrix} 1 & 0 \\ 1/2 & 1 \end{bmatrix}\begin{bmatrix} 2 & 4 \\ 0 & 3 \end{bmatrix}.$$
Example
The $3\times 3$ matrix
$$A = \begin{bmatrix} 1 & 2 & 3 \\ 2 & 5 & 7 \\ 3 & 8 & 14 \end{bmatrix}$$
can be written as
$$A = \begin{bmatrix} 1 & 0 & 0 \\ 2 & 1 & 0 \\ 3 & 2 & 1 \end{bmatrix}\begin{bmatrix} 1 & 2 & 3 \\ 0 & 1 & 1 \\ 0 & 0 & 3 \end{bmatrix}.$$
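A factorization of this kind is easy to check numerically. The following minimal sketch (with an arbitrary pair of triangular factors chosen for illustration) multiplies $L$ and $U$ with NumPy and recovers the original matrix:

```python
import numpy as np

# An arbitrary lower triangular factor L and upper triangular factor U
L = np.array([[1.0, 0.0, 0.0],
              [2.0, 1.0, 0.0],
              [3.0, 2.0, 1.0]])
U = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 3.0]])

# The product of the factors is the matrix being decomposed
A = L @ U
print(A)
```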
We begin with a negative result about existence.
Proposition Not all square matrices have an LU factorization.
It is sufficient to provide a single counter-example. Take the $2\times 2$ invertible matrix
$$A = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}.$$
Suppose that $A$ has an LU factorization
$$A = LU$$
with factors
$$L = \begin{bmatrix} L_{11} & 0 \\ L_{21} & L_{22} \end{bmatrix} \qquad \text{and} \qquad U = \begin{bmatrix} U_{11} & U_{12} \\ 0 & U_{22} \end{bmatrix}.$$
Compute the product
$$LU = \begin{bmatrix} L_{11}U_{11} & L_{11}U_{12} \\ L_{21}U_{11} & L_{21}U_{12} + L_{22}U_{22} \end{bmatrix}.$$
Now, $A = LU$ implies
$$L_{11}U_{11} = 0,$$
which in turn implies that at least one of $L_{11}$ and $U_{11}$ must be zero. As a consequence, at least one of $L$ and $U$ is not invertible (because triangular matrices are invertible only if their diagonal entries are non-zero). This is in contradiction with the fact that $A$ is invertible and, as a consequence, $L$ and $U$ must be invertible (see the proposition about the invertibility of products). Thus, we have proved by contradiction that $A$ cannot have an LU decomposition.
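The impossibility shows up numerically as a mandatory row interchange. A minimal sketch, assuming SciPy is available: `scipy.linalg.lu` factors a matrix as $A = PLU$ with $P$ a permutation matrix, and for the classic counter-example $\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}$ it returns a non-identity permutation.

```python
import numpy as np
from scipy.linalg import lu

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# scipy.linalg.lu returns P, L, U such that A = P @ L @ U
P, L, U = lu(A)

print(P)  # not the identity: a row interchange is unavoidable here
```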
The second negative result concerns uniqueness.
Proposition If a matrix has an LU decomposition, then it is not unique.
Suppose a $K\times K$ matrix $A$ has an LU decomposition
$$A = LU.$$
Take any $K\times K$ diagonal matrix $D$ whose diagonal entries are all non-zero. Then, $D$ is invertible, its inverse $D^{-1}$ is also diagonal and we can write
$$A = LU = LDD^{-1}U = (LD)(D^{-1}U).$$
A diagonal matrix is lower triangular, and the product of two lower triangular matrices is lower triangular. Therefore, $LD$ is lower triangular. The inverse $D^{-1}$, being diagonal, is upper triangular. Since the product of two upper triangular matrices is upper triangular, we have that $D^{-1}U$ is upper triangular. Thus, by changing the matrix $D$, we are able to get infinitely many factorizations of $A$ into a lower triangular matrix and an upper triangular matrix.
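The rescaling trick used in the proof is easy to demonstrate numerically. The sketch below (illustrative values only) builds a second pair of triangular factors $LD$ and $D^{-1}U$ from a first one and checks that both products give the same matrix:

```python
import numpy as np

L = np.array([[1.0, 0.0],
              [0.5, 1.0]])
U = np.array([[2.0, 4.0],
              [0.0, 3.0]])
A = L @ U

# Any diagonal matrix with non-zero diagonal entries yields a new factorization
D = np.diag([2.0, -5.0])
L2 = L @ D                  # still lower triangular
U2 = np.linalg.inv(D) @ U   # still upper triangular

print(np.allclose(L2 @ U2, A))  # True: a different LU factorization of A
```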
We now provide some sufficient conditions for the existence of an LU decomposition of a matrix.
Proposition
Let $A$ be a $K\times K$ matrix. If $A$ can be reduced to row echelon form by Gaussian elimination without ever interchanging two rows, then $A$ has an LU decomposition.
Gaussian elimination allows us to reduce $A$ to a matrix $R$ in row echelon form by means of elementary operations. If $n$ elementary operations are performed, then we can write
$$R = E_n E_{n-1} \cdots E_1 A,$$
where $E_1, \ldots, E_n$ are elementary matrices. The matrix $R$ is upper triangular because a square matrix in row echelon form is upper triangular. Since no row interchanges are necessary, any matrix $E_j$ is used to add a multiple of one row to a row below it. By writing down any instance of such a matrix, it can be seen that all the matrices $E_j$ are lower triangular and all the entries on their main diagonals are non-zero, so that they are invertible. Since products and inverses of lower triangular matrices are lower triangular, we can write
$$A = E_1^{-1} E_2^{-1} \cdots E_n^{-1} R = LU,$$
where the matrix
$$L = E_1^{-1} E_2^{-1} \cdots E_n^{-1}$$
is lower triangular and $U = R$.
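The proof suggests an algorithm: run Gaussian elimination without row interchanges and record the multipliers, which form the strictly lower part of $L$. A minimal sketch, assuming every pivot encountered is non-zero (so that no interchange is ever needed):

```python
import numpy as np

def lu_no_pivot(A):
    """LU factorization by Gaussian elimination without row interchanges.

    Assumes all pivots are non-zero; raises ValueError otherwise.
    """
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    L = np.eye(n)
    U = A.copy()
    for j in range(n - 1):
        if U[j, j] == 0:
            raise ValueError("zero pivot: a row interchange would be needed")
        for i in range(j + 1, n):
            m = U[i, j] / U[j, j]   # multiplier of the row operation
            L[i, j] = m             # entries of the inverse elementary matrices
            U[i, :] -= m * U[j, :]  # add -m times row j to row i
    return L, U

A = np.array([[2.0, 4.0],
              [1.0, 5.0]])
L, U = lu_no_pivot(A)
print(L @ U)  # recovers A
```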
We have proved that not all square matrices have an LU factorization. However, we can always permute the rows of a matrix (i.e., repeatedly interchange them) so as to get an LU factorization, as illustrated by the following proposition.
Proposition
Let $A$ be a $K\times K$ matrix. Then, there exists a $K\times K$ permutation matrix $P$ such that $PA$ has an LU decomposition:
$$PA = LU,$$
where $L$ is a lower triangular $K\times K$ matrix and $U$ is an upper triangular $K\times K$ matrix.
By performing Gaussian elimination, we can reduce $A$ to a matrix $R$ in row echelon form. Moreover, the elementary operations performed during Gaussian elimination can be done by using elementary matrices. If $n$ elementary operations are performed in order to reduce $A$ to row echelon form, then we can write
$$R = E_n E_{n-1} \cdots E_1 A,$$
where $E_1, \ldots, E_n$ are elementary matrices. The first thing to note is that a square matrix in row echelon form is upper triangular. Therefore, $R$ is upper triangular. An elementary matrix $E_j$ used in Gaussian elimination can be either 1) a permutation matrix $P$ used to interchange two rows or 2) a matrix used to add a multiple of one row to a row below it. It is immediate to verify that all the matrices of the latter kind are lower triangular and all the entries on their main diagonals are non-zero, so that they are invertible. Suppose that the first permutation matrix we encounter is in position $k$, so that we have
$$R = E_n \cdots E_{k+1} P E_{k-1} \cdots E_1 A.$$
Remember that a permutation matrix is orthogonal, that is, its inverse is equal to its transpose. Therefore, $P^{-1} = P^{\top}$ and we can write
$$R = E_n \cdots E_{k+1} \left(P E_{k-1} P^{\top}\right)\left(P E_{k-2} P^{\top}\right) \cdots \left(P E_1 P^{\top}\right) P A,$$
or
$$R = E_n \cdots E_{k+1} \widetilde{E}_{k-1} \widetilde{E}_{k-2} \cdots \widetilde{E}_1 P A,$$
where
$$\widetilde{E}_j = P E_j P^{\top}$$
for $j = 1, \ldots, k-1$.
As we explained in the lecture on elementary matrices, a matrix $E_j$ used to add $c$ times the $s$-th row to the $t$-th row can be written as a rank one update to the identity matrix:
$$E_j = I + c\,\Delta_{ts},$$
where $\Delta_{ts}$ is a matrix whose entries are all zeros except the $(t,s)$-th entry, which is equal to $1$. In our case, that is, in the case of the Gaussian elimination algorithm, we have that $t > s$. Moreover,
$$\widetilde{E}_j = P E_j P^{\top} = P\left(I + c\,\Delta_{ts}\right)P^{\top} = I + c\, P \Delta_{ts} P^{\top}.$$
We need to analyze the permutations performed on the rows and columns of $\Delta_{ts}$ by $P$ and $P^{\top}$.
If the multiplication by $P$ interchanges the $q$-th and $r$-th rows of $\Delta_{ts}$, then we have that $q, r > s$, because in the Gaussian elimination algorithm the permutation performed by $P$ comes after the row addition performed by $E_j$. If $q \neq t$ and $r \neq t$, the row interchange does not affect the non-zero element of $\Delta_{ts}$. On the contrary, if $q = t$ or $r = t$, the non-zero element is moved to a different row (either the $q$-th or the $r$-th), but remains on the $s$-th column. It also remains below the main diagonal because $q, r > s$. The column interchange of the $q$-th and $r$-th columns of $\Delta_{ts}$ performed by $P^{\top}$ does not affect the non-zero entry because the latter is in the $s$-th column and $q, r \neq s$. Thus, after the transformation $E_j \mapsto P E_j P^{\top}$, the only non-zero element of $\Delta_{ts}$ remains on the same column and can change row, but it remains below the main diagonal. Therefore, $\widetilde{E}_j$ is lower triangular for $j = 1, \ldots, k-1$.
Furthermore, $\widetilde{E}_j$ is invertible because it is a product of invertible matrices. We can now proceed in the same manner: each time that one of the matrices in the equation
$$R = E_n E_{n-1} \cdots E_1 A$$
is a permutation matrix, we use it to permute all the lower triangular matrices to its right (which remain lower triangular, by the argument above). In the end, we get
$$R = \widetilde{E}_{j_m} \cdots \widetilde{E}_{j_1}\, P_{i_q} \cdots P_{i_1}\, A,$$
where $T = \{j_1, \ldots, j_m\}$ is the set of indices of the triangular elementary matrices, $S = \{i_1, \ldots, i_q\}$ is the set of indices of the permutation matrices, and each $\widetilde{E}_j$ denotes a (possibly repeatedly) permuted triangular elementary matrix. We can write
$$PA = LU,$$
where
$$P = P_{i_q} \cdots P_{i_1}, \qquad U = R$$
and
$$L = \widetilde{E}_{j_1}^{-1} \cdots \widetilde{E}_{j_m}^{-1}.$$
We have that $P$ is a permutation matrix because the product of permutation matrices is a permutation matrix; moreover, $L$ is lower triangular because the product and the inverse of lower triangular matrices are lower triangular.
Reading the proof of the proposition above is highly recommended because it is a constructive proof: it shows how the LU decomposition can be computed by using the Gaussian elimination algorithm.
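As a rough numerical illustration of the constructive idea (not the proof's exact bookkeeping), the sketch below performs Gaussian elimination with row interchanges, accumulating the permutation and the multipliers, and returns $P$, $L$, $U$ with $PA = LU$:

```python
import numpy as np

def plu(A):
    """Gaussian elimination with partial pivoting: returns P, L, U with P A = L U."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    P = np.eye(n)
    L = np.eye(n)
    U = A.copy()
    for j in range(n - 1):
        # choose as pivot the largest entry in column j, at or below row j
        p = j + np.argmax(np.abs(U[j:, j]))
        if p != j:
            U[[j, p], :] = U[[p, j], :]    # interchange rows of U
            P[[j, p], :] = P[[p, j], :]    # record the interchange in P
            L[[j, p], :j] = L[[p, j], :j]  # permute the multipliers found so far
        for i in range(j + 1, n):
            m = U[i, j] / U[j, j]
            L[i, j] = m
            U[i, :] -= m * U[j, :]
    return P, L, U

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])
P, L, U = plu(A)
print(np.allclose(P @ A, L @ U))  # True
```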
While we have shown how to guarantee the existence of the LU factorization, the problem of uniqueness remains to be addressed. We have shown above that the LU decomposition is not unique. However, by adding a constraint on one of the two triangular matrices, we can also achieve uniqueness. The constraint we impose is that one of the two triangular matrices be unit triangular (i.e., a triangular matrix whose diagonal entries are all equal to 1).
Proposition
Let $A$ be a $K\times K$ matrix and $P$ a $K\times K$ permutation matrix such that $PA$ has an LU decomposition. Then, if $A$ is invertible, there are a unique lower triangular matrix $L$ having all diagonal entries equal to 1 and a unique upper triangular matrix $U$ such that
$$PA = LU.$$
The existence of an LU factorization
$$PA = LU$$
has already been proved above. If the matrix $L$ is not unit triangular, then we can write
$$PA = LU = \left(LD^{-1}\right)\left(DU\right),$$
where $D$ is a diagonal matrix whose diagonal is equal to the diagonal of $L$. Note that the matrix $D^{-1}$ is well defined because $A$ is invertible, which guarantees that $L$ is invertible and the elements on its main diagonal are non-zero (more on this point below). Then,
$$L_1 = LD^{-1}$$
is unit lower triangular and
$$U_1 = DU$$
is upper triangular. Thus, there exists a factorization
$$PA = L_1 U_1$$
satisfying the properties stated in the proposition. Suppose there is another such factorization
$$PA = L_2 U_2,$$
with $L_2$ unit lower triangular and $U_2$ upper triangular.
Then,
$$L_1 U_1 = L_2 U_2.$$
Note that: 1) $L_1$ and $L_2$ are invertible because all their diagonal entries are non-zero; 2) $P$ is invertible because any permutation matrix is; 3) $A$ is invertible by assumption. Therefore,
$$U_1 = L_1^{-1} P A$$
is invertible. Thus, we have
$$L_2^{-1} L_1 = U_2 U_1^{-1}.$$
We know that the inverse of a lower (upper) triangular matrix is lower (upper) triangular, and the product of two lower (upper) triangular matrices is lower (upper) triangular. Therefore, $L_2^{-1} L_1$ is lower triangular and $U_2 U_1^{-1}$ is upper triangular. The two matrices are equal, which is possible only if they are diagonal matrices. Both $L_2^{-1}$ and $L_1$ are unit triangular because the inverse of a unit triangular matrix is unit triangular (as proved in the lecture on triangular matrices, the diagonal entries of $L_2^{-1}$ are equal to the reciprocals of the corresponding diagonal entries of $L_2$). Therefore, not only is the product $L_2^{-1} L_1$ diagonal, but all its diagonal entries are equal to 1. In other words,
$$L_2^{-1} L_1 = I,$$
which implies
$$L_1 = L_2.$$
Furthermore,
$$U_2 U_1^{-1} = I,$$
which implies
$$U_1 = U_2.$$
Thus, the unit lower triangular factor and the upper triangular factor are unique.
Note that the proposition above applies also to matrices that do not need to be permuted to have an LU factorization (i.e., when $P = I$, the identity matrix).
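The normalization used in the proof can be sketched numerically: dividing out the diagonal of $L$ yields the unique unit lower triangular factor (illustrative values only).

```python
import numpy as np

# An LU factorization whose L is not unit triangular
L = np.array([[2.0, 0.0],
              [1.0, 3.0]])
U = np.array([[1.0, 2.0],
              [0.0, 1.0]])
A = L @ U

# D holds the diagonal of L; (L D^{-1})(D U) is the normalized factorization
D = np.diag(np.diag(L))
L_unit = L @ np.linalg.inv(D)   # unit lower triangular
U_new = D @ U                   # still upper triangular

print(np.diag(L_unit))          # all ones
print(np.allclose(L_unit @ U_new, A))  # True
```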
Please cite as:
Taboga, Marco (2021). "LU decomposition", Lectures on matrix algebra. https://www.statlect.com/matrix-algebra/lu-decomposition.