Matrix algebra

This is a set of lecture notes on matrix algebra. Use these lectures for self-study or as a complement to your textbook.

The algebra of numeric arrays

Matrix addition

How to add two matrices together, definition and properties of addition

Vectors and matrices

Matrices, their characteristics, introduction to some special matrices

Linear combinations

Obtained by multiplying matrices by scalars, and by adding them together

Multiplication of a matrix by a scalar

How to multiply a matrix by a scalar, definition and properties of scalar multiplication

Identity matrix

It plays the same role in matrix multiplication that the number 1 plays in the multiplication of numbers

Matrix multiplication

How to multiply two matrices, definition and properties of multiplication

Properties of block matrices

Addition, scalar multiplication and multiplication of block matrices can be performed on their blocks

Block matrix

A matrix that has been partitioned into smaller submatrices

Learn how to multiply two matrices.
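
As a quick worked example of the matrix multiplication entry above (each entry of the product is the dot product of a row of the first factor with a column of the second):

$$
\begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}
\begin{bmatrix} 5 & 6 \\ 7 & 8 \end{bmatrix}
=
\begin{bmatrix} 1\cdot 5+2\cdot 7 & 1\cdot 6+2\cdot 8 \\ 3\cdot 5+4\cdot 7 & 3\cdot 6+4\cdot 8 \end{bmatrix}
=
\begin{bmatrix} 19 & 22 \\ 43 & 50 \end{bmatrix}
$$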

Linear spaces

Linear independence

This lecture introduces one of the central concepts in matrix algebra

Linear spaces

Sets of vectors that are closed with respect to taking linear combinations

Basis of a linear space

A set of linearly independent vectors that span the linear space

Linear span

The linear space generated by taking linear combinations of a set of vectors

Standard basis

A basis made up of vectors whose entries are all equal to zero, except for a single entry equal to one

Dimension of a linear space

The number of elements of any one of the bases of the linear space

Complementary subspace

Two subspaces are complementary if their direct sum equals the whole space

Direct sum

The sum of two subspaces whose intersection contains only the zero vector

The standard basis of the space of three-dimensional vectors.
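
For instance, the standard basis of that space and the coordinates of a vector with respect to it are

$$
e_1=\begin{bmatrix}1\\0\\0\end{bmatrix},\quad
e_2=\begin{bmatrix}0\\1\\0\end{bmatrix},\quad
e_3=\begin{bmatrix}0\\0\\1\end{bmatrix},
\qquad
\begin{bmatrix}4\\-1\\2\end{bmatrix}=4e_1-e_2+2e_3 .
$$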

Matrix rank and inversion

Rank of a matrix

The dimension of the linear space spanned by the columns or rows of the matrix

Matrix product and linear combinations

Multiplying matrices is equivalent to taking linear combinations of their rows and columns

Inverse of a matrix

Multivariate generalization of the concept of reciprocal of a number

Matrix product and rank

This lecture note presents some useful facts about the rank of the product of two matrices

Schur complement

A device that helps to invert and factorize block matrices

Matrix inversion lemmas

Formulae for computing how changes in a matrix affect its inverse

Rank-one updates, Sherman-Morrison formula, Woodbury matrix identity.
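
For example, the Sherman-Morrison formula mentioned above gives the inverse of a rank-one update of an invertible matrix A: for column vectors u and v,

$$
(A+uv^{\top})^{-1}=A^{-1}-\frac{A^{-1}uv^{\top}A^{-1}}{1+v^{\top}A^{-1}u},
\qquad\text{provided that } 1+v^{\top}A^{-1}u\neq 0 .
$$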

Linear maps

Linear maps

Functions that preserve vector addition and scalar multiplication

Coordinate vectors

Vectors containing the coefficients of the representation in terms of a basis

Linear operators

Linear transformations that map a space into itself

Matrix of a linear map

Each linear map is associated to a unique matrix that transforms coordinates

Kernel of a linear map

The set of vectors belonging to the domain that are mapped into the zero vector

Composition of linear maps

The composition of two linear transformations is itself linear

Surjective, injective and bijective maps

Learn how to classify maps based on their kernel and range

Range of a linear map

The subset of the codomain formed by all the values taken by the map

Change of basis

Learn what happens to coordinate vectors when you switch to a different basis

Rank-nullity theorem

The dimension of the domain of a linear map equals the sum of the dimensions of its kernel and range

Projection matrix

The matrix of a linear operator which projects vectors onto a subspace

Any vector in a linear space can be represented as a coordinate vector with respect to a basis.
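
For example, the rank-nullity theorem listed above can be written compactly: for a linear map f with finite-dimensional domain,

$$
\dim(\operatorname{domain} f)=\dim(\ker f)+\dim(\operatorname{range} f),
$$

so an m×n matrix A satisfies n = nullity(A) + rank(A).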

Systems of linear equations

Equivalent systems of equations

Systems of linear equations having the same set of solutions

Matrices and linear systems

Systems of linear equations can be written compactly and easily studied with matrices

Augmented matrix

A compact way to represent systems of linear equations

Elementary row operations

Elementary operations used in matrix algebra to transform a linear system into an equivalent system

Gaussian elimination

The main algorithm used to reduce linear systems to row echelon form

Row echelon form

Systems of linear equations having this form can be easily solved with the back-substitution algorithm

Gauss-Jordan elimination

The standard algorithm used to transform linear systems to reduced row echelon form

Reduced row echelon form

Echelon form in which the basic columns are vectors of the standard basis

Non-homogeneous system

A system of equations in which the vector of constants is non-zero

Homogeneous system

A system of equations in which the vector of constants is zero

Elementary column operations

Operations that allow us to transform a linear system arranged horizontally into an equivalent system

Discover the differences between the Gaussian and Gauss-Jordan elimination algorithms.
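
To make the elimination entries above concrete, here is a minimal Python sketch of Gaussian elimination with partial pivoting followed by back-substitution. It assumes NumPy is available and a square, invertible coefficient matrix; the helper name solve_gaussian is illustrative only.

```python
# Minimal sketch: Gaussian elimination with partial pivoting + back-substitution.
# Assumes A is square and invertible; not a production solver.
import numpy as np

def solve_gaussian(A, b):
    M = np.hstack([A.astype(float), b.reshape(-1, 1).astype(float)])  # augmented matrix [A | b]
    n = len(b)
    # Forward phase: reduce to row echelon form with elementary row operations.
    for k in range(n):
        pivot = k + np.argmax(np.abs(M[k:, k]))   # partial pivoting
        M[[k, pivot]] = M[[pivot, k]]             # row interchange
        for i in range(k + 1, n):
            M[i] -= (M[i, k] / M[k, k]) * M[k]    # create zeros below the pivot
    # Backward phase: back-substitution on the triangular system.
    x = np.zeros(n)
    for i in reversed(range(n)):
        x[i] = (M[i, -1] - M[i, i + 1:n] @ x[i + 1:]) / M[i, i]
    return x

A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([3.0, 5.0])
print(solve_gaussian(A, b))    # [0.8 1.4]
print(np.linalg.solve(A, b))   # same result from NumPy's solver
```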

Special matrices and equivalence

Triangular matrix

A matrix that has all entries below (or above) the main diagonal equal to zero

Permutation matrix

A matrix used to perform multiple interchanges of rows and columns

Diagonal matrix

A matrix whose off-diagonal entries are all equal to zero

Elementary matrix

A matrix obtained by performing an elementary operation on an identity matrix

LU decomposition

How to write a matrix as a product of a lower and an upper triangular matrix

Row equivalence

How elementary row operations generate equivalence classes

A triangular matrix is invertible if and only if ...
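
The LU decomposition entry above can also be tried numerically. A minimal sketch, assuming SciPy is available; scipy.linalg.lu returns the factorization in the form A = P L U, which is equivalent to the PA = LU form with row interchanges.

```python
# Minimal sketch: LU factorization with row interchanges (assumes SciPy).
import numpy as np
from scipy.linalg import lu

A = np.array([[0.0, 2.0, 1.0],
              [1.0, 1.0, 0.0],
              [2.0, 0.0, 3.0]])      # the zero pivot in position (1,1) forces an interchange
P, L, U = lu(A)                      # P permutation, L unit lower triangular, U upper triangular
print(np.allclose(A, P @ L @ U))     # True
```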

Complex vectors and inner products

Conjugate transpose

Taking both the transpose and the complex conjugate of a matrix is very common in matrix algebra

Complex vectors and matrices

Basic facts and definitions about matrices whose entries are complex numbers

Vector norm

The norm of a vector generalizes the concept of length to abstract spaces

Inner product

A generalization of the concept of dot product to abstract vector spaces

Gram-Schmidt process

A procedure used in matrix algebra to create sets of orthonormal vectors

Orthonormal basis

A basis whose vectors are orthogonal and have unit norm

QR decomposition

A = QR, where Q has orthonormal columns and R is upper triangular

Unitary matrix

A complex matrix whose columns form an orthonormal set

Orthogonal projection

A special case of oblique projection that gives the closest vector in the subspace

Orthogonal complement

The subspace formed by all the vectors that are orthogonal to a given set

Householder matrix

A unitary matrix often used to transform another matrix into a simpler one

Four fundamental subspaces

The ranges and kernels of a matrix and its transpose are pairwise orthogonal complements

Givens rotation matrix

An orthogonal matrix that can be used to perform equivalent transformations

When the entries of a unitary matrix are all real, then it is called an orthogonal matrix.
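
To illustrate the Gram-Schmidt and QR entries above, here is a minimal Python sketch, assuming NumPy; the helper gram_schmidt is illustrative and assumes linearly independent input columns.

```python
# Minimal sketch of the Gram-Schmidt process (assumes NumPy and
# linearly independent columns of A).
import numpy as np

def gram_schmidt(A):
    """Return Q whose columns are an orthonormal basis of the column space of A."""
    Q = np.zeros_like(A, dtype=float)
    for j in range(A.shape[1]):
        v = A[:, j].astype(float)
        for i in range(j):
            v -= (Q[:, i] @ A[:, j]) * Q[:, i]   # remove the projection onto q_i
        Q[:, j] = v / np.linalg.norm(v)          # normalize
    return Q

A = np.array([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
Q = gram_schmidt(A)
print(np.allclose(Q.T @ Q, np.eye(2)))    # True: columns are orthonormal
Q_np, R = np.linalg.qr(A)                 # NumPy's QR also produces orthonormal columns
```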

Determinants

Determinant of a matrix

A number telling us how the associated linear transformation scales volumes

Sign of a permutation

A concept that pops up in the definition of the determinant of a matrix

Properties of the determinant

Discover several properties enjoyed by the determinant of a matrix

Determinants of elementary matrices

Determinants of elementary matrices enjoy some special properties

Laplace expansion, minors and cofactors

A formula for easily computing the determinant of a matrix

Determinant of a block matrix

Rules about the determinants of block matrices are very useful

Trace of a matrix

The trace of a matrix is the sum of the entries on its main diagonal

The determinant of a linear transformation tells us how the transformation affects areas and volumes.
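
For instance, in the 2x2 case

$$
\det\begin{bmatrix} a & b\\ c & d\end{bmatrix}=ad-bc,
\qquad
\det\begin{bmatrix} 2 & 1\\ 1 & 3\end{bmatrix}=2\cdot 3-1\cdot 1=5,
$$

so the transformation associated with the second matrix scales areas by a factor of 5.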

Polynomials

Division of polynomials

Dividend equals divisor times quotient plus remainder, obtained via the Division Algorithm

Polynomials

These lecture notes summarize some facts about polynomials that are important in matrix algebra

Polynomial gcd

The greatest common divisor of polynomials has properties similar to the gcd of integers

Learn how to perform the division of two polynomials using a tableau.
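
As a small example of the dividend = divisor × quotient + remainder identity above:

$$
x^{3}-2x^{2}+3x-1=(x-1)\,(x^{2}-x+2)+1,
$$

where the quotient is x^2 - x + 2 and the remainder is 1.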

Eigenvalues and eigenvectors

Characteristic polynomial

The polynomial whose roots are the eigenvalues of a matrix

Eigenvalues and eigenvectors

Linear transformations scale up or down the sides of certain parallelograms but do not change their angles

Linear independence of eigenvectors

Eigenvectors corresponding to distinct eigenvalues are linearly independent

Algebraic and geometric multiplicities

The multiplicity of a repeated eigenvalue and the dimension of its eigenspace

Similar matrix

Similar matrices have the same rank, trace, determinant and eigenvalues

Properties of eigenvalues

Eigenvalues and eigenvectors possess several useful properties that are also easy to derive

Schur decomposition

Any matrix is unitarily similar to an upper triangular matrix

Matrix diagonalization

Transformation of a matrix into another similar matrix that is diagonal

Positive definite matrix

A full-rank matrix whose eigenvalues are all strictly positive

Normal matrix

A matrix that commutes with its conjugate transpose and is unitarily diagonalizable

Singular value decomposition

Write a matrix as a product of a unitary, a diagonal and another unitary matrix

Cholesky decomposition

This lecture explains how to factorize a matrix into a lower triangular matrix and its conjugate transpose

Invariant subspace

A subspace that is mapped into itself by a linear operator

Decomposition | Conditions on A | Properties of the factors
A = LU | No row interchanges needed to reach row echelon form | L lower triangular, U upper triangular
PA = LU | No conditions | P permutation, L lower and U upper triangular
A = QR | Full rank | Q unitary, R upper triangular with diagonal entries > 0
A = PDP⁻¹ (diagonalization) | No defective eigenvalues | D diagonal, P invertible
A = QTQ* (Schur) | No conditions | T upper triangular, Q unitary
A = LL* (Cholesky) | Positive definite | L lower triangular with diagonal entries > 0
A = USV* (singular value) | No conditions | U and V unitary, S diagonal with entries >= 0
A = PJP⁻¹ (Jordan) | No conditions | J in Jordan form, P invertible
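
Some of the factorizations in the table can be checked numerically. A minimal sketch, assuming NumPy is available, applied to a symmetric positive definite matrix so that all three factorizations below exist:

```python
# Minimal numerical check of three factorizations from the table (assumes NumPy).
import numpy as np

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])                         # symmetric positive definite

eigvals, P = np.linalg.eig(A)                      # diagonalization A = P D P^-1
D = np.diag(eigvals)
print(np.allclose(A, P @ D @ np.linalg.inv(P)))    # True

L = np.linalg.cholesky(A)                          # Cholesky A = L L*
print(np.allclose(A, L @ L.T.conj()))              # True

U, s, Vh = np.linalg.svd(A)                        # singular value decomposition A = U S V*
print(np.allclose(A, U @ np.diag(s) @ Vh))         # True
```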

Matrix polynomials

Range null-space decomposition

A certain power of a matrix can be used to decompose a space of vectors

Matrix power

Discover what happens to the null and column spaces of a matrix when you raise it to integer powers

Cayley-Hamilton theorem

If you turn the characteristic polynomial into a matrix polynomial by plugging in the matrix itself, you get the zero matrix

Matrix polynomial

Matrix powers can be used to construct polynomials, similarly to the scalar case

Primary Decomposition Theorem

These lecture notes explain the most important application of the minimal polynomial

Minimal polynomial

The annihilating polynomial having the lowest possible degree

Cyclic subspace

Nilpotent matrices generate strings of linearly independent vectors

Nilpotent matrix

A matrix that becomes equal to the zero matrix if raised to a sufficiently high power

Discover a generalization of the concept of an eigenvector.
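
The Cayley-Hamilton entry above can also be checked numerically. A minimal sketch, assuming NumPy; np.poly returns the coefficients of the characteristic polynomial, which are then evaluated at the matrix itself with a Horner scheme:

```python
# Minimal numerical check of the Cayley-Hamilton theorem (assumes NumPy).
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
coeffs = np.poly(A)                    # characteristic polynomial coefficients (highest degree first)
p_of_A = np.zeros_like(A)
for c in coeffs:                       # Horner evaluation of the matrix polynomial p(A)
    p_of_A = p_of_A @ A + c * np.eye(2)
print(np.allclose(p_of_A, np.zeros((2, 2))))   # True: p(A) is the zero matrix
```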

Jordan form

Jordan chain

A string of generalized eigenvectors ending with an ordinary eigenvector

Generalized eigenvector

A vector that can be used to complete a basis of eigenvectors when the matrix is defective

Jordan form

Any matrix is similar to an almost diagonal matrix, said to be in Jordan form
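
A minimal sketch, assuming SymPy is available, of computing the Jordan form of a defective matrix symbolically:

```python
# Minimal sketch: Jordan form with SymPy (assumed available).
import sympy as sp

A = sp.Matrix([[2, 1],
               [0, 2]])                   # defective: eigenvalue 2, only one eigenvector
P, J = A.jordan_form()                    # A = P J P^-1 with J in Jordan form
print(J)                                  # Matrix([[2, 1], [0, 2]])
print(sp.simplify(P * J * P.inv() - A))   # zero matrix
```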

A first course may stop here.

Kronecker products and vectorizations

Properties of Kronecker products

This lecture note presents several useful properties of the Kronecker product

Kronecker product

A big matrix that contains all the products of the entries of two matrices

Commutation matrix

A permutation matrix used to transpose vectorizations and commute Kronecker products

Vec operator

An operator that transforms any matrix into a column vector
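
The Kronecker product and the vec operator listed above work together through the identity vec(AXB) = (Bᵀ ⊗ A) vec(X). A minimal numerical check, assuming NumPy; the helper vec simply stacks the columns of a matrix:

```python
# Minimal check of vec(AXB) = (B^T kron A) vec(X) (assumes NumPy).
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))
X = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 5))

def vec(M):
    return M.reshape(-1, order="F")              # stack the columns of M

K = np.kron(B.T, A)                              # Kronecker product B^T kron A
print(np.allclose(vec(A @ X @ B), K @ vec(X)))   # True
```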

Matrix functions

Matrix function

How to apply scalar functions such as the exponential to square matrices
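
For example, for a diagonalizable matrix the matrix exponential can be computed by applying exp to the eigenvalues. A minimal sketch, assuming SciPy is available for the reference value:

```python
# Minimal sketch: matrix exponential of a diagonalizable matrix (assumes NumPy/SciPy).
import numpy as np
from scipy.linalg import expm

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])                     # distinct eigenvalues, hence diagonalizable
eigvals, P = np.linalg.eig(A)
via_eig = P @ np.diag(np.exp(eigvals)) @ np.linalg.inv(P)
print(np.allclose(expm(A), via_eig))           # True
```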

Applications

Discrete Fourier Transform

A workhorse of engineering and a useful application of matrix algebra
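
A minimal sketch, assuming NumPy, of the Discrete Fourier Transform as a matrix-vector product: building the DFT matrix with entries exp(-2πi jk/N) and comparing with np.fft.fft.

```python
# Minimal sketch: the DFT as multiplication by the DFT matrix (assumes NumPy).
import numpy as np

N = 8
n = np.arange(N)
F = np.exp(-2j * np.pi * np.outer(n, n) / N)   # N x N DFT matrix
x = np.random.default_rng(0).standard_normal(N)
print(np.allclose(F @ x, np.fft.fft(x)))       # True
```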

The books

Most of the learning materials found on this website are now available in a traditional textbook format.