The orthogonal projection of a vector $s$ onto a given subspace $S_1$ is the vector of $S_1$ that is closest to $s$.
Before explaining orthogonal projections, we are going to review some important concepts.
Let $S$ be a vector space. Remember that two vectors $s_1$ and $s_2$ belonging to $S$ are orthogonal when their inner product is zero: $$\langle s_1, s_2 \rangle = 0.$$
Let $S_1$ be a subspace of $S$. The orthogonal complement of $S_1$, denoted by $S_1^{\perp}$, is the subspace of all vectors of $S$ that are orthogonal to every vector of $S_1$: $$S_1^{\perp} = \left\{ s \in S : \langle s, s_1 \rangle = 0 \text{ for all } s_1 \in S_1 \right\}.$$
The two subspaces $S_1$ and $S_1^{\perp}$ are complementary subspaces, which means that $$S = S_1 \oplus S_1^{\perp},$$ where $\oplus$ denotes a direct sum. By the properties of direct sums, any vector $s \in S$ can be uniquely written as $$s = s_1 + s_2,$$ where $s_1 \in S_1$ and $s_2 \in S_1^{\perp}$.
We can now define orthogonal projections.
Definition Let $S$ be a linear space. Let $S_1$ be a subspace of $S$ and $S_1^{\perp}$ its orthogonal complement. Let $s \in S$ with its unique decomposition $$s = s_1 + s_2,$$ in which $s_1 \in S_1$ and $s_2 \in S_1^{\perp}$. Then, the vector $s_1$ is called the orthogonal projection of $s$ onto $S_1$ and it is denoted by $P_{S_1} s$.
Thus, the orthogonal projection is a special case of the so-called oblique projection, which is defined as above, but without the requirement that the complementary subspace of $S_1$ be an orthogonal complement.
Example Let $S$ be the space of $3 \times 1$ column vectors. Define $$S_1 = \operatorname{span}\left( \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix} \right).$$ Its orthogonal complement is $$S_1^{\perp} = \operatorname{span}\left( \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} \right),$$ as we can easily verify by checking that the vector spanning $S_1^{\perp}$ is orthogonal to the two vectors spanning $S_1$. Now, consider the vector $$s = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}.$$ Then, $$P_{S_1} s = \begin{bmatrix} 1 \\ 2 \\ 0 \end{bmatrix}, \qquad s_2 = s - P_{S_1} s = \begin{bmatrix} 0 \\ 0 \\ 3 \end{bmatrix} \in S_1^{\perp}.$$
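The claims in this example are easy to check numerically. The following sketch (the NumPy code and variable names are ours, not part of the lecture) verifies the orthogonality of the spanning vectors and the decomposition of the vector used above:

```python
import numpy as np

# Vectors spanning S_1 and its orthogonal complement, as in the example above
b1 = np.array([1.0, 0.0, 0.0])
b2 = np.array([0.0, 1.0, 0.0])
c = np.array([0.0, 0.0, 1.0])   # spans S_1-perp

# The vector spanning S_1-perp is orthogonal to both vectors spanning S_1
assert b1 @ c == 0 and b2 @ c == 0

s = np.array([1.0, 2.0, 3.0])
s1 = np.array([1.0, 2.0, 0.0])  # orthogonal projection of s onto S_1
s2 = np.array([0.0, 0.0, 3.0])  # component lying in S_1-perp

assert np.allclose(s, s1 + s2)          # the decomposition s = s_1 + s_2
assert s2 @ b1 == 0 and s2 @ b2 == 0    # s_2 is orthogonal to S_1
```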
The distance between two vectors $s$ and $t$ is measured by the norm of their difference, $\left\Vert s - t \right\Vert$.
It turns out that $P_{S_1} s$ is the vector of $S_1$ that is closest to $s$.
Proposition Let $S$ be a finite-dimensional vector space. Let $S_1$ be a subspace of $S$. Then, for any $s \in S$ and any $s_1 \in S_1$, $$\left\Vert s - P_{S_1} s \right\Vert \leq \left\Vert s - s_1 \right\Vert.$$
Proof Since $$s = P_{S_1} s + s_2,$$ where $s_2 \in S_1^{\perp}$, the vector $s - P_{S_1} s = s_2$ belongs to $S_1^{\perp}$ and, as a consequence, is orthogonal to any vector belonging to $S_1$, including the vector $P_{S_1} s - s_1$. Therefore, $$\left\Vert s - s_1 \right\Vert^2 = \left\Vert (s - P_{S_1} s) + (P_{S_1} s - s_1) \right\Vert^2 = \left\Vert s - P_{S_1} s \right\Vert^2 + \left\Vert P_{S_1} s - s_1 \right\Vert^2 \geq \left\Vert s - P_{S_1} s \right\Vert^2,$$ where in the second step we have used Pythagoras' theorem. By taking the square root of both sides, we obtain the stated result.
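The minimization property can also be verified numerically. The sketch below is an illustration under assumed data: it builds a random subspace $S_1$ of $\mathbb{R}^5$, computes the orthogonal projection with the matrix formula derived later in this lecture, and checks that no other point of $S_1$ is closer to $s$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Columns of B1 span a 2-dimensional subspace S_1 of R^5
B1 = rng.standard_normal((5, 2))

# Orthogonal projection matrix onto S_1 (formula derived later in this lecture)
P = B1 @ np.linalg.inv(B1.T @ B1) @ B1.T

s = rng.standard_normal(5)
proj = P @ s

# No point of S_1 is closer to s than its orthogonal projection
for _ in range(1000):
    s1 = B1 @ rng.standard_normal(2)   # a generic point of S_1
    assert np.linalg.norm(s - proj) <= np.linalg.norm(s - s1) + 1e-12
```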
Suppose that $S$ is the space of $K \times 1$ complex vectors and $S_1$ is a subspace of $S$.
By the results demonstrated in the lecture on projection matrices (that are valid for oblique projections and, hence, for the special case of orthogonal projections), there exists a projection matrix $P$ such that $$P s = P_{S_1} s$$ for any $s \in S$.
The projection matrix is $$P = \begin{bmatrix} B_1 & 0 \end{bmatrix} \begin{bmatrix} B_1 & B_2 \end{bmatrix}^{-1},$$ where:
$B_1$ is any matrix whose columns form a basis for $S_1$;
$B_2$ is any matrix whose columns form a basis for $S_1^{\perp}$ (a numerical sketch of this formula is given below).
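As a minimal sketch (with made-up bases, so the matrices below are illustrative assumptions), the formula can be implemented directly and checked against the defining properties of a projection:

```python
import numpy as np

rng = np.random.default_rng(1)

# Columns of B1 span S_1; columns of B2 span its orthogonal complement in R^4
B1 = rng.standard_normal((4, 2))
B2 = np.linalg.svd(B1.T)[2][2:].T        # basis of the null space of B1^T, i.e. of S_1-perp

B = np.hstack([B1, B2])                  # [B1  B2] is invertible
P = np.hstack([B1, np.zeros_like(B2)]) @ np.linalg.inv(B)   # [B1  0] [B1  B2]^(-1)

s = rng.standard_normal(4)
proj = P @ s

assert np.allclose(P @ P, P)              # P is idempotent, as any projection matrix
assert np.allclose(B1.T @ (s - proj), 0)  # the residual s - P s lies in S_1-perp
```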
In the case of orthogonal projections, the formula above becomes simpler.
Proposition Let $S$ be the space of $K \times 1$ complex vectors. Let $S_1$ be a subspace of $S$. Let $B_1$ be a matrix whose columns form a basis for $S_1$. Denote by $B_1^{*}$ the conjugate transpose of $B_1$. Then, the matrix $$P = B_1 \left( B_1^{*} B_1 \right)^{-1} B_1^{*}$$ is the projection matrix such that $P s = P_{S_1} s$ for any $s \in S$.
Proof We choose the columns of $B_2$ in such a way that they form an orthonormal basis for $S_1^{\perp}$. As a consequence, as explained in the lecture on unitary matrices (see the section on non-square matrices with orthonormal columns), we have $$B_2^{*} B_2 = I,$$ where $B_2^{*}$ denotes the conjugate transpose of $B_2$. Moreover, since the columns of $B_2$ are orthogonal to the columns of $B_1$, we have $$B_1^{*} B_2 = 0$$ and $$B_2^{*} B_1 = 0.$$ The columns of $B_1$ are linearly independent since they form a basis. Hence, $B_1 x \neq 0$ for any $x \neq 0$, which implies that $x^{*} B_1^{*} B_1 x = \left\Vert B_1 x \right\Vert^2 > 0$ for any $x \neq 0$. Thus, $B_1^{*} B_1$ is full-rank (hence invertible). We use these results to derive the following equality: $$\begin{bmatrix} \left( B_1^{*} B_1 \right)^{-1} B_1^{*} \\ B_2^{*} \end{bmatrix} \begin{bmatrix} B_1 & B_2 \end{bmatrix} = \begin{bmatrix} \left( B_1^{*} B_1 \right)^{-1} B_1^{*} B_1 & \left( B_1^{*} B_1 \right)^{-1} B_1^{*} B_2 \\ B_2^{*} B_1 & B_2^{*} B_2 \end{bmatrix} = \begin{bmatrix} I & 0 \\ 0 & I \end{bmatrix},$$ which implies, by the definition of inverse matrix, that $$\begin{bmatrix} B_1 & B_2 \end{bmatrix}^{-1} = \begin{bmatrix} \left( B_1^{*} B_1 \right)^{-1} B_1^{*} \\ B_2^{*} \end{bmatrix}.$$ Thus, $$P = \begin{bmatrix} B_1 & 0 \end{bmatrix} \begin{bmatrix} B_1 & B_2 \end{bmatrix}^{-1} = \begin{bmatrix} B_1 & 0 \end{bmatrix} \begin{bmatrix} \left( B_1^{*} B_1 \right)^{-1} B_1^{*} \\ B_2^{*} \end{bmatrix} = B_1 \left( B_1^{*} B_1 \right)^{-1} B_1^{*}.$$
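A quick numerical confirmation of the proposition, with an arbitrary complex basis (all example values are assumptions made for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

# Columns of B1 form a basis for a 2-dimensional subspace S_1 of C^4
B1 = rng.standard_normal((4, 2)) + 1j * rng.standard_normal((4, 2))
B1H = B1.conj().T                        # conjugate transpose B1*

# P = B1 (B1* B1)^(-1) B1*
P = B1 @ np.linalg.inv(B1H @ B1) @ B1H

s = rng.standard_normal(4) + 1j * rng.standard_normal(4)
proj = P @ s

assert np.allclose(P @ P, P)             # idempotent
assert np.allclose(P.conj().T, P)        # Hermitian: orthogonal projections are self-adjoint
assert np.allclose(B1H @ (s - proj), 0)  # residual orthogonal to S_1
```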
When we confine our attention to real vectors, conjugate transposition becomes simple transposition and the formula for the projection matrix becomes $$P = B_1 \left( B_1^{T} B_1 \right)^{-1} B_1^{T},$$ which might be familiar to those of us who have previously dealt with linear regressions and the OLS estimator.
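In the regression setting, this projection matrix is the so-called hat matrix: projecting the vector of observations onto the column space of the design matrix yields the OLS fitted values. A short sketch under assumed (randomly generated) data:

```python
import numpy as np

rng = np.random.default_rng(3)

X = rng.standard_normal((50, 3))             # design matrix with linearly independent columns
y = rng.standard_normal(50)                  # vector of observations

H = X @ np.linalg.inv(X.T @ X) @ X.T         # projection ("hat") matrix onto the column space of X
beta_ols = np.linalg.inv(X.T @ X) @ X.T @ y  # OLS coefficient estimates

assert np.allclose(H @ y, X @ beta_ols)      # fitted values = orthogonal projection of y
```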
When the columns of the matrix $B_1$ are orthonormal, we have a further simplification: $B_1^{*} B_1 = I$ and $$P = B_1 B_1^{*}.$$
Denote by $b_1, \ldots, b_L$ the columns of $B_1$.
Then, for any $s \in S$, we have $$P_{S_1} s = B_1 B_1^{*} s = \sum_{l=1}^{L} \left( b_l^{*} s \right) b_l = \sum_{l=1}^{L} \langle s, b_l \rangle \, b_l,$$ which is the formula for projections onto orthonormal sets that we have already encountered in the lectures on the Gram-Schmidt process and on the QR decomposition.
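The equivalence between the matrix formula and the sum of inner products can be checked as follows; the orthonormal basis is obtained here from a QR factorization of an arbitrary (assumed) matrix:

```python
import numpy as np

rng = np.random.default_rng(4)

# Orthonormal basis for a 3-dimensional subspace of C^6, taken from a QR factorization
A = rng.standard_normal((6, 3)) + 1j * rng.standard_normal((6, 3))
B1, _ = np.linalg.qr(A)                  # columns of B1 are orthonormal

s = rng.standard_normal(6) + 1j * rng.standard_normal(6)

# Matrix formula: P s = B1 B1* s
proj_matrix = B1 @ (B1.conj().T @ s)

# Sum of inner products: sum_l <s, b_l> b_l with <s, b_l> = b_l* s
proj_sum = sum((B1[:, l].conj() @ s) * B1[:, l] for l in range(B1.shape[1]))

assert np.allclose(proj_matrix, proj_sum)
```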
Please cite as:
Taboga, Marco (2021). "Orthogonal projection", Lectures on matrix algebra. https://www.statlect.com/matrix-algebra/orthogonal-projection.