
Linear spaces

by Marco Taboga, PhD

Linear spaces (or vector spaces) are sets that are closed with respect to linear combinations.

In other words, a given set $S$ is a linear space if its elements can be multiplied by scalars and added together, and the results of these algebraic operations are elements that still belong to $S$.


A first informal and somewhat restrictive definition

Linear spaces are defined in a formal and very general way by enumerating the properties that the two algebraic operations performed on the elements of the spaces (addition and multiplication by scalars) need to satisfy.

In order to gradually build some intuition, we start with a narrower approach, and we limit our attention to sets whose elements are matrices (or column and row vectors).

Furthermore, we do not formally enumerate the properties of addition and multiplication by scalars because these have already been derived in previous lectures (see Matrix addition and Multiplication of a matrix by a scalar).

After this informal presentation, we report a fully general and rigorous definition of vector space.

Definition Let $S$ be a set of matrices such that all the matrices in $S$ have the same dimension. $S$ is a linear space if and only if, for any two matrices $A$ and $B$ belonging to $S$ and any two scalars $\alpha$ and $\beta$, the linear combination $\alpha A+\beta B$ also belongs to $S$.

In other words, when $S$ is a linear space, if you take any two matrices belonging to $S$, you multiply each of them by a scalar, and you add together the products thus obtained, then you have a linear combination, which is also a matrix belonging to $S$.

Example Let $S$ be the set of all $2\times 1$ column vectors whose entries are real numbers. Consider two vectors $A$ and $B$ belonging to $S$. Denote by $A_{1}$ and $A_{2}$ the two entries of $A$, and by $B_{1}$ and $B_{2}$ the two entries of $B$. A linear combination of $A$ and $B$ having two real numbers $\alpha$ and $\beta$ as coefficients can be written as
$$\alpha A+\beta B=\alpha\begin{bmatrix}A_{1}\\A_{2}\end{bmatrix}+\beta\begin{bmatrix}B_{1}\\B_{2}\end{bmatrix}=\begin{bmatrix}\alpha A_{1}+\beta B_{1}\\\alpha A_{2}+\beta B_{2}\end{bmatrix}$$
But $\alpha A_{1}+\beta B_{1}$ and $\alpha A_{2}+\beta B_{2}$ are real numbers because products and sums of real numbers are also real numbers. Therefore, the two entries of the vector $\alpha A+\beta B$ are real numbers, which implies that the vector belongs to $S$. Since this is true for any couple of coefficients $\alpha$ and $\beta$, $S$ is a linear space.
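The closure argument in the example can be illustrated numerically in Python. This is a sketch for one choice of vectors and coefficients (an illustration, not a proof): the linear combination is computed entrywise and its entries are again real numbers.

```python
# Numerical illustration (not a proof): a linear combination of two
# real 2x1 vectors is again a real 2x1 vector, computed entrywise.
A = [1.5, -2.0]          # entries A_1, A_2
B = [0.5, 3.0]           # entries B_1, B_2
alpha, beta = 2.0, -1.0  # arbitrary real coefficients

# k-th entry of alpha*A + beta*B is alpha*A_k + beta*B_k
combo = [alpha * A[i] + beta * B[i] for i in range(2)]
print(combo)  # both entries are real numbers, so combo belongs to S
```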

Fields

Before giving a rigorous definition of a vector space, we need to introduce fields, which are the sets of scalars used in the multiplication of vectors by scalars.

Definition Let F be a set equipped with two binary operations: addition, denoted by $+$, which maps $F\times F$ into F, and multiplication, denoted by $\cdot$, which maps $F\times F$ into F. The set F is said to be a field if and only if, for any $a,b,c\in F$, the following properties hold:

  • Associativity of addition: $\left(a+b\right)+c=a+\left(b+c\right)$

  • Commutativity of addition: $a+b=b+a$

  • Additive identity: there exists an element of F, denoted by 0, such that $a+0=a$

  • Additive inverse: for each $a$, there exists an element of F, denoted by $-a$, such that $a+\left(-a\right)=0$

  • Associativity of multiplication: $\left(a\cdot b\right)\cdot c=a\cdot\left(b\cdot c\right)$

  • Commutativity of multiplication: $a\cdot b=b\cdot a$

  • Multiplicative identity: there exists an element of F, denoted by 1, such that $1\cdot a=a$

  • Multiplicative inverse: for each $a\neq 0$, there exists an element of F, denoted by $a^{-1}$, such that $a^{-1}\cdot a=1$

  • Distributive property: $a\cdot\left(b+c\right)=a\cdot b+a\cdot c$

As you can see, these are the usual properties satisfied by the addition and multiplication of real numbers, which we studied when we were in school. They are also satisfied by the addition and multiplication of complex numbers.

In other words, both $\mathbb{R}$ and $\mathbb{C}$, equipped with their usual operations, are fields. These are also the only two fields you will encounter in these lectures.

Nonetheless, the abstract definition is useful because it allows us to derive results that are valid for fields in general and that can be applied, when needed, both to $\mathbb{R}$ and to $\mathbb{C}$.
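The field axioms can be spot-checked on a few complex numbers using Python's built-in complex type. This is a check on one sample of elements, not a proof of the axioms; also note that floating-point arithmetic only satisfies them up to rounding, so the multiplicative-inverse check uses a tolerance.

```python
# Spot-check (not a proof) of the field axioms for complex numbers,
# using Python's built-in complex type on a small sample.
a, b, c = 1 + 2j, -3 + 0.5j, 2 - 1j

assert (a + b) + c == a + (b + c)      # associativity of addition
assert a + b == b + a                  # commutativity of addition
assert a + 0 == a                      # additive identity
assert a + (-a) == 0                   # additive inverse
assert (a * b) * c == a * (b * c)      # associativity of multiplication
assert a * b == b * a                  # commutativity of multiplication
assert 1 * a == a                      # multiplicative identity
assert abs(a ** -1 * a - 1) < 1e-12    # multiplicative inverse (up to rounding)
assert a * (b + c) == a * b + a * c    # distributive property
print("all field axioms hold on this sample")
```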

The rigorous definition

We are now ready to define vector spaces.

Definition Let F be a field and let $S$ be a set equipped with an operation $+:S\times S\rightarrow S$, called vector addition, and another operation $\cdot:F\times S\rightarrow S$, called scalar multiplication. The set $S$ is said to be a linear space (or vector space) over F if and only if, for any $s,t,u\in S$ and any $a,b\in F$, the following properties hold:

  • Associativity of vector addition: $\left(s+t\right)+u=s+\left(t+u\right)$

  • Commutativity of vector addition: $s+t=t+s$

  • Additive identity: there exists a vector $0\in S$ such that $s+0=s$

  • Additive inverse: for each $s$, there exists an element of $S$, denoted by $-s$, such that $s+\left(-s\right)=0$

  • Compatibility of multiplications: $\left(a\cdot b\right)\cdot s=a\cdot\left(b\cdot s\right)$

  • Multiplicative identity: if $1\in F$ is the multiplicative identity in F, then $1\cdot s=s$

  • Distributive property w.r.t. vector addition: $a\cdot\left(s+t\right)=a\cdot s+a\cdot t$

  • Distributive property w.r.t. field addition: $\left(a+b\right)\cdot s=a\cdot s+b\cdot s$

The elements of a vector space are called vectors and those of its associated field are called scalars.

Note that, in the definition above, when we write $+:S\times S\rightarrow S$ and $\cdot:F\times S\rightarrow S$, we mean that the two operations are defined on all of F and $S$ and always give results in $S$.

Thus, we are implicitly assuming that $a\cdot s+b\cdot t\in S$ for any $s,t\in S$ and any $a,b\in F$, which is equivalent to the requirement of closure with respect to linear combinations made in our previous informal definition of vector space.

Note also that we have used the same symbols ($+$ and $\cdot$) for the operations defined on the field F and for those that equip the vector space. Which is which is always clear from the context.

As usual, the symbol $\cdot$ can be omitted, both in the context of fields and in that of vector spaces. So, $ab$ has the same meaning as $a\cdot b$, and $as$ has the same meaning as $a\cdot s$.

Moreover, the addition sign $+$ can be omitted when it is followed by the minus sign of an additive inverse. For example, $s-t$ has the same meaning as $s+\left(-t\right)$.

How the informal and the formal definition speak to each other

You can easily verify that any set of matrices (or column or row vectors) equipped with the two operations of matrix addition and multiplication of a matrix by a scalar satisfies all of the above properties, provided that the set is closed with respect to linear combinations.

Example Let $S$ be the space of all $K\times 1$ column vectors having real entries. The addition of two column vectors is defined in the usual manner and any real number can be used to perform the multiplication of vectors by scalars. Stated differently, $\mathbb{R}$ is the field of scalars. In the lectures on matrix addition and multiplication of a matrix by a scalar we have proved that the various associative, commutative and distributive properties listed above hold. The zero vector that satisfies the additive identity property is the $K\times 1$ vector whose entries are all equal to zero. By taking a linear combination of two vectors $A,B\in S$ with scalar coefficients $\alpha,\beta\in\mathbb{R}$, we obtain another $K\times 1$ vector $\alpha A+\beta B$ whose $k$-th entry is $\alpha A_{k}+\beta B_{k}$, where $A_{k}$ and $B_{k}$ denote the $k$-th entries of $A$ and $B$. Because products and sums of real numbers are also real numbers, $\alpha A_{k}+\beta B_{k}$ is a real number. This is true for any $k$. So, $\alpha A+\beta B$ is a $K\times 1$ column vector whose entries are all real numbers. But this means that $\alpha A+\beta B$ belongs to $S$. Thus, $S$ is closed with respect to linear combinations. Hence it is a linear space.

In other words, the informal and somewhat restrictive definition of vector space that we have provided at the beginning of this lecture is perfectly compatible with the more formal and broader definition given in this section.

Moreover, the first informal definition uses the term "scalars" without specifying the field over which the vector space is defined: the omission is intentional, as the vast majority of results presented in these lectures apply both to linear spaces over the real field $\mathbb{R}$ and to spaces over $\mathbb{C}$.

Example Up to now we have always dealt with real matrices, that is, matrices and vectors whose entries are real numbers. However, everything we have said applies also to complex matrices, that is, matrices whose entries are complex numbers. If we review all the definitions given in previous lectures, we will realize that nowhere have we specified that matrices must have real entries. An important difference is that, in the complex case, multiplication by scalars involves complex scalars, but everything else is a straightforward modification of the real case. For instance, we can take the previous example and replace 1) $K\times 1$ column vectors having real entries with $K\times 1$ column vectors having complex entries; 2) the field of scalars $\mathbb{R}$ with the field $\mathbb{C}$. We can leave everything else unchanged and we have a proof of the fact that the space $S$ of all $K\times 1$ column vectors having complex entries is a vector space over $\mathbb{C}$.

In the lecture on coordinate vectors, we will also show that the informal definition is much less restrictive than it seems: all the elements of a finite-dimensional vector space can be written as arrays of numbers, so that, in a sense, every finite-dimensional vector space fits the informal definition.

Example: polynomials

Let's now see an example of vector space that is not directly covered by the more restrictive definition, but is covered by the general definition we have just introduced.

Example A third-order polynomial is a function
$$p\left(z\right)=p_{0}+p_{1}z+p_{2}z^{2}+p_{3}z^{3}$$
where the coefficients $p_{0},p_{1},p_{2},p_{3}$ and the argument $z$ are scalars belonging to a field F. Consider the space $P$ of all third-order polynomials. Let us consider the addition of two polynomials, $p\left(z\right)$ defined above and $q\left(z\right)$ defined as follows:
$$q\left(z\right)=q_{0}+q_{1}z+q_{2}z^{2}+q_{3}z^{3}$$
The natural way to add them is
$$p\left(z\right)+q\left(z\right)=\left(p_{0}+q_{0}\right)+\left(p_{1}+q_{1}\right)z+\left(p_{2}+q_{2}\right)z^{2}+\left(p_{3}+q_{3}\right)z^{3}$$
Moreover, multiplication of a polynomial $p\left(z\right)$ by a scalar $a$ is performed as follows:
$$a\cdot p\left(z\right)=ap_{0}+ap_{1}z+ap_{2}z^{2}+ap_{3}z^{3}$$
It is easy to verify that $P$ is a vector space over F when it is equipped with the two operations of addition and multiplication by a scalar that we have just defined. Importantly, the additive identity property is satisfied by the polynomial whose coefficients are all equal to zero.
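Because both operations act coefficient by coefficient, third-order polynomials can be represented by their coefficient lists. The following sketch (an illustration, with hypothetical helper names `poly_add` and `poly_scale`) shows that addition and scalar multiplication of polynomials reduce to entrywise operations on the coefficients.

```python
# Sketch: represent a third-order polynomial p(z) = p0 + p1 z + p2 z^2 + p3 z^3
# by its coefficient list [p0, p1, p2, p3]. The two vector-space operations
# defined above then act entrywise on the coefficients.
def poly_add(p, q):
    """Coefficients of p(z) + q(z)."""
    return [pi + qi for pi, qi in zip(p, q)]

def poly_scale(a, p):
    """Coefficients of a * p(z)."""
    return [a * pi for pi in p]

p = [1.0, 0.0, -2.0, 3.0]   # p(z) = 1 - 2 z^2 + 3 z^3
q = [0.5, 4.0, 1.0, -1.0]   # q(z) = 0.5 + 4 z + z^2 - z^3

print(poly_add(p, q))       # coefficients of p(z) + q(z)
print(poly_scale(2.0, p))   # coefficients of 2 * p(z)
```

The zero polynomial, i.e. the coefficient list `[0.0, 0.0, 0.0, 0.0]`, plays the role of the additive identity in this representation.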

Linear subspace

An important concept is that of a linear subspace.

Definition Let $S$ be a linear space and $S_{1}$ a subset of $S$. $S_{1}$ is a linear subspace of $S$ if and only if $S_{1}$ is itself a linear space, that is, if and only if, for any two vectors $A,B\in S_{1}$ and any two scalars $\alpha$ and $\beta$, the linear combination $\alpha A+\beta B$ also belongs to $S_{1}$.

The following is a simple example of a linear subspace.

Example Let $S$ be the set of all $2\times 1$ column vectors whose entries are real numbers. We already know that $S$ is a linear space. Let $S_{1}$ be the subset of $S$ composed of all the elements of $S$ whose first entry is equal to 0. Consider two vectors $A$ and $B$ belonging to the subset $S_{1}$. Denote by $A_{1}$ and $A_{2}$ the two entries of $A$, and by $B_{1}$ and $B_{2}$ the two entries of $B$. By the definition of $S_{1}$, we have that $A_{1}=0$ and $B_{1}=0$. Therefore, a linear combination of $A$ and $B$ having two real numbers $\alpha$ and $\beta$ as coefficients can be written as
$$\alpha A+\beta B=\alpha\begin{bmatrix}0\\A_{2}\end{bmatrix}+\beta\begin{bmatrix}0\\B_{2}\end{bmatrix}=\begin{bmatrix}0\\\alpha A_{2}+\beta B_{2}\end{bmatrix}$$
Thus, the result of this linear combination is a vector whose first entry is equal to 0 and whose second entry is a real number (because products and sums of real numbers are also real numbers). Therefore, the vector $\alpha A+\beta B$ also belongs to $S_{1}$. Since this is true for any couple of coefficients $\alpha$ and $\beta$, $S_{1}$ is itself a linear space, and hence a linear subspace of $S$.
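The subspace property in the example can be checked numerically for one pair of vectors (an illustration, not a proof): the defining property of $S_{1}$, a zero first entry, is preserved by the linear combination.

```python
# Numerical illustration (not a proof): vectors in S_1 have first entry 0,
# and a linear combination of two such vectors keeps a zero first entry.
A = [0.0, 5.0]           # first entry 0, so A is in S_1
B = [0.0, -1.5]          # first entry 0, so B is in S_1
alpha, beta = 4.0, 2.0   # arbitrary real coefficients

combo = [alpha * A[i] + beta * B[i] for i in range(2)]
assert combo[0] == 0.0   # first entry stays zero, so combo is in S_1
print(combo)
```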

More than two vectors in the linear combination

A perhaps obvious fact is that linear spaces and subspaces are closed with respect to linear combinations of more than two vectors, as illustrated by the following proposition.

Proposition If $S$ is a linear (sub)space, then, for any $n$ vectors $A_{1},\ldots,A_{n}$ belonging to $S$ and any $n$ scalars $\alpha_{1},\ldots,\alpha_{n}$, the linear combination
$$\alpha_{1}A_{1}+\ldots+\alpha_{n}A_{n}$$
also belongs to $S$.

Proof

By assumption, closure with respect to linear combinations holds for $n=2$. We only need to prove that it holds for a generic $n$, given that it holds for $n-1$. In other words, we need to prove that
$$\alpha_{1}A_{1}+\ldots+\alpha_{n-1}A_{n-1}\in S$$
implies
$$\alpha_{1}A_{1}+\ldots+\alpha_{n}A_{n}\in S$$
Let us define
$$B=\alpha_{1}A_{1}+\ldots+\alpha_{n-1}A_{n-1}$$
By the induction hypothesis, $B\in S$. Now, we can write
$$\alpha_{1}A_{1}+\ldots+\alpha_{n}A_{n}=B+\alpha_{n}A_{n}$$
But $B+\alpha_{n}A_{n}$ is a linear combination of $B$ and $A_{n}$ (both belonging to $S$) with coefficients 1 and $\alpha_{n}$. Therefore, $B+\alpha_{n}A_{n}$ also belongs to $S$, which is what we needed to prove.
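The inductive construction in the proof can be mirrored in code: the $n$-term linear combination is built by repeatedly adding one scaled vector to a running sum, so each step is a two-vector linear combination. A minimal sketch (the helper name `linear_combination` is an assumption for illustration):

```python
# Sketch of the induction in the proof: build the n-term linear combination
# by repeatedly adding one scaled vector to a running sum, so each step is
# a two-vector combination with coefficients 1 and alpha_n.
def linear_combination(coeffs, vectors):
    """Return sum_i coeffs[i] * vectors[i] for equal-length vectors."""
    dim = len(vectors[0])
    combo = [0.0] * dim                  # start from the zero vector
    for a, v in zip(coeffs, vectors):
        # one inductive step: B + alpha_n * A_n, entrywise
        combo = [c + a * vi for c, vi in zip(combo, v)]
    return combo

vectors = [[1.0, 0.0], [0.0, 1.0], [2.0, -1.0]]
coeffs = [2.0, 3.0, -1.0]
print(linear_combination(coeffs, vectors))  # 2*[1,0] + 3*[0,1] - [2,-1]
```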

Solved exercises

Below you can find some exercises with explained solutions.

Exercise 1

Let $S$ be the set of all $2\times 1$ column vectors whose entries are real numbers.

Let $S_{1}$ be the subset of $S$ composed of all the elements of $S$ whose first entry is twice the second entry.

Show that $S_{1}$ is a linear subspace of $S$.

Solution

From previous examples we know that $S$ is a linear space. Now, take any two vectors $A$ and $B$ belonging to the subset $S_{1}$. Denote by $A_{1}$ and $A_{2}$ the two entries of $A$, and by $B_{1}$ and $B_{2}$ the two entries of $B$. By the definition of $S_{1}$, we have that $A_{1}=2A_{2}$ and $B_{1}=2B_{2}$. A linear combination of $A$ and $B$ with coefficients $\alpha$ and $\beta$ can be written as
$$\alpha A+\beta B=\alpha\begin{bmatrix}2A_{2}\\A_{2}\end{bmatrix}+\beta\begin{bmatrix}2B_{2}\\B_{2}\end{bmatrix}=\begin{bmatrix}2\left(\alpha A_{2}+\beta B_{2}\right)\\\alpha A_{2}+\beta B_{2}\end{bmatrix}$$
Thus, a linear combination of vectors belonging to $S_{1}$ gives as a result a vector whose second entry is a real number ($\alpha A_{2}+\beta B_{2}$ is a real number, because products and sums of real numbers are also real numbers) and whose first entry is twice the second entry. Therefore, the vector resulting from the linear combination also belongs to $S_{1}$. This is true for any couple of coefficients $\alpha$ and $\beta$. As a consequence, $S_{1}$ is itself a linear space, and hence a linear subspace of $S$.
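As a numerical check of this solution for one choice of vectors and coefficients (an illustration, not a proof): both vectors satisfy the defining property of $S_{1}$, and so does their linear combination.

```python
# Numerical check (not a proof) for Exercise 1: both vectors have first
# entry equal to twice the second, and so does their linear combination.
A = [4.0, 2.0]            # A_1 = 2 * A_2, so A is in S_1
B = [-6.0, -3.0]          # B_1 = 2 * B_2, so B is in S_1
alpha, beta = 1.5, 2.0    # arbitrary real coefficients

combo = [alpha * A[i] + beta * B[i] for i in range(2)]
assert combo[0] == 2 * combo[1]   # the defining property of S_1 is preserved
print(combo)
```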

Exercise 2

Let $A$ be a $2\times 2$ matrix. Let $S$ be the set of all $2\times 1$ vectors $x$ that satisfy the equation
$$Ax=0$$

Show that $S$ is a linear space.

Solution

Consider a linear combination of two vectors $x_{1}$ and $x_{2}$ belonging to $S$ with coefficients $\alpha_{1}$ and $\alpha_{2}$:
$$\alpha_{1}x_{1}+\alpha_{2}x_{2}$$
By the distributive property of matrix multiplication, the product of $A$ and this linear combination can be written as
$$A\left(\alpha_{1}x_{1}+\alpha_{2}x_{2}\right)=\alpha_{1}Ax_{1}+\alpha_{2}Ax_{2}$$
Because $x_{1}$ and $x_{2}$ belong to $S$, we have that
$$Ax_{1}=0\quad\text{and}\quad Ax_{2}=0$$
As a consequence,
$$A\left(\alpha_{1}x_{1}+\alpha_{2}x_{2}\right)=\alpha_{1}\cdot 0+\alpha_{2}\cdot 0=0$$
Thus, the linear combination $\alpha_{1}x_{1}+\alpha_{2}x_{2}$ also belongs to $S$, because it satisfies the equation that all vectors of $S$ need to satisfy. This is true for any couple of vectors $x_{1}$ and $x_{2}$ and for any couple of coefficients $\alpha_{1}$ and $\alpha_{2}$, which implies that $S$ is a linear space.
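The solution can be illustrated numerically with a concrete singular matrix (one example, not a proof): two solutions of $Ax=0$ are combined linearly and the result still solves the equation.

```python
# Numerical illustration (not a proof) of Exercise 2: if A x1 = 0 and
# A x2 = 0, then A (a1 x1 + a2 x2) = 0 as well.
A = [[1.0, 2.0],
     [2.0, 4.0]]          # second row = 2 * first row, so Ax = 0 has nonzero solutions
x1 = [2.0, -1.0]          # satisfies A x1 = 0
x2 = [-4.0, 2.0]          # satisfies A x2 = 0

def matvec(M, v):
    """Matrix-vector product computed entrywise."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

a1, a2 = 3.0, 0.5
combo = [a1 * x1[i] + a2 * x2[i] for i in range(2)]
print(matvec(A, combo))   # the zero vector, so combo belongs to S
```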

Exercise 3

Let $S$ be the set of all $3\times 1$ real column vectors.

Let $S_{1}$ be the set of all the elements of $S$ whose first entry is equal to 0 and whose second entry is equal to 1.

Verify whether $S_{1}$ is a linear subspace of $S$.

Solution

Consider two vectors $A$ and $B$ belonging to the subset $S_{1}$. Denote by $A_{1}$, $A_{2}$ and $A_{3}$ the three entries of $A$, and by $B_{1}$, $B_{2}$ and $B_{3}$ the three entries of $B$. By the definition of $S_{1}$, we have that $A_{1}=0$, $A_{2}=1$, $B_{1}=0$ and $B_{2}=1$. A linear combination of $A$ and $B$ with coefficients $\alpha$ and $\beta$ can be written as
$$\alpha A+\beta B=\alpha\begin{bmatrix}0\\1\\A_{3}\end{bmatrix}+\beta\begin{bmatrix}0\\1\\B_{3}\end{bmatrix}=\begin{bmatrix}0\\\alpha+\beta\\\alpha A_{3}+\beta B_{3}\end{bmatrix}$$
The second entry of the linear combination ($\alpha+\beta$) is not necessarily equal to 1. Hence, the vector $\alpha A+\beta B$ does not belong to $S_{1}$ for some choices of the coefficients $\alpha$ and $\beta$ (for example, $\alpha=\beta=1$). Therefore, $S_{1}$ is not a linear subspace of $S$.
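The counterexample can be made concrete in a few lines of Python: with $\alpha=\beta=1$, the second entry of the linear combination is 2, so the result leaves $S_{1}$.

```python
# Concrete counterexample for Exercise 3: with alpha = beta = 1, the
# second entry of alpha*A + beta*B is alpha + beta = 2, not 1.
A = [0.0, 1.0, 2.0]       # in S_1: first entry 0, second entry 1
B = [0.0, 1.0, -3.0]      # in S_1: first entry 0, second entry 1
alpha, beta = 1.0, 1.0

combo = [alpha * A[i] + beta * B[i] for i in range(3)]
print(combo)              # second entry is 2.0, so combo is not in S_1
assert combo[1] != 1.0
```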

How to cite

Please cite as:

Taboga, Marco (2021). "Linear spaces", Lectures on matrix algebra. https://www.statlect.com/matrix-algebra/linear-spaces.
