
Sequences of random vectors and their convergence

In this lecture, we generalize the concepts introduced in the lecture entitled Sequences of random variables and their convergence. Instead of sequences whose elements are random variables, we now consider sequences [eq1] whose generic element X_n is a Kx1 random vector. The generalization is straightforward, as the terminology and the basic concepts are almost the same as those used for sequences of random variables.


Terminology

Realization of a sequence

Let [eq2] be a sequence of Kx1 real vectors and [eq3] a sequence of Kx1 random vectors. If the real vector $x_{n}$ is a realization of the random vector X_n for every n, then we say that the sequence of real vectors [eq4] is a realization of the sequence of random vectors [eq5] and we write[eq6]
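The distinction between a sequence of random vectors and one of its realizations can be made concrete in a short simulation. The following is a minimal sketch (not part of the lecture, and all names are our own choices): each X_n is taken to be a 2x1 standard normal random vector, and fixing the seed of the generator plays the role of fixing the sample point, so the loop produces the corresponding sequence of real vectors x_1, x_2, ...

```python
import numpy as np

K = 2   # dimension of each random vector (illustrative choice)
N = 5   # number of terms of the sequence to realize

# Fixing the seed plays the role of fixing the sample point omega:
# re-running with the same seed yields the same realization.
rng = np.random.default_rng(0)

# One realization of the sequence: a list of K x 1 real vectors.
realization = [rng.standard_normal(K) for _ in range(N)]

for n, x_n in enumerate(realization, start=1):
    print(f"x_{n} =", x_n)
```

Re-running the script with a different seed produces a different realization of the same sequence of random vectors.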

Sequences on a sample space

Let Omega be a sample space. Let [eq5] be a sequence of random vectors. We say that [eq8] is a sequence of random vectors defined on the sample space Omega if all the random vectors X_n belonging to the sequence [eq5] are functions from Omega to $\mathbb{R}^{K}$.

Independent sequences

Let [eq5] be a sequence of random vectors defined on a sample space Omega. We say that [eq5] is an independent sequence of random vectors (or a sequence of independent random vectors) if every finite subset of [eq5] (i.e. every finite subset of random vectors belonging to the sequence) is a set of mutually independent random vectors.

Identically distributed sequences

Let [eq5] be a sequence of random vectors. Denote by [eq14] the joint distribution function of a generic element of the sequence X_n. We say that [eq15] is a sequence of identically distributed random vectors if any two elements of the sequence have the same joint distribution function:[eq16]

IID sequences

Let [eq5] be a sequence of random vectors defined on a sample space Omega. We say that [eq5] is a sequence of independent and identically distributed random vectors (or an IID sequence of random vectors), if [eq5] is both a sequence of independent random vectors and a sequence of identically distributed random vectors.
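An IID sequence of random vectors is easy to simulate: draw each term independently from one fixed joint distribution. The sketch below (our own example; the mean vector mu and covariance matrix Sigma are illustrative choices) draws an IID sequence of 3x1 multivariate normal vectors and checks that the empirical mean and covariance are close to the true ones.

```python
import numpy as np

rng = np.random.default_rng(42)

# One fixed joint distribution for every term of the sequence.
mu = np.array([0.0, 1.0, -1.0])
Sigma = np.array([[1.0, 0.2, 0.0],
                  [0.2, 1.0, 0.3],
                  [0.0, 0.3, 1.0]])

# Independent draws from the same distribution => IID by construction.
# Each row of X is one term X_n of the sequence.
X = rng.multivariate_normal(mu, Sigma, size=10_000)

# Empirical moments should be close to mu and Sigma.
print(np.round(X.mean(axis=0), 1))
print(np.round(np.cov(X, rowvar=False), 1))
```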

Stationary sequences

Let [eq5] be a sequence of random vectors defined on a sample space Omega. Take a first group of $q$ successive terms of the sequence $X_{n+1}$, ..., $X_{n+q}$. Now take a second group of $q$ successive terms of the sequence $X_{n+k+1}$, ..., $X_{n+k+q}$. The second group is located k positions after the first group. Denote the joint distribution function of the first group of terms by[eq21]and the joint distribution function of the second group of terms by[eq22]

The sequence [eq5] is said to be stationary (or strictly stationary) if and only if[eq24]for any $n, k, q \in \mathbb{N}$ and for any vector [eq25].

In other words, a sequence is strictly stationary if and only if the two random vectors [eq26] and [eq27] have the same distribution (for any n, k and $q$). Requiring strict stationarity is weaker than requiring that a sequence be IID (see IID sequences above): if [eq5] is an IID sequence, then it is also strictly stationary, while the converse is not necessarily true.
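A standard example of a strictly stationary sequence that is not IID is the constant sequence: draw a single random vector Z once and set X_n = Z for every n. Any group of q successive terms is then (Z, ..., Z), and so is the group shifted by k positions, so the two groups trivially have the same joint distribution; yet the terms are perfectly dependent, so the sequence is not independent. A minimal sketch of this construction:

```python
import numpy as np

rng = np.random.default_rng(1)

# Draw one 2x1 vector Z once, then set X_n = Z for every n.
Z = rng.standard_normal(2)
X = [Z for _ in range(10)]  # X_1 = X_2 = ... = X_10 = Z

# Every term is identical, so any shifted group of successive terms has
# the same joint distribution (strict stationarity), but the terms are
# perfectly dependent, hence the sequence is not IID.
print(all(np.array_equal(X[0], X[n]) for n in range(10)))  # True
```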

Weakly stationary sequences

Let [eq5] be a sequence of random vectors defined on a sample space Omega. We say that [eq5] is a covariance stationary sequence (or weakly stationary sequence) if[eq31]where n and $j$ are, of course, integers. Property (1) means that all the random vectors belonging to the sequence [eq5] have the same mean. Property (2) means that the cross-covariance between a term X_n of the sequence and the term that is located $j$ positions before it ($X_{n-j}$) is always the same, irrespective of how X_n has been chosen. In other words, [eq33] depends only on $j$ and not on n. Note also that property (2) implies that all the random vectors in the sequence have the same covariance matrix (because [eq34]):[eq35]
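A concrete example of a covariance stationary sequence is a stable Gaussian VAR(1) process, X_n = A X_{n-1} + eps_n with the spectral radius of A below 1 (this example and all names in it are our own, not from the lecture). After a burn-in, E[X_n] and the cross-covariance Cov(X_n, X_{n-j}) depend only on the lag j, and for a VAR(1) the lag-1 cross-covariance satisfies Gamma(1) = A Gamma(0), which the sketch below checks empirically:

```python
import numpy as np

rng = np.random.default_rng(7)

A = np.array([[0.5, 0.1],
              [0.0, 0.3]])   # stable: eigenvalues 0.5 and 0.3

T = 200_000
X = np.zeros((T, 2))
for t in range(1, T):
    X[t] = A @ X[t - 1] + rng.standard_normal(2)

X = X[1000:]  # discard burn-in so the distribution has settled

def cross_cov(X, j):
    """Sample estimate of Cov(X_n, X_{n-j}), a 2x2 matrix."""
    Xc = X - X.mean(axis=0)
    return Xc[j:].T @ Xc[:len(Xc) - j] / (len(Xc) - j)

Gamma0 = cross_cov(X, 0)  # common covariance matrix of every X_n
Gamma1 = cross_cov(X, 1)  # cross-covariance at lag j = 1

# For a VAR(1), Gamma(1) = A @ Gamma(0); verify the relation approximately.
print(np.allclose(Gamma1, A @ Gamma0, atol=0.05))
```

The key point is that `cross_cov(X, j)` does not depend on where in the (post-burn-in) sample the lag-j pairs are taken from, which is exactly property (2) above.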

Mixing sequences

The definition of a mixing sequence of random vectors is a straightforward generalization of the definition of a mixing sequence of random variables, which has been discussed in the lecture entitled Sequences of random variables and their convergence. Therefore, we report the definition of a mixing sequence of random vectors without further comment and refer the reader to the aforementioned lecture for an explanation of the concept.

Definition We say that a sequence of random vectors [eq5] is mixing (or strongly mixing) if and only if[eq37]for any two functions $f$ and $g$ and for any n and $q$.

Ergodic sequences

As in the previous section, we report here a definition of ergodic sequence of random vectors, which is a straightforward generalization of the definition of ergodic sequence of random variables, and we refer the reader to the lecture entitled Sequences of random variables and their convergence for explanations of the concept of ergodicity.

Denote by [eq38] the set of all possible sequences of real Kx1 vectors. When [eq39] is a sequence of real vectors, denote by [eq40] the subsequence obtained by dropping the first term of [eq41], that is,[eq42]

We say that a subset [eq43] is a shift invariant set if and only if [eq44] belongs to A whenever [eq41] belongs to A.

Definition A set [eq43] is shift invariant if and only if[eq47]

Shift invariance is used to define ergodicity.

Definition A sequence of random vectors [eq5] is said to be an ergodic sequence if and only if[eq49]whenever A is a shift invariant set.

Convergence

Just as for sequences of random variables, there are several different notions of convergence for sequences of random vectors. In particular, all the modes of convergence defined for random variables can be generalized to random vectors:

  1. Pointwise convergence

  2. Almost sure convergence

  3. Convergence in probability

  4. Mean-square convergence

  5. Convergence in distribution
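Convergence in probability, for instance, can be illustrated with a multivariate law of large numbers: the sample mean of n IID Kx1 vectors converges in probability to the true mean vector, i.e. the probability that its Euclidean distance from the mean exceeds any fixed epsilon goes to zero as n grows. The following Monte Carlo sketch (our own example; mu, eps and the helper name are illustrative choices) estimates that probability for increasing n:

```python
import numpy as np

rng = np.random.default_rng(3)

mu = np.array([1.0, -2.0])  # true mean of each IID 2x1 vector
eps = 0.1                   # fixed deviation threshold

def deviation_prob(n, reps=2000):
    """Monte Carlo estimate of P(||Xbar_n - mu|| > eps)."""
    # reps independent samples, each of n IID N(mu, I) vectors.
    draws = rng.standard_normal((reps, n, 2)) + mu
    means = draws.mean(axis=1)  # sample mean vector of each sample
    return np.mean(np.linalg.norm(means - mu, axis=1) > eps)

probs = [deviation_prob(n) for n in (10, 100, 1000)]
print(probs)  # decreasing toward 0 as n grows
```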
