This lecture discusses convergence in probability, first for sequences of random variables, and then for sequences of random vectors.
As we have discussed in the lecture on Sequences of random variables and their convergence, different concepts of convergence are based on different ways of measuring the distance between two random variables (how "close to each other" two random variables are).
The concept of convergence in probability is based on the following intuition: two random variables are "close to each other" if there is a high probability that their difference is very small.
Let $\{X_n\}$ be a sequence of random variables defined on a sample space $\Omega$. Take a random variable $X$ and a strictly positive number $\varepsilon$. Suppose that we consider $X_n$ far from $X$ when $$\left|X_n - X\right| > \varepsilon.$$ Then, the probability $$P\left(\left|X_n - X\right| > \varepsilon\right)$$ is the probability that $X_n$ is far from $X$.

If $\{X_n\}$ converges to $X$, the probability that $X_n$ and $X$ are far from each other should become smaller and smaller as $n$ increases. In other words, we should have $$\lim_{n\to\infty} P\left(\left|X_n - X\right| > \varepsilon\right) = 0. \tag{1}$$

Note that $$\left\{P\left(\left|X_n - X\right| > \varepsilon\right)\right\}$$ is a sequence of real numbers. Therefore, the limit in equation (1) is the usual limit of a sequence of real numbers.

We would like to be very restrictive in our criterion for deciding whether $X_n$ is far from $X$. As a consequence, condition (1) should be satisfied for any, arbitrarily small, $\varepsilon$.
The intuitive considerations above lead us to the following definition of convergence.
Definition
Let $\{X_n\}$ be a sequence of random variables defined on a sample space $\Omega$. We say that $\{X_n\}$ is convergent in probability to a random variable $X$ defined on $\Omega$ if and only if $$\lim_{n\to\infty} P\left(\left|X_n - X\right| > \varepsilon\right) = 0$$ for any $\varepsilon > 0$.

The variable $X$ is called the probability limit of the sequence and convergence is indicated by $$X_n \overset{P}{\longrightarrow} X$$ or by $$\operatorname{plim}_{n\to\infty} X_n = X.$$
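The definition lends itself to a quick numerical check. The sketch below is a hypothetical illustration, not part of the lecture: it estimates $P(|X_n - X| > \varepsilon)$ by Monte Carlo for the made-up sequence $X_n = X + Z/n$, where $Z$ is independent standard normal noise. The estimated probabilities shrink toward zero as $n$ grows, exactly as the definition requires.

```python
import random

def tail_probability(n, eps=0.1, trials=10_000, seed=0):
    """Monte Carlo estimate of P(|X_n - X| > eps) for the toy
    sequence X_n = X + Z/n, where X is uniform on [0, 1] and
    Z is independent standard normal noise (illustrative choice)."""
    rng = random.Random(seed)
    far = 0
    for _ in range(trials):
        x = rng.random()                      # realization of X
        x_n = x + rng.gauss(0.0, 1.0) / n     # realization of X_n
        if abs(x_n - x) > eps:
            far += 1
    return far / trials

# The estimated probabilities shrink as n grows.
for n in (1, 10, 100):
    print(n, tail_probability(n))
```

Any sequence of this form converges in probability to $X$, since the noise term $Z/n$ is eventually smaller than any fixed $\varepsilon$ with high probability.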
The following example illustrates the concept of convergence in probability.
Let $X$ be a discrete random variable with support $$R_X = \{0, 1\}$$ and probability mass function $$p_X(x) = \begin{cases} 1/3 & \text{if } x = 1 \\ 2/3 & \text{if } x = 0 \\ 0 & \text{otherwise.} \end{cases}$$

Consider a sequence of random variables $\{X_n\}$ whose generic term is $$X_n = \left(1 + \frac{1}{n}\right) X.$$

We want to prove that $\{X_n\}$ converges in probability to $X$. Take any $\varepsilon > 0$. Note that $$\left|X_n - X\right| = \frac{1}{n} X.$$ When $X = 0$, which happens with probability $2/3$, we have that $\left|X_n - X\right| = 0$ and, of course, $\left|X_n - X\right| \le \varepsilon$. When $X = 1$, which happens with probability $1/3$, we have that $\left|X_n - X\right| = 1/n$ and $\left|X_n - X\right| > \varepsilon$ only if $1/n > \varepsilon$ (or only if $n < 1/\varepsilon$). Therefore, $$P\left(\left|X_n - X\right| > \varepsilon\right) = \begin{cases} 1/3 & \text{if } n < 1/\varepsilon \\ 0 & \text{if } n \ge 1/\varepsilon \end{cases}$$ and $$\lim_{n\to\infty} P\left(\left|X_n - X\right| > \varepsilon\right) = 0.$$ Thus, the sequence of probabilities trivially converges to $0$, because it is identically equal to zero for all $n$ such that $n \ge 1/\varepsilon$. Since $\varepsilon$ was arbitrary, we have obtained the desired result: $$\lim_{n\to\infty} P\left(\left|X_n - X\right| > \varepsilon\right) = 0$$ for any $\varepsilon > 0$.
The above notion of convergence generalizes to sequences of random vectors in a straightforward manner.
Let $\{X_n\}$ be a sequence of random vectors defined on a sample space $\Omega$, where each random vector $X_n$ has dimension $K \times 1$.

In the case of random variables, the sequence of random variables $\{X_n\}$ converges in probability if and only if $$\lim_{n\to\infty} P\left(d\left(X_n, X\right) > \varepsilon\right) = 0$$ for any $\varepsilon > 0$, where $$d\left(X_n, X\right) = \left|X_n - X\right|$$ is the distance of $X_n$ from $X$. In the case of random vectors, the definition of convergence in probability remains the same, but distance is measured by the Euclidean norm of the difference between the two vectors: $$d\left(X_n, X\right) = \left\|X_n - X\right\| = \sqrt{\sum_{k=1}^{K} \left(X_{n,k} - X_k\right)^2},$$ where the second subscript is used to indicate the individual components of the vectors $X_n$ and $X$.
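For concreteness, the Euclidean norm of the difference between two realized vectors can be computed directly; a minimal sketch:

```python
import math

def euclidean_distance(v, w):
    """Euclidean norm of the difference between two vectors of equal
    dimension: d(v, w) = sqrt(sum over k of (v_k - w_k) ** 2)."""
    if len(v) != len(w):
        raise ValueError("vectors must have the same dimension")
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(v, w)))

print(euclidean_distance([1.0, 2.0], [4.0, 6.0]))  # prints 5.0
```

With $K = 1$ this reduces to the absolute value $|X_n - X|$, so the vector definition genuinely extends the scalar one.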
The following is a formal definition.
Definition
Let $\{X_n\}$ be a sequence of random vectors defined on a sample space $\Omega$. We say that $\{X_n\}$ is convergent in probability to a random vector $X$ defined on $\Omega$ if and only if $$\lim_{n\to\infty} P\left(\left\|X_n - X\right\| > \varepsilon\right) = 0$$ for any $\varepsilon > 0$.

Again, $X$ is called the probability limit of the sequence and convergence is indicated by $$X_n \overset{P}{\longrightarrow} X$$ or by $$\operatorname{plim}_{n\to\infty} X_n = X.$$
A sequence of random vectors is convergent in probability if and only if the sequences formed by their entries are convergent.
Proposition
Let $\{X_n\}$ be a sequence of random vectors defined on a sample space $\Omega$. Denote by $\{X_{n,k}\}$ the sequence of random variables obtained by taking the $k$-th entry of each random vector $X_n$. The sequence $\{X_n\}$ converges in probability to the random vector $X$ if and only if the sequence $\{X_{n,k}\}$ converges in probability to the random variable $X_k$ (the $k$-th component of $X$) for each $k = 1, \ldots, K$.
Below you can find some exercises with explained solutions.
Exercise 1

Let $X$ be a random variable having a uniform distribution on the interval $[0,1]$. In other words, $X$ is a continuous random variable with support $$R_X = [0,1]$$ and probability density function $$f_X(x) = \begin{cases} 1 & \text{if } x \in [0,1] \\ 0 & \text{otherwise.} \end{cases}$$

Now, define a sequence of random variables $\{X_n\}$ as follows: $$X_n = 1_{\left[l 2^{-m},\,(l+1) 2^{-m}\right]}(X), \qquad n = 2^m + l, \quad 0 \le l \le 2^m - 1,$$ where $1_{\left[l 2^{-m},\,(l+1) 2^{-m}\right]}(X)$ is the indicator function of the event $\left\{X \in \left[l 2^{-m}, (l+1) 2^{-m}\right]\right\}$. Find the probability limit (if it exists) of the sequence $\{X_n\}$.
Solution

A generic term $X_n$ of the sequence, being an indicator function, can take only two values:

- it can take value $1$ with probability $$P\left(X_n = 1\right) = P\left(X \in \left[l 2^{-m}, (l+1) 2^{-m}\right]\right) = 2^{-m},$$ where $m$ is an integer satisfying $$2^m \le n < 2^{m+1}$$ and $l$ is an integer satisfying $$0 \le l \le 2^m - 1;$$

- it can take value $0$ with probability $$P\left(X_n = 0\right) = 1 - 2^{-m}.$$

By the previous inequality, $m$ goes to infinity as $n$ goes to infinity and $$\lim_{n\to\infty} P\left(X_n = 1\right) = \lim_{m\to\infty} 2^{-m} = 0.$$ Therefore, the probability that $X_n$ is equal to zero converges to $1$ as $n$ goes to infinity. So, obviously, $\{X_n\}$ converges in probability to the constant random variable taking value $0$, because, for any $\varepsilon \in (0,1)$, $$\lim_{n\to\infty} P\left(\left|X_n - 0\right| > \varepsilon\right) = \lim_{n\to\infty} P\left(X_n = 1\right) = 0.$$
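Assuming the sliding-interval construction used here (write $n = 2^m + l$ with $0 \le l < 2^m$, so that $X_n$ indicates the interval $[l\,2^{-m}, (l+1)2^{-m}]$), the exact probability $P(X_n = 1) = 2^{-m}$ can be tabulated; a short sketch:

```python
def prob_one(n):
    """Exact P(X_n = 1) for the sliding-interval sequence: write
    n = 2**m + l with 0 <= l < 2**m; the indicated interval has
    length 2**(-m), which is the probability that uniform X hits it."""
    m = n.bit_length() - 1    # largest m with 2**m <= n
    return 2.0 ** (-m)

# P(X_n = 1) halves at each power of two, so X_n -> 0 in probability.
print([prob_one(n) for n in (1, 2, 4, 8, 16, 32)])
```

Note that `prob_one` is constant within each block $2^m \le n < 2^{m+1}$: the probability decreases only when $m$ increases, which is why the proof tracks $m$ rather than $n$.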
Exercise 2

Does the sequence in the previous exercise also converge almost surely?
Solution

We can identify the sample space $\Omega$ with the support of $X$: $$\Omega = R_X = [0,1]$$ and the sample points $\omega$ with the realizations of $X$: i.e., when the realization is $X = x$, then $\omega = x$. Almost sure convergence requires that $$\lim_{n\to\infty} X_n(\omega) = 0 \quad \text{for all } \omega \in F^c,$$ where $F$ is a zero-probability event and the superscript $c$ denotes the complement of a set. In other words, the set of sample points $\omega$ for which the sequence $\{X_n(\omega)\}$ does not converge to $0$ must be included in a zero-probability event $F$.

In our case, it is easy to see that, for any fixed sample point $\omega \in [0,1]$, the sequence $\{X_n(\omega)\}$ does not converge to $0$, because infinitely many terms in the sequence are equal to $1$: for each $m$, the intervals $\left[l 2^{-m}, (l+1) 2^{-m}\right]$, $l = 0, \ldots, 2^m - 1$, cover $[0,1]$, so $\omega$ belongs to at least one of them. Therefore, $$\left\{\omega \in \Omega : \{X_n(\omega)\} \text{ does not converge to } 0\right\} = [0,1] = \Omega$$ and, trivially, there does not exist a zero-probability event including the set $[0,1]$. Thus, the sequence does not converge almost surely to $0$.
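The failure of almost sure convergence can also be seen numerically. The sketch below again assumes the sliding-interval construction: for a fixed sample point, every level $m$ contributes an interval containing it, so the realized sequence takes the value $1$ infinitely often.

```python
def term(n, x):
    """X_n evaluated at the realization x: write n = 2**m + l with
    0 <= l < 2**m; X_n = 1 iff x lies in [l / 2**m, (l + 1) / 2**m]."""
    m = n.bit_length() - 1
    l = n - 2 ** m
    return 1 if l / 2 ** m <= x <= (l + 1) / 2 ** m else 0

# Number of terms equal to 1 among n = 2**m, ..., 2**(m+1) - 1, per level m:
x = 0.3
print([sum(term(n, x) for n in range(2 ** m, 2 ** (m + 1))) for m in range(8)])
```

Every level reports at least one hit, so $\{X_n(x)\}$ keeps returning to $1$ no matter how large $n$ gets, even though the hits become ever sparser (which is why convergence in probability still holds).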
Exercise 3

Let $\{X_n\}$ be a sequence of continuous random variables having a uniform distribution with support $$R_{X_n} = \left[0, \frac{1}{n}\right]$$ and probability density function $$f_{X_n}(x) = \begin{cases} n & \text{if } x \in \left[0, \frac{1}{n}\right] \\ 0 & \text{otherwise.} \end{cases}$$ Find the probability limit (if it exists) of the sequence $\{X_n\}$.

Solution

As $n$ tends to infinity, the probability density tends to become concentrated around the point $x = 0$. Therefore, it seems reasonable to conjecture that the sequence $\{X_n\}$ converges in probability to the constant random variable taking value $0$. To rigorously verify this claim we need to use the formal definition of convergence in probability. For any $\varepsilon > 0$, $$P\left(\left|X_n - 0\right| > \varepsilon\right) = P\left(X_n > \varepsilon\right) = \begin{cases} 1 - n\varepsilon & \text{if } \varepsilon < \frac{1}{n} \\ 0 & \text{if } \varepsilon \ge \frac{1}{n} \end{cases}$$ so that $$\lim_{n\to\infty} P\left(\left|X_n - 0\right| > \varepsilon\right) = 0,$$ because $\varepsilon \ge 1/n$ for all $n \ge 1/\varepsilon$.
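A quick simulation supports this computation, assuming $X_n$ uniform on $[0, 1/n]$ as above (a sketch, not part of the original exercise):

```python
import random

def tail_prob_uniform(n, eps, trials=10_000, seed=2):
    """Estimate P(X_n > eps) where X_n is uniform on [0, 1/n]."""
    rng = random.Random(seed)
    return sum(rng.random() / n > eps for _ in range(trials)) / trials

eps = 0.01
# Exact value is 1 - n*eps while eps < 1/n, and exactly 0 once n >= 1/eps.
print([tail_prob_uniform(n, eps) for n in (10, 50, 100, 200)])
```

The estimates track the exact formula: they decrease linearly in $n$ and vanish entirely once the whole support $[0, 1/n]$ fits below $\varepsilon$.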
Please cite as:
Taboga, Marco (2021). "Convergence in probability", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/asymptotic-theory/convergence-in-probability.