Two random variables are independent if they convey no information about each other and, as a consequence, receiving information about one of the two does not change our assessment of the probability distribution of the other.
This lecture provides a formal definition of independence and discusses how to verify whether two or more random variables are independent.
Recall (see the lecture entitled Independent events) that two events $A$ and $B$ are independent if and only if $$\mathrm{P}(A \cap B) = \mathrm{P}(A)\,\mathrm{P}(B).$$
This definition is extended to random variables as follows.
Definition Two random variables $X$ and $Y$ are said to be independent if and only if $$\mathrm{P}(\{X \in A\} \cap \{Y \in B\}) = \mathrm{P}(X \in A)\,\mathrm{P}(Y \in B)$$ for any couple of events $\{X \in A\}$ and $\{Y \in B\}$, where $A \subseteq \mathbb{R}$ and $B \subseteq \mathbb{R}$.
In other words, two random variables are independent if and only if the events related to those random variables are independent events.
The independence between two random variables is also called statistical independence.
Checking the independence of all possible couples of events related to two random variables can be very difficult. This is the reason why the above definition is seldom used to verify whether two random variables are independent. The following criterion is more often used instead.
Proposition Two random variables $X$ and $Y$ are independent if and only if $$F_{XY}(x,y) = F_X(x)\,F_Y(y) \quad \text{for all } (x,y) \in \mathbb{R}^2,$$ where $F_{XY}(x,y)$ is their joint distribution function and $F_X(x)$ and $F_Y(y)$ are their marginal distribution functions.
By using some facts from measure theory (not proved here), it is possible to demonstrate that, when checking the condition $$\mathrm{P}(\{X \in A\} \cap \{Y \in B\}) = \mathrm{P}(X \in A)\,\mathrm{P}(Y \in B),$$ it is sufficient to confine attention to sets $A$ and $B$ taking the form $$A = (-\infty, x], \qquad B = (-\infty, y].$$ Thus, two random variables are independent if and only if $$\mathrm{P}(\{X \leq x\} \cap \{Y \leq y\}) = \mathrm{P}(X \leq x)\,\mathrm{P}(Y \leq y) \quad \text{for all } (x,y) \in \mathbb{R}^2.$$ Using the definitions of joint and marginal distribution function, this condition can be written as $$F_{XY}(x,y) = F_X(x)\,F_Y(y) \quad \text{for all } (x,y) \in \mathbb{R}^2.$$
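As an informal illustration of this criterion (an added sketch, not part of the original lecture), the snippet below simulates two independent exponential variables and checks numerically that the empirical joint distribution function is approximately equal to the product of the empirical marginal distribution functions at a few arbitrarily chosen points; the distributions and grid points are hypothetical choices made only for the example.

```python
import numpy as np

# Illustrative sketch (hypothetical distributions): draw two independent
# exponential samples and compare the empirical joint distribution function
# with the product of the empirical marginal distribution functions.
rng = np.random.default_rng(0)
n = 200_000
x_sample = rng.exponential(scale=1.0, size=n)
y_sample = rng.exponential(scale=2.0, size=n)

for x, y in [(0.5, 1.0), (1.0, 2.0), (2.0, 0.5)]:
    joint = np.mean((x_sample <= x) & (y_sample <= y))         # empirical F_XY(x, y)
    product = np.mean(x_sample <= x) * np.mean(y_sample <= y)  # F_X(x) * F_Y(y)
    print(f"F_XY({x}, {y}) = {joint:.4f}   F_X * F_Y = {product:.4f}")
```

Up to simulation error, the two columns coincide, which is exactly what the proposition requires of independent variables.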
Example Let $X$ and $Y$ be two random variables with given marginal distribution functions $F_X(x)$ and $F_Y(y)$ and joint distribution function $F_{XY}(x,y)$. $X$ and $Y$ are independent if and only if $F_{XY}(x,y) = F_X(x)\,F_Y(y)$ for all $(x,y)$, which is straightforward to verify by checking the equality separately on each region of the plane over which the distribution functions are defined piecewise.
When the two variables, taken together, form a discrete random vector, independence can also be verified using the following proposition:
Proposition Two random variables $X$ and $Y$, forming a discrete random vector, are independent if and only if $$p_{XY}(x,y) = p_X(x)\,p_Y(y) \quad \text{for all } (x,y) \in \mathbb{R}^2,$$ where $p_{XY}(x,y)$ is their joint probability mass function and $p_X(x)$ and $p_Y(y)$ are their marginal probability mass functions.
The following example illustrates how this criterion can be used.
Example Let $(X, Y)$ be a discrete random vector with a given support $R_{XY}$ and a given joint probability mass function $p_{XY}(x,y)$. In order to verify whether $X$ and $Y$ are independent, we first need to derive the marginal probability mass functions of $X$ and $Y$. The support of $X$ collects the first components of the points in $R_{XY}$, and $p_X(x)$ is obtained by computing the probability of each of these values, that is, by summing the joint probability mass function over $y$; the support and the probability mass function of $Y$ are derived in the same way from the second components. The product of the marginal probability mass functions, $p_X(x)\,p_Y(y)$, differs from the joint probability mass function $p_{XY}(x,y)$ at some points of the support. Therefore, $X$ and $Y$ are not independent.
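A minimal sketch of the same procedure, assuming a hypothetical joint probability mass function (not the one used in the example above): the code derives the marginal probability mass functions by summation and compares their product with the joint probability mass function at every point.

```python
from itertools import product

# Hypothetical joint pmf (not the lecture's example): the whole mass sits on
# the diagonal, so the two variables are clearly dependent.
joint_pmf = {(1, 1): 0.5, (2, 2): 0.5}

# Marginal pmfs obtained by summing the joint pmf over the other variable.
p_x, p_y = {}, {}
for (x, y), p in joint_pmf.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

# Independence requires p_XY(x, y) = p_X(x) * p_Y(y) at every point,
# including points where the joint pmf is zero.
independent = all(
    abs(joint_pmf.get((x, y), 0.0) - p_x[x] * p_y[y]) < 1e-12
    for x, y in product(p_x, p_y)
)
print("independent:", independent)  # False: e.g. p_XY(1, 2) = 0 != p_X(1) * p_Y(2)
```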
When the two variables, taken together, form a continuous random vector, independence can also be verified by means of the following proposition.
Proposition Two random variables $X$ and $Y$, forming a continuous random vector, are independent if and only if $$f_{XY}(x,y) = f_X(x)\,f_Y(y) \quad \text{for all } (x,y) \in \mathbb{R}^2,$$ where $f_{XY}(x,y)$ is their joint probability density function and $f_X(x)$ and $f_Y(y)$ are their marginal probability density functions.
The following example illustrates how this criterion can be used.
Example Let the joint probability density function of $X$ and $Y$ be given, and let $f_X(x)$ and $f_Y(y)$ be its marginals, obtained by integrating the joint density with respect to the other variable. Verifying that $f_{XY}(x,y) = f_X(x)\,f_Y(y)$ is straightforward: the equality is checked separately on the region where the joint density is zero and on the region where it is strictly positive.
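The same check can be carried out symbolically. Here is a small sketch with a hypothetical joint density on the unit square (again, not the density used in the example): the marginals are obtained by integrating out the other variable, and the joint density is then compared with their product.

```python
import sympy as sp

x, y = sp.symbols("x y", nonnegative=True)

# Hypothetical joint density (not the lecture's example):
# f_XY(x, y) = 4*x*y on [0, 1] x [0, 1], and 0 elsewhere.
f_xy = 4 * x * y

# Marginal densities on [0, 1], obtained by integrating out the other variable.
f_x = sp.integrate(f_xy, (y, 0, 1))   # = 2*x
f_y = sp.integrate(f_xy, (x, 0, 1))   # = 2*y

# On the support the joint density equals the product of the marginals
# (and off the support both sides are zero), so X and Y are independent.
print(sp.simplify(f_xy - f_x * f_y) == 0)  # True
```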
The following subsections contain more details about statistical independence.
The definition of mutually independent random variables extends the definition of mutually independent events to random variables.
Definition We say that $n$ random variables $X_1$, ..., $X_n$ are mutually independent (or jointly independent) if and only if $$\mathrm{P}\left(\bigcap_{j=1}^{k}\{X_{i_j} \in A_{i_j}\}\right) = \prod_{j=1}^{k}\mathrm{P}(X_{i_j} \in A_{i_j})$$ for any sub-collection of $k$ random variables $X_{i_1}$, ..., $X_{i_k}$ (where $2 \leq k \leq n$) and for any collection of events $\{X_{i_1} \in A_{i_1}\}$, ..., $\{X_{i_k} \in A_{i_k}\}$, where $A_{i_j} \subseteq \mathbb{R}$.
In other words, random variables are mutually independent if the events related to those random variables are mutually independent events.
Denote by $X$ a random vector whose components are $X_1$, ..., $X_n$. The above condition for mutual independence can be replaced:
in general, by a condition on the joint distribution function of $X$: $$F_X(x_1, \ldots, x_n) = F_{X_1}(x_1) \cdots F_{X_n}(x_n) \quad \text{for all } (x_1, \ldots, x_n) \in \mathbb{R}^n;$$
for discrete random variables, by a condition on the joint probability mass function of $X$: $$p_X(x_1, \ldots, x_n) = p_{X_1}(x_1) \cdots p_{X_n}(x_n) \quad \text{for all } (x_1, \ldots, x_n) \in \mathbb{R}^n;$$
for continuous random variables, by a condition on the joint probability density function of $X$: $$f_X(x_1, \ldots, x_n) = f_{X_1}(x_1) \cdots f_{X_n}(x_n) \quad \text{for all } (x_1, \ldots, x_n) \in \mathbb{R}^n.$$
It can be proved that $n$ random variables $X_1$, ..., $X_n$ are mutually independent if and only if $$\mathrm{E}\left[\prod_{i=1}^{n} g_i(X_i)\right] = \prod_{i=1}^{n} \mathrm{E}\left[g_i(X_i)\right]$$ for any $n$ functions $g_1$, ..., $g_n$ such that the above expected values exist and are well-defined.
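As a quick sanity check of this characterization (an added illustration with arbitrarily chosen distributions and functions, not taken from the lecture), the Monte Carlo sketch below verifies that, for two independent variables, the expectation of a product of functions is approximately the product of the expectations.

```python
import numpy as np

# X1 ~ N(0, 1) and X2 ~ Uniform(0, 1), drawn independently (hypothetical choice).
rng = np.random.default_rng(1)
n = 1_000_000
x1 = rng.normal(size=n)
x2 = rng.uniform(size=n)

g1 = np.exp(x1 / 2)   # an arbitrary function g1 of X1
g2 = x2 ** 2          # an arbitrary function g2 of X2

print(np.mean(g1 * g2))           # ~ E[g1(X1) g2(X2)]
print(np.mean(g1) * np.mean(g2))  # ~ E[g1(X1)] E[g2(X2)]; the two estimates agree
```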
If two random variables $X$ and $Y$ are independent, then their covariance is zero: $$\mathrm{Cov}[X,Y] = 0.$$
This is an immediate consequence of the fact that, if $X$ and $Y$ are independent, then $$\mathrm{E}[g_1(X)\,g_2(Y)] = \mathrm{E}[g_1(X)]\,\mathrm{E}[g_2(Y)]$$ (see the Mutual independence via expectations property above). When $g_1$ and $g_2$ are identity functions ($g_1(X) = X$ and $g_2(Y) = Y$), then $$\mathrm{E}[XY] = \mathrm{E}[X]\,\mathrm{E}[Y].$$ Therefore, by the covariance formula: $$\mathrm{Cov}[X,Y] = \mathrm{E}[XY] - \mathrm{E}[X]\,\mathrm{E}[Y] = \mathrm{E}[X]\,\mathrm{E}[Y] - \mathrm{E}[X]\,\mathrm{E}[Y] = 0.$$
The converse is not true: two random variables that have zero covariance are not necessarily independent.
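A classic counterexample (added here as an illustration, not part of the original lecture): let $X$ be uniform on $\{-1, 0, 1\}$ and $Y = X^2$. Then $\mathrm{E}[XY] = \mathrm{E}[X^3] = 0 = \mathrm{E}[X]\,\mathrm{E}[Y]$, so $\mathrm{Cov}[X,Y] = 0$, yet $Y$ is a deterministic function of $X$ and the two variables are not independent. The snippet below verifies both facts from the joint probability mass function.

```python
from itertools import product

# Counterexample (added illustration): X uniform on {-1, 0, 1}, Y = X**2.
joint_pmf = {(x, x ** 2): 1 / 3 for x in (-1, 0, 1)}

# Covariance computed from the joint pmf: E[XY] - E[X] * E[Y].
e_x = sum(x * p for (x, _), p in joint_pmf.items())
e_y = sum(y * p for (_, y), p in joint_pmf.items())
e_xy = sum(x * y * p for (x, y), p in joint_pmf.items())
print("covariance:", e_xy - e_x * e_y)          # 0.0

# Independence check: p_XY(x, y) must equal p_X(x) * p_Y(y) everywhere.
p_x = {-1: 1 / 3, 0: 1 / 3, 1: 1 / 3}
p_y = {0: 1 / 3, 1: 2 / 3}
independent = all(
    abs(joint_pmf.get((x, y), 0.0) - p_x[x] * p_y[y]) < 1e-12
    for x, y in product(p_x, p_y)
)
print("independent:", independent)              # False
```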
The above notions are easily generalized to the case in which $X$ and $Y$ are two random vectors, having dimensions $K_X \times 1$ and $K_Y \times 1$ respectively. Denote their joint distribution functions by $F_X(x)$ and $F_Y(y)$, and the joint distribution function of $X$ and $Y$ together by $F_{XY}(x,y)$. Also, if the two vectors are discrete or continuous, replace $F$ with $p$ or $f$ to denote the corresponding probability mass or density functions.
Definition Two random vectors and are independent if and only if one of the following equivalent conditions is satisfied:
Condition 1: $$\mathrm{P}(\{X \in A\} \cap \{Y \in B\}) = \mathrm{P}(X \in A)\,\mathrm{P}(Y \in B)$$ for any couple of events $\{X \in A\}$ and $\{Y \in B\}$, where $A \subseteq \mathbb{R}^{K_X}$ and $B \subseteq \mathbb{R}^{K_Y}$;
Condition 2: $$F_{XY}(x,y) = F_X(x)\,F_Y(y)$$ for any $x \in \mathbb{R}^{K_X}$ and $y \in \mathbb{R}^{K_Y}$ (replace $F$ with $p$ or $f$ when the distributions are discrete or continuous respectively);
Condition 3: $$\mathrm{E}[g_1(X)\,g_2(Y)] = \mathrm{E}[g_1(X)]\,\mathrm{E}[g_2(Y)]$$ for any functions $g_1$ and $g_2$ such that the above expected values exist and are well-defined.
The definition of mutual independence also extends in a straightforward manner to random vectors.
Definition We say that $n$ random vectors $X_1$, ..., $X_n$ are mutually independent (or jointly independent) if and only if $$\mathrm{P}\left(\bigcap_{j=1}^{k}\{X_{i_j} \in A_{i_j}\}\right) = \prod_{j=1}^{k}\mathrm{P}(X_{i_j} \in A_{i_j})$$ for any sub-collection of $k$ random vectors $X_{i_1}$, ..., $X_{i_k}$ (where $2 \leq k \leq n$) and for any collection of events $\{X_{i_1} \in A_{i_1}\}$, ..., $\{X_{i_k} \in A_{i_k}\}$.
All the equivalent conditions for the joint independence of a set of random variables (see above) also apply, with obvious modifications, to random vectors.
Below you can find some exercises with explained solutions.
Consider two random variables $X$ and $Y$ having given marginal distribution functions $F_X(x)$ and $F_Y(y)$. If $X$ and $Y$ are independent, what is their joint distribution function?
For $X$ and $Y$ to be independent, their joint distribution function must be equal to the product of their marginal distribution functions: $$F_{XY}(x,y) = F_X(x)\,F_Y(y) \quad \text{for all } (x,y) \in \mathbb{R}^2.$$
Let $(X, Y)$ be a discrete random vector with a given support $R_{XY}$ and a given joint probability mass function $p_{XY}(x,y)$. Are $X$ and $Y$ independent?
In order to verify whether $X$ and $Y$ are independent, we first need to derive the marginal probability mass functions of $X$ and $Y$. The support of $X$ collects the first components of the points in $R_{XY}$, and $p_X(x)$ is obtained by computing the probability of each of these values, that is, by summing the joint probability mass function over $y$; the support and the probability mass function of $Y$ are derived in the same way from the second components. The product of the marginal probability mass functions, $p_X(x)\,p_Y(y)$, is equal to the joint probability mass function $p_{XY}(x,y)$ at every point. Therefore, $X$ and $Y$ are independent.
Let $(X, Y)$ be a continuous random vector with a given support $R_{XY}$ and a given joint probability density function $f_{XY}(x,y)$. Are $X$ and $Y$ independent?
The support of $X$ is the projection of $R_{XY}$ onto its first coordinate, and the marginal probability density function $f_X(x)$ is obtained by integrating the joint density with respect to $y$ (it is zero for values of $x$ outside this support). Likewise, the support of $Y$ is the projection of $R_{XY}$ onto its second coordinate, and $f_Y(y)$ is obtained by integrating the joint density with respect to $x$. Verifying that $f_{XY}(x,y) = f_X(x)\,f_Y(y)$ is then straightforward: outside the support all of the densities are zero, and on the support the joint density coincides with the product of the marginals. Thus, $X$ and $Y$ are independent.
Please cite as:
Taboga, Marco (2021). "Independent random variables", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/fundamentals-of-probability/independent-random-variables.