Functions of random vectors and their distribution

by Marco Taboga, PhD

Let $X$ be a $K \times 1$ random vector with known distribution. Let an $L \times 1$ random vector $Y$ be a function of $X$: $$Y = g(X)$$ where $g: \mathbb{R}^K \rightarrow \mathbb{R}^L$. How do we derive the distribution of $Y$ from the distribution of $X$?

Although there is no general answer to this question, there are some special cases in which the distribution of Y can be easily derived from the distribution of X. We discuss these cases below.

One-to-one functions

When the function $g(x)$ is one-to-one (hence invertible) and the random vector $X$ is either discrete or continuous, there are readily applicable formulae for the distribution of $Y$. We report these formulae below.

One-to-one function of a discrete random vector

When $X$ is a discrete random vector, the joint probability mass function of $Y = g(X)$ is given by the following proposition.

Proposition (probability mass of a one-to-one function) Let $X$ be a $K \times 1$ discrete random vector with support $R_X$ and joint probability mass function $p_X(x)$. Let $g: \mathbb{R}^K \rightarrow \mathbb{R}^L$ be one-to-one on the support of $X$. Then, the support of $Y = g(X)$ is $$R_Y = \left\{ y \in \mathbb{R}^L : y = g(x) \text{ for some } x \in R_X \right\}$$ and its probability mass function is $$p_Y(y) = \begin{cases} p_X\left( g^{-1}(y) \right) & \text{if } y \in R_Y \\ 0 & \text{otherwise.} \end{cases}$$

Proof

If $y \in R_Y$, then $$p_Y(y) = P(Y = y) = P(g(X) = y) = P\left( X = g^{-1}(y) \right) = p_X\left( g^{-1}(y) \right).$$ If $y \notin R_Y$, then trivially $p_Y(y) = 0$.

Example Let $X$ be a $2 \times 1$ discrete random vector and denote its components by $X_1$ and $X_2$. Let the support of $X$ be [eq11] and its joint probability mass function be [eq12]. Let [eq13]. The support of $Y$ is [eq14]. The inverse function is [eq15]. The joint probability mass function of $Y$ is [eq16].
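
To make the mechanics concrete, here is a minimal Python sketch with a hypothetical pmf and map (not the numbers of the example above, which appear in the figures): since $g$ is one-to-one on the support, each support point simply carries its probability mass to $g(x)$.

    # Hypothetical support and pmf of X = (X1, X2), stored as {x: p_X(x)}
    p_X = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}

    def g(x):
        # A map that is one-to-one on this support: (x1, x2) -> (x1 + x2, x1 - x2)
        x1, x2 = x
        return (x1 + x2, x1 - x2)

    # Push the pmf forward: p_Y(y) = p_X(g^{-1}(y)) for y in the image of R_X
    p_Y = {g(x): p for x, p in p_X.items()}

    print(p_Y)  # {(0, 0): 0.25, (1, -1): 0.25, (1, 1): 0.25, (2, 0): 0.25}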

One-to-one function of a continuous random vector

When $X$ is a continuous random vector and $g$ is differentiable, then $Y$ is also continuous, and its joint probability density function is given by the following proposition.

Proposition (density of a one-to-one function) Let $X$ be a $K \times 1$ continuous random vector with support $R_X$ and joint probability density function $f_X(x)$. Let $g: \mathbb{R}^K \rightarrow \mathbb{R}^K$ be one-to-one and differentiable on the support of $X$. Denote by $J_{g^{-1}}(y)$ the Jacobian matrix of $g^{-1}(y)$, i.e., $$J_{g^{-1}}(y) = \begin{bmatrix} \dfrac{\partial x_1}{\partial y_1} & \cdots & \dfrac{\partial x_1}{\partial y_K} \\ \vdots & \ddots & \vdots \\ \dfrac{\partial x_K}{\partial y_1} & \cdots & \dfrac{\partial x_K}{\partial y_K} \end{bmatrix}$$ where $y_i$ is the $i$-th component of $y$ and $x_i$ is the $i$-th component of $x = g^{-1}(y)$. Then, the support of $Y = g(X)$ is $$R_Y = \left\{ y : y = g(x) \text{ for some } x \in R_X \right\}.$$ If the determinant of the Jacobian matrix satisfies $$\det J_{g^{-1}}(y) \neq 0 \quad \text{for all } y \in R_Y,$$ then the joint probability density function of $Y$ is $$f_Y(y) = \begin{cases} f_X\left( g^{-1}(y) \right) \left\vert \det J_{g^{-1}}(y) \right\vert & \text{if } y \in R_Y \\ 0 & \text{otherwise.} \end{cases}$$

Proof

See: Poirier, D. J. (1995) Intermediate statistics and econometrics: a comparative approach, MIT Press.
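As an illustration of how the proposition is applied, here is a short sympy sketch on an assumed example (not taken from the lecture): $X = (X_1, X_2)$ with density $f_X(x) = e^{-x_1 - x_2}$ on the positive quadrant and $Y = (X_1, X_1 + X_2)$, which is one-to-one and differentiable. The code builds $g^{-1}$, its Jacobian matrix, and the resulting density, then checks that the density integrates to one over the support $\{0 < y_1 < y_2\}$.

    import sympy as sp

    x1, x2, y1, y2 = sp.symbols('x1 x2 y1 y2', positive=True)

    f_X = sp.exp(-x1 - x2)                    # joint density of X (assumed example)

    # Inverse map g^{-1}(y): x1 = y1, x2 = y2 - y1 (valid on 0 < y1 < y2)
    g_inv = sp.Matrix([y1, y2 - y1])

    # Jacobian matrix of g^{-1} and its determinant
    J = g_inv.jacobian([y1, y2])
    det_J = J.det()                           # equals 1 here

    # Density of Y on its support: f_Y(y) = f_X(g^{-1}(y)) * |det J|
    f_Y = f_X.subs({x1: g_inv[0], x2: g_inv[1]}) * sp.Abs(det_J)
    print(sp.simplify(f_Y))                   # exp(-y2)

    # Sanity check: f_Y integrates to 1 over the support {0 < y1 < y2}
    total = sp.integrate(f_Y, (y1, 0, y2), (y2, 0, sp.oo))
    print(total)                              # 1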

A special case of the above proposition obtains when the function $g$ is a linear one-to-one mapping.

Proposition Let $X$ be a $K \times 1$ continuous random vector with joint probability density $f_X(x)$. Let $Y$ be a $K \times 1$ random vector such that $$Y = \mu + \Sigma X$$ where $\mu$ is a constant $K \times 1$ vector and $\Sigma$ is a constant $K \times K$ invertible matrix. Then, $Y$ is a continuous random vector whose probability density function $f_Y(y)$ satisfies $$f_Y(y) = \frac{1}{\left\vert \det \Sigma \right\vert} f_X\left( \Sigma^{-1}(y - \mu) \right)$$ where $\det \Sigma$ is the determinant of $\Sigma$.

Proof

In this case the inverse function is $$g^{-1}(y) = \Sigma^{-1}(y - \mu).$$ The Jacobian matrix is $$J_{g^{-1}}(y) = \Sigma^{-1}.$$ When $y \in R_Y$, the joint density of $Y$ is $$f_Y(y) = f_X\left( \Sigma^{-1}(y - \mu) \right) \left\vert \det \Sigma^{-1} \right\vert = \frac{1}{\left\vert \det \Sigma \right\vert} f_X\left( \Sigma^{-1}(y - \mu) \right).$$
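
The linear case is easy to check numerically. The following sketch assumes $X$ is a $2 \times 1$ standard normal vector, so that $Y = \mu + \Sigma X$ is known to be normal with mean $\mu$ and covariance $\Sigma \Sigma^\top$; the density produced by the proposition should then match the known normal density.

    import numpy as np
    from scipy.stats import multivariate_normal

    mu = np.array([1.0, -2.0])
    Sigma = np.array([[2.0, 0.5],
                      [0.0, 1.0]])            # invertible 2x2 matrix (assumed values)

    f_X = multivariate_normal(mean=[0, 0], cov=np.eye(2)).pdf

    def f_Y(y):
        # Density from the proposition: f_X(Sigma^{-1}(y - mu)) / |det Sigma|
        x = np.linalg.solve(Sigma, y - mu)
        return f_X(x) / abs(np.linalg.det(Sigma))

    # Reference: the known density of N(mu, Sigma Sigma')
    ref = multivariate_normal(mean=mu, cov=Sigma @ Sigma.T).pdf

    y = np.array([0.5, -1.0])
    print(f_Y(y), ref(y))                     # the two values agree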

Example Let $X$ be a $2 \times 1$ random vector with support [eq35] and joint probability density function [eq36], where $x_1$ and $x_2$ are the two components of $x$. Define a $2 \times 1$ random vector $Y = g(X)$ with components $Y_1$ and $Y_2$ as follows: [eq38]. The inverse function $g^{-1}(y)$ is defined by [eq40]. The Jacobian matrix of $g^{-1}(y)$ is [eq42]. Its determinant is [eq43]. The support of $Y_1$ is [eq44], the support of $Y_2$ is [eq45], and the support of $Y$ is [eq46]. For $y \in R_Y$, the joint probability density function of $Y$ is [eq47], while for $y \notin R_Y$, the joint probability density function is $f_Y(y) = 0$.
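
A Monte Carlo simulation is often a useful complement to this kind of derivation. The sketch below reuses the assumed exponential example from the sympy sketch above (not the example just worked out, whose formulas are in the figures) and compares a simulated probability with the one implied by the derived density $f_Y(y) = e^{-y_2}$ on $0 < y_1 < y_2$.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.exponential(scale=1.0, size=(1_000_000, 2))   # X1, X2 i.i.d. exponential(1)
    Y2 = X[:, 0] + X[:, 1]                                # second component of Y = g(X)

    t = 1.0
    # Analytic: P(Y2 <= t) = int_0^t int_0^{y2} e^{-y2} dy1 dy2 = 1 - (1 + t) e^{-t}
    analytic = 1 - (1 + t) * np.exp(-t)
    print(np.mean(Y2 <= t), analytic)                     # both close to 0.264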

Independent sums

When the components of $X$ are independent and $$Y = g(X) = X_1 + X_2 + \ldots + X_K,$$ then the distribution of $Y$ can be derived using the convolution formulae illustrated in the lecture entitled Sums of independent random variables.
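
For discrete components, the convolution can be carried out directly on the pmf vectors. A minimal sketch, with hypothetical pmfs on consecutive integers starting at zero:

    import numpy as np

    p_X1 = np.array([0.2, 0.5, 0.3])          # hypothetical pmf of X1 on {0, 1, 2}
    p_X2 = np.array([0.6, 0.4])               # hypothetical pmf of X2 on {0, 1}

    # pmf of Y = X1 + X2 on {0, 1, 2, 3}: p_Y = p_X1 * p_X2 (discrete convolution)
    p_Y = np.convolve(p_X1, p_X2)
    print(p_Y, p_Y.sum())                     # [0.12 0.38 0.38 0.12], sums to 1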

Known moment generating function

The joint moment generating function of $Y = g(X)$, provided it exists, can be computed as $$M_Y(t) = \mathrm{E}\left[ \exp\left( t^\top Y \right) \right] = \mathrm{E}\left[ \exp\left( t^\top g(X) \right) \right]$$ using the transformation theorem. If $M_Y(t)$ is recognized as the joint moment generating function of a known distribution, then that distribution is the distribution of $Y$ (two random vectors have the same distribution if and only if they have the same joint moment generating function, provided the latter exists).
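
Here is a sympy sketch of this route on an assumed example: $X_1, X_2$ i.i.d. standard normal and $Y = (X_1 + X_2, X_1 - X_2)$. The transformation theorem lets us compute $M_Y(t)$ directly as an integral against the density of $X$.

    import sympy as sp

    t1, t2, x1, x2 = sp.symbols('t1 t2 x1 x2', real=True)

    f_X = sp.exp(-(x1**2 + x2**2) / 2) / (2 * sp.pi)     # density of X (assumed example)
    exponent = t1 * (x1 + x2) + t2 * (x1 - x2)           # t'g(x)

    # M_Y(t) = E[exp(t'g(X))], computed under the density of X
    M_Y = sp.integrate(f_X * sp.exp(exponent),
                       (x1, -sp.oo, sp.oo), (x2, -sp.oo, sp.oo))
    print(sp.simplify(M_Y))                   # exp(t1**2 + t2**2)

Since $\exp(t_1^2 + t_2^2) = \exp(t^\top \Sigma t / 2)$ with $\Sigma = 2I$, the result is recognized as the joint moment generating function of a bivariate normal vector with mean $0$ and covariance $2I$, so that is the distribution of $Y$.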

Known characteristic function

The joint characteristic function of $Y = g(X)$ can be computed as $$\varphi_Y(t) = \mathrm{E}\left[ \exp\left( i\, t^\top Y \right) \right] = \mathrm{E}\left[ \exp\left( i\, t^\top g(X) \right) \right]$$ using the transformation theorem. If $\varphi_Y(t)$ is recognized as the joint characteristic function of a known distribution, then such a distribution is the distribution of $Y$ (two random vectors have the same distribution if and only if they have the same joint characteristic function).
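
The characteristic function can be handled the same way; when closed forms are hard to obtain, it can also be estimated by simulation. A sketch, reusing the assumed normal example from the moment generating function section, whose characteristic function is that of $N(0, 2I)$, namely $\exp(-(t_1^2 + t_2^2))$:

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((1_000_000, 2))
    Y = np.column_stack([X[:, 0] + X[:, 1], X[:, 0] - X[:, 1]])

    t = np.array([0.3, -0.7])
    phi_hat = np.mean(np.exp(1j * (Y @ t)))   # Monte Carlo estimate of E[exp(i t'Y)]
    phi_normal = np.exp(-(t @ t))             # CF of N(0, 2I) at t
    print(phi_hat, phi_normal)                # close to each other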

Solved exercises

Below you can find some exercises with explained solutions.

Exercise 1

Let $X_1$ be a uniform random variable with support [eq57] and probability density function [eq58]. Let $X_2$ be a continuous random variable, independent of $X_1$, with support [eq59] and probability density function [eq60]. Let [eq61]. Find the joint probability density function of the random vector [eq62].

Solution

Since $X_1$ and $X_2$ are independent, their joint probability density function is equal to the product of their marginal density functions: [eq63]. The support of $Y_1$ is [eq64] and the support of $Y_2$ is [eq65]. The support of $Y$ is [eq66]. The function [eq67] is one-to-one and its inverse [eq68] is defined by [eq69], with Jacobian matrix [eq70]. The determinant of the Jacobian matrix is [eq71], which is different from zero for any $y$ belonging to $R_Y$. The formula for the joint probability density function of $Y$ is [eq72] and [eq73], which implies [eq74].

Exercise 2

Let $X$ be a $2 \times 1$ random vector with support [eq75] and joint probability density function [eq76], where $x_1$ and $x_2$ are the two components of $x$. Define a $2 \times 1$ random vector $Y = g(X)$ with components $Y_1$ and $Y_2$ as follows: [eq78]. Find the joint probability density function of the random vector $Y$.

Solution

The inverse function $g^{-1}(y)$ is defined by [eq80]. The Jacobian matrix of $g^{-1}(y)$ is [eq82]. Its determinant is [eq83]. The support of $Y_1$ is [eq84], the support of $Y_2$ is [eq85], and the support of $Y$ is [eq86]. For $y \in R_Y$, the joint probability density function of $Y$ is [eq87], while for $y \notin R_Y$, the joint probability density function is $f_Y(y) = 0$.

How to cite

Please cite as:

Taboga, Marco (2021). "Functions of random vectors and their distribution", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/fundamentals-of-probability/functions-of-random-vectors.
