This lecture discusses how to derive the distribution of the sum of two independent random variables.
We explain:
first, how to work out the cumulative distribution function of the sum;
then, how to compute its probability mass function (if the summands are discrete) or its probability density function (if the summands are continuous).
The next proposition characterizes the cumulative distribution function (cdf) of the sum.
Proposition Let $X$ and $Y$ be two independent random variables. Denote their cdfs by $F_X(x)$ and $F_Y(y)$. Let
$$Z = X + Y$$
and denote the cdf of $Z$ by $F_Z(z)$. Then,
$$F_Z(z) = \operatorname{E}\left[ F_X(z - Y) \right]$$
or
$$F_Z(z) = \operatorname{E}\left[ F_Y(z - X) \right]$$
Proof The first formula is derived as follows:
$$\begin{aligned} F_Z(z) &= P(Z \le z) = P(X + Y \le z) \\ &= \operatorname{E}\left[ P(X + Y \le z \mid Y) \right] \\ &= \operatorname{E}\left[ P(X \le z - Y \mid Y) \right] \\ &= \operatorname{E}\left[ F_X(z - Y) \right] \end{aligned}$$
where the second equality follows from the law of iterated expectations and the last one from the independence of $X$ and $Y$. The second formula is symmetric to the first.
Example Let $X$ be a uniform random variable with support $R_X = [0, 1]$ and probability density function
$$f_X(x) = \begin{cases} 1 & \text{if } x \in [0, 1] \\ 0 & \text{otherwise} \end{cases}$$
Let $Y$ be another uniform random variable, independent of $X$, with support $R_Y = [0, 1]$ and probability density function
$$f_Y(y) = \begin{cases} 1 & \text{if } y \in [0, 1] \\ 0 & \text{otherwise} \end{cases}$$
The cdf of $X$ is
$$F_X(x) = \begin{cases} 0 & \text{if } x < 0 \\ x & \text{if } 0 \le x \le 1 \\ 1 & \text{if } x > 1 \end{cases}$$
The cdf of $Y$ is identical. Define
$$Z = X + Y$$
By the proposition above, the cdf of $Z$ is
$$F_Z(z) = \operatorname{E}\left[ F_X(z - Y) \right] = \int_0^1 F_X(z - y)\, dy$$
There are four cases to consider:

If $z < 0$, then $z - y < 0$ for all $y \in [0, 1]$, so
$$F_Z(z) = \int_0^1 0\, dy = 0$$

If $0 \le z \le 1$, then $F_X(z - y) = z - y$ for $0 \le y \le z$ and $F_X(z - y) = 0$ for $y > z$, so
$$F_Z(z) = \int_0^z (z - y)\, dy = \left[ zy - \frac{y^2}{2} \right]_0^z = \frac{z^2}{2}$$

If $1 < z \le 2$, then $F_X(z - y) = 1$ for $y < z - 1$ and $F_X(z - y) = z - y$ for $z - 1 \le y \le 1$, so
$$F_Z(z) = \int_0^{z-1} 1\, dy + \int_{z-1}^1 (z - y)\, dy = (z - 1) + \frac{1 - (z - 1)^2}{2} = 1 - \frac{(2 - z)^2}{2}$$

If $z > 2$, then $z - y > 1$ for all $y \in [0, 1]$, so
$$F_Z(z) = \int_0^1 1\, dy = 1$$

By combining these four possible cases, we obtain
$$F_Z(z) = \begin{cases} 0 & \text{if } z < 0 \\ z^2/2 & \text{if } 0 \le z \le 1 \\ 1 - (2 - z)^2/2 & \text{if } 1 < z \le 2 \\ 1 & \text{if } z > 2 \end{cases}$$
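As a numerical sanity check, the piecewise cdf just derived can be compared with a Monte Carlo estimate of $P(X + Y \le z)$. The following minimal sketch (our Python illustration, assuming NumPy is available) does exactly that:

```python
import numpy as np

rng = np.random.default_rng(0)

def F_Z(z):
    """Piecewise cdf of Z = X + Y for X, Y independent uniforms on [0, 1]."""
    z = np.asarray(z, dtype=float)
    return np.where(z < 0, 0.0,
           np.where(z <= 1, z**2 / 2,
           np.where(z <= 2, 1 - (2 - z)**2 / 2, 1.0)))

# Monte Carlo estimate of P(X + Y <= z) based on one million draws
samples = rng.uniform(size=1_000_000) + rng.uniform(size=1_000_000)
for z in [0.5, 1.0, 1.5]:
    print(z, F_Z(z), (samples <= z).mean())
```

The printed pairs agree up to sampling error (for example, $F_Z(1) = 1/2$).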
When the two summands are discrete random variables, the probability mass function (pmf) of their sum can be derived as follows.
Proposition Let $X$ and $Y$ be two independent discrete random variables. Denote their respective pmfs by $p_X(x)$ and $p_Y(y)$, and their supports by $R_X$ and $R_Y$. Let
$$Z = X + Y$$
and denote the pmf of $Z$ by $p_Z(z)$. Then,
$$p_Z(z) = \sum_{y \in R_Y} p_X(z - y)\, p_Y(y)$$
or
$$p_Z(z) = \sum_{x \in R_X} p_Y(z - x)\, p_X(x)$$
Proof The first formula is derived as follows:
$$\begin{aligned} p_Z(z) &= P(Z = z) = P(X + Y = z) \\ &= \sum_{y \in R_Y} P(X + Y = z \mid Y = y)\, P(Y = y) \\ &= \sum_{y \in R_Y} P(X = z - y)\, p_Y(y) \\ &= \sum_{y \in R_Y} p_X(z - y)\, p_Y(y) \end{aligned}$$
where the second equality follows from the law of total probability and the third from the independence of $X$ and $Y$. The second formula is symmetric to the first.
The two summations above are called convolutions (of two pmfs).
Example Let $X$ be a discrete random variable with support $R_X = \{0, 1\}$ and pmf
$$p_X(x) = \begin{cases} 1/2 & \text{if } x \in R_X \\ 0 & \text{otherwise} \end{cases}$$
Let $Y$ be another discrete random variable, independent of $X$, with support $R_Y = \{0, 1\}$ and pmf
$$p_Y(y) = \begin{cases} 1/2 & \text{if } y \in R_Y \\ 0 & \text{otherwise} \end{cases}$$
Define
$$Z = X + Y$$
Its support is $R_Z = \{0, 1, 2\}$. The pmf of $Z$, evaluated at $z = 0$, is
$$p_Z(0) = \sum_{y \in R_Y} p_X(-y)\, p_Y(y) = p_X(0)\, p_Y(0) = \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{4}$$
Evaluated at $z = 1$, it is
$$p_Z(1) = p_X(1)\, p_Y(0) + p_X(0)\, p_Y(1) = \frac{1}{4} + \frac{1}{4} = \frac{1}{2}$$
Evaluated at $z = 2$, it is
$$p_Z(2) = p_X(2)\, p_Y(0) + p_X(1)\, p_Y(1) = 0 + \frac{1}{4} = \frac{1}{4}$$
Therefore, the pmf of $Z$ is
$$p_Z(z) = \begin{cases} 1/4 & \text{if } z = 0 \\ 1/2 & \text{if } z = 1 \\ 1/4 & \text{if } z = 2 \\ 0 & \text{otherwise} \end{cases}$$
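When the supports are sets of consecutive integers starting at zero, the convolution of two pmfs is exactly what `numpy.convolve` computes, with index $i$ of each array holding the probability of the value $i$. A minimal sketch (our illustration, under that encoding assumption) reproduces the example:

```python
import numpy as np

# pmfs of X and Y on the supports {0, 1}; index i holds P(X = i), P(Y = i)
p_X = np.array([0.5, 0.5])
p_Y = np.array([0.5, 0.5])

# pmf of Z = X + Y on the support {0, 1, 2}
p_Z = np.convolve(p_X, p_Y)
print(p_Z)  # [0.25 0.5  0.25]
```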
When the two summands are continuous random variables, the probability density function (pdf) of their sum can be derived as follows.
Proposition Let $X$ and $Y$ be two independent continuous random variables and denote their respective pdfs by $f_X(x)$ and $f_Y(y)$. Let
$$Z = X + Y$$
and denote the pdf of $Z$ by $f_Z(z)$. Then,
$$f_Z(z) = \int_{-\infty}^{+\infty} f_X(z - y)\, f_Y(y)\, dy$$
or
$$f_Z(z) = \int_{-\infty}^{+\infty} f_Y(z - x)\, f_X(x)\, dx$$
Proof The distribution function of a sum of independent variables is
$$F_Z(z) = \operatorname{E}\left[ F_X(z - Y) \right] = \int_{-\infty}^{+\infty} F_X(z - y)\, f_Y(y)\, dy$$
Differentiating both sides with respect to $z$ and using the fact that the density function is the derivative of the distribution function, we obtain
$$f_Z(z) = \int_{-\infty}^{+\infty} f_X(z - y)\, f_Y(y)\, dy$$
The second formula is symmetric to the first.
The two integrals above are called convolutions (of two pdfs).
Example Let $X$ be an exponential random variable with support $R_X = [0, \infty)$ and pdf
$$f_X(x) = \begin{cases} \lambda e^{-\lambda x} & \text{if } x \ge 0 \\ 0 & \text{otherwise} \end{cases}$$
Let $Y$ be another exponential random variable, independent of $X$, with support $R_Y = [0, \infty)$ and pdf
$$f_Y(y) = \begin{cases} \lambda e^{-\lambda y} & \text{if } y \ge 0 \\ 0 & \text{otherwise} \end{cases}$$
Define
$$Z = X + Y$$
The support of $Z$ is $R_Z = [0, \infty)$. When $z \in R_Z$, the integrand $f_X(z - y)\, f_Y(y)$ is non-zero only for $0 \le y \le z$, so the pdf of $Z$ is
$$f_Z(z) = \int_{-\infty}^{+\infty} f_X(z - y)\, f_Y(y)\, dy = \int_0^z \lambda e^{-\lambda (z - y)}\, \lambda e^{-\lambda y}\, dy = \lambda^2 e^{-\lambda z} \int_0^z dy = \lambda^2 z\, e^{-\lambda z}$$
Therefore, the pdf of $Z$ is
$$f_Z(z) = \begin{cases} \lambda^2 z\, e^{-\lambda z} & \text{if } z \ge 0 \\ 0 & \text{otherwise} \end{cases}$$
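The density $\lambda^2 z e^{-\lambda z}$ is the Gamma density with shape parameter 2 and scale $1/\lambda$, a known fact about sums of two independent exponentials. A short numerical check (our sketch, assuming SciPy is installed and fixing $\lambda = 2$ for concreteness):

```python
import numpy as np
from scipy import stats

lam = 2.0  # arbitrary choice of the rate parameter for this check
z = np.linspace(0.01, 5, 100)

# pdf obtained above by convolution
f_Z = lam**2 * z * np.exp(-lam * z)

# Gamma pdf with shape 2 and scale 1/lambda
print(np.allclose(f_Z, stats.gamma.pdf(z, a=2, scale=1/lam)))  # True
```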
We have discussed above how to work out the distribution of the sum of two independent random variables.
How do we derive the distribution of the sum of more than two mutually independent random variables?
Suppose that $X_1$, $X_2$, ..., $X_n$ are $n$ mutually independent random variables and let $Z$ be their sum:
$$Z = \sum_{i=1}^{n} X_i$$
The distribution of $Z$ can be derived recursively, using the results for sums of two random variables given above:

first, define
$$Y_2 = X_1 + X_2$$
and compute the distribution of $Y_2$;

then, define
$$Y_3 = Y_2 + X_3$$
and compute the distribution of $Y_3$;

and so on, until the distribution of $Z$ can be computed from
$$Z = Y_{n-1} + X_n$$
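For discrete summands, this recursion is just a repeated convolution, so it is straightforward to carry out numerically. The sketch below (our illustration, which assumes each pmf is encoded as a NumPy array over a support $\{0, 1, \ldots, k\}$) convolves one summand at a time, exactly as in the steps above:

```python
import numpy as np

def pmf_of_sum(pmfs):
    """pmf of X_1 + ... + X_n, each pmf given on a support {0, 1, ..., k}."""
    result = pmfs[0]
    for p in pmfs[1:]:  # Y_2 = X_1 + X_2, then Y_3 = Y_2 + X_3, and so on
        result = np.convolve(result, p)
    return result

# Example: sum of three fair dice; index 0 stands for the face value 1,
# so the result is the pmf of (sum of faces - 3) on {0, ..., 15}
die = np.full(6, 1/6)
print(pmf_of_sum([die, die, die]))
```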
Below you can find some exercises with explained solutions.
Exercise 1 Let $X$ be a uniform random variable with support $R_X = [0, 1]$ and pdf
$$f_X(x) = \begin{cases} 1 & \text{if } x \in [0, 1] \\ 0 & \text{otherwise} \end{cases}$$
Let $Y$ be an exponential random variable, independent of $X$, with support $R_Y = [0, \infty)$ and pdf
$$f_Y(y) = \begin{cases} e^{-y} & \text{if } y \ge 0 \\ 0 & \text{otherwise} \end{cases}$$
Derive the pdf of the sum
$$Z = X + Y$$
Solution The support of $Z$ is $R_Z = [0, \infty)$. When $z \in R_Z$, the pdf of $Z$ is
$$f_Z(z) = \int_{-\infty}^{+\infty} f_Y(z - x)\, f_X(x)\, dx = \int_0^1 f_Y(z - x)\, dx$$
The integrand is non-zero only when $z - x \ge 0$, that is, only when $x \le z$. Thus, if $0 \le z \le 1$, then
$$f_Z(z) = \int_0^z e^{-(z - x)}\, dx = e^{-z} \left( e^{z} - 1 \right) = 1 - e^{-z}$$
while, if $z > 1$, then
$$f_Z(z) = \int_0^1 e^{-(z - x)}\, dx = e^{-z} \left( e - 1 \right)$$
Therefore, the pdf of $Z$ is
$$f_Z(z) = \begin{cases} 1 - e^{-z} & \text{if } 0 \le z \le 1 \\ \left( e - 1 \right) e^{-z} & \text{if } z > 1 \\ 0 & \text{otherwise} \end{cases}$$
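Again as a check (our Python sketch, assuming NumPy; the exponential draws use rate 1, matching $f_Y$ above), the derived density can be compared with a histogram of simulated sums:

```python
import numpy as np

rng = np.random.default_rng(0)

def f_Z(z):
    """Derived pdf of Z = X + Y (X uniform on [0,1], Y exponential, rate 1)."""
    z = np.asarray(z, dtype=float)
    return np.where(z < 0, 0.0,
           np.where(z <= 1, 1 - np.exp(-z), (np.e - 1) * np.exp(-z)))

# Histogram of one million simulated sums versus the derived density
samples = rng.uniform(size=1_000_000) + rng.exponential(size=1_000_000)
hist, edges = np.histogram(samples, bins=50, range=(0, 5), density=True)
mid = (edges[:-1] + edges[1:]) / 2
print(np.max(np.abs(hist - f_Z(mid))))  # small: sampling and binning error
```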
Exercise 2 Let $X$ be a discrete random variable with support $R_X = \{0, 1\}$ and pmf
$$p_X(x) = \begin{cases} 1/2 & \text{if } x \in R_X \\ 0 & \text{otherwise} \end{cases}$$
Let $Y$ be another discrete random variable, independent of $X$, with support $R_Y = \{0, 1, 2\}$ and pmf
$$p_Y(y) = \begin{cases} 1/3 & \text{if } y \in R_Y \\ 0 & \text{otherwise} \end{cases}$$
Derive the pmf of the sum
$$Z = X + Y$$
Solution The support of $Z$ is $R_Z = \{0, 1, 2, 3\}$. The pmf of $Z$, evaluated at $z = 0$, is
$$p_Z(0) = p_X(0)\, p_Y(0) = \frac{1}{2} \cdot \frac{1}{3} = \frac{1}{6}$$
Evaluated at $z = 1$, it is
$$p_Z(1) = p_X(1)\, p_Y(0) + p_X(0)\, p_Y(1) = \frac{1}{6} + \frac{1}{6} = \frac{1}{3}$$
Evaluated at $z = 2$, it is
$$p_Z(2) = p_X(1)\, p_Y(1) + p_X(0)\, p_Y(2) = \frac{1}{6} + \frac{1}{6} = \frac{1}{3}$$
Evaluated at $z = 3$, it is
$$p_Z(3) = p_X(1)\, p_Y(2) = \frac{1}{6}$$
Therefore, the pmf of $Z$ is
$$p_Z(z) = \begin{cases} 1/6 & \text{if } z = 0 \\ 1/3 & \text{if } z = 1 \\ 1/3 & \text{if } z = 2 \\ 1/6 & \text{if } z = 3 \\ 0 & \text{otherwise} \end{cases}$$
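As in the example above, the result can be confirmed with a discrete convolution (our sketch, with index $i$ of each array holding the probability of the value $i$):

```python
import numpy as np

p_X = np.array([1/2, 1/2])       # pmf of X on {0, 1}
p_Y = np.array([1/3, 1/3, 1/3])  # pmf of Y on {0, 1, 2}
print(np.convolve(p_X, p_Y))     # [1/6, 1/3, 1/3, 1/6] on {0, 1, 2, 3}
```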
Please cite as:
Taboga, Marco (2021). "Sums of independent random variables", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/fundamentals-of-probability/sums-of-independent-random-variables.