
Joint probability mass function

by Marco Taboga, PhD

The joint probability mass function is a function that completely characterizes the distribution of a discrete random vector. When evaluated at a given point, it gives the probability that the realization of the random vector will be equal to that point.


Synonyms and acronyms

The term joint probability function is often used as a synonym. Sometimes, the abbreviation joint pmf is used.

Definition

The following is a formal definition.

Definition Let X be a $K\times 1$ discrete random vector. Its joint probability mass function is a function $p_X:\mathbb{R}^K\rightarrow [0,1]$ such that $$p_X(x)=\mathrm{P}(X=x)$$ where $\mathrm{P}(X=x)$ is the probability that the random vector X takes the value x.

This is a straightforward multivariate generalization of the definition of the probability mass function of a discrete variable (the univariate case).

Figure: Summary of the differences between the joint probability mass function, which characterizes the distribution of a random vector, and the ordinary probability mass function, which characterizes the distribution of a random variable.

Example

Suppose X is a $2\times 1$ discrete random vector and that its support (the set of values it can take) is $$R_X=\left\{ \begin{bmatrix} 0\\ 1 \end{bmatrix},\ \begin{bmatrix} 1\\ 0 \end{bmatrix},\ \begin{bmatrix} 1\\ 1 \end{bmatrix} \right\}$$

If the three values have the same probability, then the joint probability mass function is $$p_X(x)=\begin{cases} 1/3 & \text{if } x\in R_X \\ 0 & \text{otherwise} \end{cases}$$

When the two components of X are denoted by X_1 and X_2, the joint pmf can also be written using the following alternative notation: $$p_{X_1,X_2}(x_1,x_2)=\begin{cases} 1/3 & \text{if } (x_1,x_2)\in\left\{(0,1),(1,0),(1,1)\right\} \\ 0 & \text{otherwise} \end{cases}$$
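As an informal illustration (not part of the original entry), the joint pmf of this example can be encoded in a few lines of Python; the dictionary joint_pmf and the function p_X are hypothetical names chosen here for clarity.

    # A minimal sketch of the example joint pmf: the support contains the
    # points (0, 1), (1, 0) and (1, 1), each with probability 1/3.
    from fractions import Fraction

    joint_pmf = {
        (0, 1): Fraction(1, 3),
        (1, 0): Fraction(1, 3),
        (1, 1): Fraction(1, 3),
    }

    def p_X(x1, x2):
        """Evaluate the joint pmf at the point (x1, x2)."""
        # Points outside the support have probability zero.
        return joint_pmf.get((x1, x2), Fraction(0))

    print(p_X(1, 1))  # 1/3
    print(p_X(0, 0))  # 0 (the point (0, 0) is not in the support)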

How to derive the marginals

The joint pmf can be used to derive the marginal probability mass functions of the single entries of the random vector.

Given $p_{X_1,X_2}(x_1,x_2)$, the marginal of X_1 is $$p_{X_1}(x_1)=\sum_{x_2\in R_{X_2}}p_{X_1,X_2}(x_1,x_2)$$

In order to get the entire marginal, we need to compute $p_{X_1}(x_1)$ separately for each $x_1$ belonging to the support of $X_1$.

Each of the computations involves a sum over all the possible values of X_2 (i.e., over the support $R_{X_{2}}$).

Similarly, the marginal of X_2 is $$p_{X_2}(x_2)=\sum_{x_1\in R_{X_1}}p_{X_1,X_2}(x_1,x_2)$$
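Continuing the hypothetical Python sketch above, marginalization is just a sum of the joint probabilities over the other component; the helper names below (marginal_of_X1, marginal_of_X2) are illustrative and not part of the original entry.

    def marginal_of_X1(joint_pmf):
        """Marginal pmf of X1: sum the joint pmf over all values of X2."""
        marginal = {}
        for (x1, x2), prob in joint_pmf.items():
            marginal[x1] = marginal.get(x1, 0) + prob
        return marginal

    def marginal_of_X2(joint_pmf):
        """Marginal pmf of X2: sum the joint pmf over all values of X1."""
        marginal = {}
        for (x1, x2), prob in joint_pmf.items():
            marginal[x2] = marginal.get(x2, 0) + prob
        return marginal

Note that only the points of the support need to be summed, because points with zero probability contribute nothing to the sum.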

Example

Let us derive the marginal pmf of X_1 from the joint pmf in the previous example.

The supports of X_1 and X_2 are $$R_{X_1}=\{0,1\},\qquad R_{X_2}=\{0,1\}$$

We have $$p_{X_1}(0)=p_{X_1,X_2}(0,0)+p_{X_1,X_2}(0,1)=0+\frac{1}{3}=\frac{1}{3}$$ and $$p_{X_1}(1)=p_{X_1,X_2}(1,0)+p_{X_1,X_2}(1,1)=\frac{1}{3}+\frac{1}{3}=\frac{2}{3}$$

Thus, the marginal probability mass function of X_1 is $$p_{X_1}(x_1)=\begin{cases} 1/3 & \text{if } x_1=0 \\ 2/3 & \text{if } x_1=1 \\ 0 & \text{otherwise} \end{cases}$$
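Applying the hypothetical marginal_of_X1 helper sketched above to the joint_pmf dictionary of the example reproduces these numbers:

    print(marginal_of_X1(joint_pmf))
    # {0: Fraction(1, 3), 1: Fraction(2, 3)}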

Joint pmf in tabular form

If a random vector has two entries X_1 and X_2, then its joint pmf can be written in tabular form, with the values of X_1 indexing the rows, the values of X_2 indexing the columns, and the joint probabilities in the cells.

In the next example it will become clear why the tabular form is very convenient.

Example of tabular form

Let us put in tabular form the joint pmf used in the previous examples.

                  x2 = 0    x2 = 1    Marginal of X1
x1 = 0               0        1/3          1/3
x1 = 1              1/3       1/3          2/3
Marginal of X2      1/3       2/3

We can easily obtain the marginals by summing the probabilities by column and by row.
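As a further illustration (again relying on the hypothetical helpers defined above, not on anything in the original entry), the tabular form can be printed programmatically by putting the marginal of X1 in the last column and the marginal of X2 in the last row:

    # Continuing the sketch above: print the joint pmf as a two-way table.
    x1_values = sorted({x1 for x1, _ in joint_pmf})
    x2_values = sorted({x2 for _, x2 in joint_pmf})

    header = [""] + [f"x2={v}" for v in x2_values] + ["Marg. X1"]
    print("".join(f"{cell:>10}" for cell in header))
    for x1 in x1_values:
        row = [p_X(x1, x2) for x2 in x2_values]
        cells = [f"x1={x1}"] + [str(p) for p in row] + [str(sum(row))]
        print("".join(f"{cell:>10}" for cell in cells))
    col_sums = [str(sum(p_X(x1, x2) for x1 in x1_values)) for x2 in x2_values]
    print("".join(f"{cell:>10}" for cell in ["Marg. X2"] + col_sums))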

Conditional and joint pmf

The joint pmf can also be used to derive the conditional probability mass function of the single entries of the random vector.

This is carefully explained and illustrated with examples in the glossary entry on conditional pmfs.
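As a rough sketch of the idea (the details and caveats are in the linked entry), the conditional pmf of X_1 given X_2 = x_2 is obtained by dividing the joint pmf by the marginal of X_2, provided the latter is positive. The function below builds on the hypothetical helpers defined earlier in this entry.

    def conditional_pmf_X1_given_X2(joint_pmf, x2):
        """Conditional pmf of X1 given X2 = x2: joint pmf divided by the marginal of X2."""
        denominator = marginal_of_X2(joint_pmf).get(x2, 0)
        if denominator == 0:
            raise ValueError("Cannot condition on a value with zero probability.")
        return {x1: prob / denominator
                for (x1, v), prob in joint_pmf.items() if v == x2}

    print(conditional_pmf_X1_given_X2(joint_pmf, 1))
    # {0: Fraction(1, 2), 1: Fraction(1, 2)}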

More details

For a thorough discussion of joint pmfs, go to the lecture entitled Random vectors, where discrete random vectors are introduced and where you can also find some solved exercises involving joint pmfs.


How to cite

Please cite as:

Taboga, Marco (2021). "Joint probability mass function", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/glossary/joint-probability-mass-function.
