The joint probability mass function is a function that completely characterizes the distribution of a discrete random vector. When evaluated at a given point, it gives the probability that the realization of the random vector will be equal to that point.

The term **joint probability function** is often used as a
synonym. Sometimes, the abbreviation **joint pmf** is used.

The following is a formal definition.

**Definition** Let $X$ be a $K\times 1$ discrete random vector. Its joint probability mass function is a function $p_X:\mathbb{R}^K\rightarrow[0,1]$ such that
$$p_X(x)=\mathrm{P}(X=x)$$
where $\mathrm{P}(X=x)$ is the probability that the random vector $X$ takes the value $x$.

Suppose $X$ is a discrete random vector with two components and that its support (the set of values it can take) is
$$R_X=\{(0,0),\,(1,0),\,(1,1)\}.$$
If the three values have the same probability, then the joint probability mass function is
$$p_X(x)=\begin{cases}1/3 & \text{if }x\in R_X\\ 0 & \text{otherwise.}\end{cases}$$
Denoting the two components of $X$ by $X_1$ and $X_2$, its joint pmf can also be written using the following alternative notation:
$$p_{X_1,X_2}(x_1,x_2)=\begin{cases}1/3 & \text{if }(x_1,x_2)\in R_X\\ 0 & \text{otherwise.}\end{cases}$$
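A joint pmf of this kind is straightforward to code. The following sketch assumes, for illustration, that the support consists of the three points $(0,0)$, $(1,0)$, $(1,1)$, each with probability $1/3$; any other three-point support works the same way.

```python
# Joint pmf of a discrete random vector X = (X1, X2).
# Assumed (illustrative) support: {(0,0), (1,0), (1,1)}, each point
# having the same probability 1/3.
from fractions import Fraction

support = {(0, 0), (1, 0), (1, 1)}

def joint_pmf(x1, x2):
    """Return P(X1 = x1, X2 = x2)."""
    if (x1, x2) in support:
        return Fraction(1, 3)
    return Fraction(0)

# Evaluating the pmf at a support point and at a point outside the support:
print(joint_pmf(1, 0))   # 1/3
print(joint_pmf(2, 2))   # 0

# Sanity check: probabilities over the support sum to 1, as any pmf must.
total = sum(joint_pmf(a, b) for (a, b) in support)
print(total)             # 1
```

Using exact `Fraction` arithmetic makes the sanity check (probabilities summing to exactly 1) reliable, which floating-point probabilities would not guarantee.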

This is a glossary entry. For a thorough discussion of joint pmfs, go to the
lecture entitled Random
vectors, where discrete random vectors are introduced and you can also
find some **exercises** involving joint pmfs.
