The joint distribution function completely characterizes the probability distribution of a random vector $X$. When evaluated at a point $x$, it gives the probability that each component of $X$ takes a value smaller than or equal to the respective component of $x$.

It is also called **joint cumulative distribution function**
(abbreviated as **joint cdf**).

The following is a formal definition.

Definition The joint distribution function of a random vector $X$ is a function $F_X : \mathbb{R}^K \rightarrow [0,1]$ such that $$F_X(x) = \mathrm{P}\left(X_1 \leq x_1, \ldots, X_K \leq x_K\right)$$ where the components of $X$ and $x$ are denoted by $X_k$ and $x_k$ respectively, for $k = 1, \ldots, K$.
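As a minimal sketch of the definition, consider a discrete random vector $(X_1, X_2)$ with a small, made-up joint probability mass function; the joint distribution function at a point is obtained by summing the probabilities of all support points whose components do not exceed that point. The pmf values below are hypothetical, chosen only for illustration.

```python
import numpy as np

# Hypothetical joint pmf of (X1, X2), each taking values in {0, 1, 2}:
# pmf[i, j] = P(X1 = i, X2 = j); the entries sum to 1.
pmf = np.array([
    [0.10, 0.05, 0.05],
    [0.10, 0.20, 0.10],
    [0.05, 0.15, 0.20],
])

def joint_cdf(x1, x2):
    """F(x1, x2) = P(X1 <= x1, X2 <= x2): sum the pmf over all
    support points (i, j) with i <= x1 and j <= x2."""
    i = min(int(np.floor(x1)), 2)
    j = min(int(np.floor(x2)), 2)
    if i < 0 or j < 0:
        return 0.0  # below the support, the cdf is zero
    return float(pmf[: i + 1, : j + 1].sum())

print(joint_cdf(1, 1))  # 0.10 + 0.05 + 0.10 + 0.20 = 0.45
print(joint_cdf(2, 2))  # whole support, so this equals 1.0
```

Evaluating at the largest support point returns 1, as a distribution function must.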

The joint distribution function can be used to derive the marginal distributions of the single components of the random vector (see Random vectors). It is also used to check whether two or more random variables are independent (see Independent random variables).
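The two uses just mentioned can be sketched with a simple continuous example. Assume, hypothetically, that $X_1$ and $X_2$ are independent standard exponential variables, so their joint distribution function is $F(x_1, x_2) = (1 - e^{-x_1})(1 - e^{-x_2})$ for $x_1, x_2 \geq 0$. The marginal distribution function of one component is recovered by letting the other argument grow without bound, and independence shows up as the joint cdf factoring into the product of the marginals.

```python
import math

def joint_cdf(x1, x2):
    """Joint cdf of two independent Exponential(1) variables (assumed example)."""
    if x1 < 0 or x2 < 0:
        return 0.0
    return (1 - math.exp(-x1)) * (1 - math.exp(-x2))

def marginal_cdf_x1(x1, big=1e6):
    """Marginal cdf of X1: send the other argument to (numerical) infinity."""
    return joint_cdf(x1, big)

def marginal_cdf_x2(x2, big=1e6):
    """Marginal cdf of X2, obtained the same way."""
    return joint_cdf(big, x2)

# The marginal agrees with the univariate Exponential(1) cdf 1 - e^{-x}.
print(marginal_cdf_x1(2.0))  # approximately 1 - e^{-2}

# Independence check: the joint cdf factors into the product of the marginals.
x1, x2 = 0.5, 1.5
print(joint_cdf(x1, x2) - marginal_cdf_x1(x1) * marginal_cdf_x2(x2))  # near 0
```

Replacing the limit with a large finite argument is only a numerical shortcut; analytically, $F_{X_1}(x_1) = \lim_{x_2 \to \infty} F(x_1, x_2)$.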

More details about joint distribution functions can be found in the lecture entitled Random vectors.

