Given a random vector, the probability distribution of all its components, considered together, is called the joint distribution, while the probability distribution of one of its components, considered in isolation, is called its marginal distribution.

The following is a more precise definition.

Definition Let X_i be the i-th component of a random vector X having joint distribution function F_X. The distribution function of X_i, considered in isolation, is called the marginal distribution function of X_i and it is denoted by F_{X_i}.
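As a concrete sketch of this definition for the discrete case (the joint pmf values below are made up for illustration), the marginal distribution function of one component can be computed by summing the joint probabilities of all outcomes whose relevant coordinate does not exceed the evaluation point, leaving the other coordinate unrestricted:

```python
# Sketch: marginal distribution function of the first component X1 of a
# discrete random vector (X1, X2). The joint pmf below is hypothetical,
# chosen only to illustrate the definition.
joint_pmf = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

def marginal_cdf_x1(x):
    """F_{X1}(x) = P(X1 <= x): sum the joint pmf over all points whose
    first coordinate is <= x, with X2 left unrestricted."""
    return sum(p for (x1, _x2), p in joint_pmf.items() if x1 <= x)

print(marginal_cdf_x1(0))  # P(X1 <= 0) = 0.10 + 0.20, i.e. about 0.30
print(marginal_cdf_x1(1))  # P(X1 <= 1) is the total probability, 1
```

For a continuous random vector the same idea applies directly to the joint distribution function: F_{X_i} is obtained from F_X by letting all the arguments other than the i-th tend to plus infinity.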

Marginal distribution functions play an important role in the characterization of independence between random variables: two random variables are independent if and only if their joint distribution function is equal to the product of their marginal distribution functions (see the lecture entitled Independent random variables).

Example Let X and Y be two random variables having marginal distribution functions F_X(x) = 1 - e^{-x} for x >= 0 (and 0 otherwise) and F_Y(y) = 1 - e^{-y} for y >= 0 (and 0 otherwise), and joint distribution function F_{XY}(x, y) = (1 - e^{-x})(1 - e^{-y}) for x, y >= 0 (and 0 otherwise). It is easy to check that F_{XY}(x, y) = F_X(x) F_Y(y) for any x and y, which implies that X and Y are independent.
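The independence criterion can also be checked numerically. The sketch below assumes, purely for illustration, that X and Y are independent unit exponentials; it estimates the joint distribution function from simulated samples and compares it with the product of the (known) marginal distribution functions:

```python
import math
import random

random.seed(0)
n = 100_000
# Assumed example: X and Y are independent unit-exponential variables.
xs = [random.expovariate(1.0) for _ in range(n)]
ys = [random.expovariate(1.0) for _ in range(n)]

def F_exp(t):
    """Unit-exponential CDF, serving as the marginal of both X and Y."""
    return 1 - math.exp(-t) if t >= 0 else 0.0

def empirical_joint_cdf(x, y):
    """Estimate F_XY(x, y) = P(X <= x, Y <= y) from the samples."""
    return sum(1 for a, b in zip(xs, ys) if a <= x and b <= y) / n

# Since X and Y are independent, the joint CDF should agree with the
# product of the marginal CDFs, up to sampling error.
for x, y in [(0.5, 0.5), (1.0, 2.0), (2.0, 0.3)]:
    assert abs(empirical_joint_cdf(x, y) - F_exp(x) * F_exp(y)) < 0.01
print("joint CDF matches the product of the marginals at all test points")
```

If the joint distribution function did not factor in this way at even a single point, the two variables would not be independent.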

A more detailed discussion of the marginal distribution function can be found in the lecture entitled Random vectors.
