
Continuous random variable

by Marco Taboga, PhD

A continuous random variable has two main characteristics:

  1. the set of its possible values is uncountable;

  2. we compute the probability that its value will belong to a given interval by integrating a function called probability density function.

On this page we provide a definition of continuous variable, explain it in detail, present several examples, and derive some interesting properties.


Definition

Here is a formal definition.

Definition A random variable X is said to be continuous if and only if the probability that it belongs to an interval $[a,b]$ can be expressed as an integral: $$P(a \leq X \leq b) = \int_{a}^{b} f_X(x) \, dx$$ where the integrand $f_X(x)$ is called the probability density function of X.

Dealing with integrals

If you are not familiar with integrals, you can read the lecture on the basics of integration.

In any case, it is important to remember that an integral is used to compute an area under a curve.

In the definition of a continuous variable, the integral is the area under the probability density function between $a$ and $b$.

The integral of a probability density function is the area under its plot between the two bounds of integration.

Probabilities are assigned to intervals

The first thing to note in the definition above is that the distribution of a continuous variable is characterized by assigning probabilities to intervals of numbers.

Contrast this with the fact that the distribution of a discrete variable is characterized by assigning probabilities to single numbers.

Before explaining why the distribution of a continuous variable is assigned by intervals, we present some examples and discuss some of its mathematical properties.


Here are some examples.

Example 1

Let X be a continuous random variable that can take any value in the interval $[0,1]$.

Let its probability density function be $$f_X(x) = \begin{cases} 2x & \text{if } x \in [0,1] \\ 0 & \text{otherwise} \end{cases}$$

Then, for example, the probability that X takes a value between $1/2$ and $1$ is $$P(1/2 \leq X \leq 1) = \int_{1/2}^{1} 2x \, dx = \left[ x^2 \right]_{1/2}^{1} = 1 - \frac{1}{4} = \frac{3}{4}$$

Example 2

Let X be a continuous random variable that can take any value in the interval $[0,3]$, with probability density function $$f_X(x) = \begin{cases} \dfrac{x^2}{9} & \text{if } x \in [0,3] \\ 0 & \text{otherwise} \end{cases}$$

The probability that the realization of X belongs to the interval $[0,1]$ is $$P(0 \leq X \leq 1) = \int_{0}^{1} \frac{x^2}{9} \, dx = \left[ \frac{x^3}{27} \right]_{0}^{1} = \frac{1}{27}$$

Cumulative distribution function

As a consequence of the definition above, the cumulative distribution function of a continuous variable $X$ is $$F_X(x) = P(X \leq x) = \int_{-\infty}^{x} f_X(t) \, dt$$

The derivative of an integral with respect to its upper bound of integration is equal to the integrand. Therefore, $$\frac{d F_X(x)}{dx} = f_X(x)$$

In other words, the probability density function $f_X(x)$ is the derivative of the cumulative distribution function $F_X(x)$.
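This relationship between pdf and cdf can be checked numerically. The sketch below uses the exponential distribution with rate 1 as an illustrative choice, since its cdf $F(x) = 1 - e^{-x}$ and pdf $f(x) = e^{-x}$ are both available in closed form:

```python
import math

# Exponential distribution with rate 1 (illustrative choice):
# cdf F(x) = 1 - exp(-x), pdf f(x) = exp(-x).
F = lambda x: 1.0 - math.exp(-x)
f = lambda x: math.exp(-x)

x, h = 1.3, 1e-6
# A central finite difference of the cdf approximates the pdf.
derivative = (F(x + h) - F(x - h)) / (2 * h)
print(abs(derivative - f(x)) < 1e-6)  # -> True
```

The same check works for any distribution whose cdf is differentiable at the chosen point.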

All the realizations have zero probability

Any single realization x has zero probability of being observed because $$P(X = x) = \int_{x}^{x} f_X(t) \, dt = 0$$

Therefore, in a continuous setting zero-probability events are not events that never happen. In fact, they do happen all the time: all the possible values have zero probability, but one of them must be the realized value.

This property, which may seem paradoxical, is discussed in the lecture on zero-probability events.

The support is uncountable

Another consequence of the definition given above is that the support of a continuous random variable must be uncountable.

In fact, by the previous property, if the support R_X (the set of values the variable can take) were countable, then we would have $$P(X \in R_X) = \sum_{x \in R_X} P(X = x) = \sum_{x \in R_X} 0 = 0$$ which is a contradiction, because the probability that a random variable takes at least one of its possible values must be 1.

Continuous vs discrete

In order to sharpen our understanding of continuous variables, let us highlight the main differences with respect to the discrete variables encountered so far.

The main characteristics of a discrete variable are:

  1. the set of its possible values is countable;

  2. probabilities are assigned to single values through a probability mass function.

By contrast, the main characteristics of a continuous random variable are:

  1. the set of its possible values is uncountable;

  2. probabilities are assigned to intervals through a probability density function.

Infographic summarizing the main differences between discrete and continuous variables.

Understanding the definition

Why do we define a mathematical object that has such a counterintuitive property (all possible values have zero probability)?

The short answer is that we do it for mathematical convenience.

Suppose that we are trying to model a certain variable that we see as random, for example, the proportion of atoms that exhibit a certain behavior in a physics experiment.

In general, a proportion is a number in the interval $[0,1]$.

Exploring inconvenient alternatives - Enumeration of the possible values

If we knew exactly the total number $N$ of atoms involved in the experiment, then we would also know that the proportion X could take the values $$x \in \left\{ 0, \frac{1}{N}, \frac{2}{N}, \ldots, \frac{N-1}{N}, 1 \right\} \quad (1)$$

However, in many cases the exact number of atoms involved in an experiment is not only huge, but also not known precisely.

What do we do in such cases? Can we enumerate all the possible values of X?

Theoretically, we could write down the list in (1) for every value of $N$ that we deem possible and then take the union of all the lists.

The resulting union would be a finite support for our random variable X, to which we would then need to assign probabilities.

Given that the possible values of $N$ are likely in the trillions, such an approach would be highly impractical.

Exploring inconvenient alternatives - All the rational numbers

An alternative is to consider the set of all rational numbers belonging to the interval $[0,1]$.

Remember that a rational number is the ratio of two integers. As a consequence, the set of rational numbers in $[0,1]$ includes all the possible values of the proportion X. Moreover, it is a countable set.

Thus, we can use a probability mass function to assign probabilities to it, without resorting to exotic density functions.

Unfortunately, this approach works only in special cases.

For example, suppose that all the possible values of X are deemed equally likely. There is no way to assign equal probabilities to all the values in the set of rational numbers in $[0,1]$, because it contains infinitely many numbers (the probability of each single number would have to be $1/\infty = 0$, which does not work).

A more convenient alternative - Intervals of real numbers

The third alternative is provided by continuous random variables.

We can consider the whole interval of real numbers $[0,1]$ and assign probabilities to its sub-intervals using a probability density function.

In the case in which all the values are deemed equally likely, we use a constant probability density function, equal to 1 on the whole interval (called a uniform distribution).

This brilliantly solves the problem, although we need to accept the fact that the question "What is the probability that X will take a specific value x?" does not make much sense any longer.

The questions that we can still ask are of the kind "What is the probability that X will take a value close to x?", provided that we define precisely what we mean by close in terms of an interval (e.g., $[x - \epsilon, x + \epsilon]$, where $\epsilon$ is an accuracy parameter that we choose).
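This kind of question can be explored by simulation. The sketch below draws from the standard uniform distribution and estimates the probability of landing within $\epsilon$ of a point x, which should be close to $2\epsilon$ (the values of x, $\epsilon$, and the sample size are arbitrary choices made for illustration):

```python
import random

random.seed(0)  # fixed seed for reproducibility
x, eps = 0.4, 0.05
n = 200_000

# Fraction of uniform draws landing within eps of x.
hits = sum(1 for _ in range(n) if abs(random.random() - x) <= eps)
estimate = hits / n

# For the uniform distribution, P(|X - x| <= eps) = 2 * eps = 0.1.
print(abs(estimate - 2 * eps) < 0.01)  # -> True
```

Shrinking $\epsilon$ toward zero drives the estimated probability toward zero, which is the simulation counterpart of the zero-probability property discussed above.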


Expected value

The expected value of a continuous random variable is calculated as $$\mathrm{E}[X] = \int_{-\infty}^{+\infty} x \, f_X(x) \, dx$$

See the lecture on the expected value for explanations and examples.
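As a quick numerical check of the expected value integral, the sketch below approximates it with a midpoint Riemann sum; the density $f(x) = 2x$ on $[0,1]$ is an illustrative choice, for which the exact answer is $\mathrm{E}[X] = 2/3$:

```python
def integrate(f, a, b, n=100_000):
    """Midpoint Riemann sum approximating the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# Illustrative density f(x) = 2x on [0, 1] (it integrates to 1).
f = lambda x: 2.0 * x

# E[X] = integral of x * f(x) over the support.
ev = integrate(lambda x: x * f(x), 0.0, 1.0)
print(round(ev, 4))  # -> 0.6667
```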

Higher moments and functions

The moments of a continuous variable can be computed as $$\mathrm{E}\left[X^n\right] = \int_{-\infty}^{+\infty} x^n f_X(x) \, dx$$ and the expected value of a transformation $g$ is $$\mathrm{E}\left[g(X)\right] = \int_{-\infty}^{+\infty} g(x) f_X(x) \, dx$$


The variance can be computed by first calculating moments as above and then using the variance formula $$\mathrm{Var}[X] = \mathrm{E}\left[X^2\right] - \mathrm{E}[X]^2$$
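These formulas can be verified numerically. The sketch below computes the first two moments of the standard uniform distribution (an illustrative choice) by Riemann sums and recovers its variance, which is known to be $1/12 \approx 0.0833$:

```python
def integrate(f, a, b, n=100_000):
    """Midpoint Riemann sum approximating the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# Standard uniform density on [0, 1]: f(x) = 1.
f = lambda x: 1.0

m1 = integrate(lambda x: x * f(x), 0.0, 1.0)       # first moment E[X]
m2 = integrate(lambda x: x ** 2 * f(x), 0.0, 1.0)  # second moment E[X^2]
var = m2 - m1 ** 2                                 # variance formula
print(round(var, 4))  # -> 0.0833
```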

Conditional expectation

The conditional expected value of a continuous random variable can be calculated using its conditional density function (see the lecture on conditional expectation for details and examples).

Common continuous distributions

The next table contains some examples of continuous distributions that are frequently encountered in probability theory and statistics.

Name of the continuous distribution | Support
Uniform | All the real numbers in the interval [0,1]
Normal | The whole set of real numbers
Chi-square | The set of all non-negative real numbers
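The distributions in the table can be sampled with Python's standard library; the chi-square draw below uses the fact that a sum of $k$ squared standard normals has a chi-square distribution with $k$ degrees of freedom:

```python
import random

random.seed(42)  # fixed seed for reproducibility

u = random.uniform(0.0, 1.0)  # Uniform on [0, 1]
z = random.gauss(0.0, 1.0)    # Normal with mean 0 and standard deviation 1

# Chi-square with k degrees of freedom: sum of k squared standard normals.
k = 3
chi2 = sum(random.gauss(0.0, 1.0) ** 2 for _ in range(k))

# Each draw lies in the support listed in the table.
print(0.0 <= u <= 1.0, chi2 >= 0.0)  # -> True True
```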


Continuous random variables are sometimes also called absolutely continuous.



How to cite

Please cite as:

Taboga, Marco (2021). "Continuous random variable", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix.
