Statlect: The Digital Textbook

Beta distribution

The Beta distribution is a continuous probability distribution having two parameters. One of its most common uses is to model one's uncertainty about the probability of success of an experiment.

Suppose a probabilistic experiment can have only two outcomes, either success, with probability X, or failure, with probability $1-X$. Suppose also that X is unknown and all its possible values are deemed equally likely. This uncertainty can be described by assigning to X a uniform distribution on the interval $\left[0,1\right]$. This is appropriate because X, being a probability, can take only values between 0 and 1; furthermore, the uniform distribution assigns equal probability density to all points in the interval, which reflects the fact that no possible value of X is, a priori, deemed more likely than the others. Now, suppose that we perform $n$ independent repetitions of the experiment and we observe $k$ successes and $n-k$ failures. After performing the experiments, we naturally want to know how we should revise the distribution initially assigned to X, in order to properly take into account the information provided by the observed outcomes. In other words, we want to calculate the conditional distribution of X, conditional on the number of successes and failures we have observed. The result of this calculation is a Beta distribution. In particular, the conditional distribution of X, conditional on having observed $k$ successes out of $n$ trials, is a Beta distribution with parameters $k+1$ and $n-k+1$.
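This updating rule can be checked numerically. The sketch below (with hypothetical values $n=10$, $k=7$) approximates the posterior $p(x \mid \text{data}) \propto x^k (1-x)^{n-k}$ on a grid and compares its mean with the mean of a Beta distribution with parameters $k+1$ and $n-k+1$:

```python
# Check that a uniform prior on X, updated after observing k successes in
# n Bernoulli trials, matches a Beta(k + 1, n - k + 1) distribution.
# n = 10 and k = 7 are arbitrary example values, not from the text.
n, k = 10, 7

# Midpoint-rule grid approximation of the un-normalized posterior
# p(x | data) proportional to x^k (1 - x)^(n - k).
m = 100_000
xs = [(i + 0.5) / m for i in range(m)]
w = [x**k * (1 - x) ** (n - k) for x in xs]
grid_mean = sum(x * wi for x, wi in zip(xs, w)) / sum(w)

# Mean of a Beta(a, b) distribution is a / (a + b).
a, b = k + 1, n - k + 1
beta_mean = a / (a + b)

print(grid_mean, beta_mean)
```

The two means agree up to the grid-approximation error, as the proposition on the Beta-binomial relation later in this lecture guarantees.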

Definition

The Beta distribution is characterized as follows.

Definition Let X be an absolutely continuous random variable. Let its support be the unit interval: $$R_X = \left[0,1\right]$$ Let $\alpha, \beta > 0$. We say that X has a Beta distribution with shape parameters $\alpha$ and $\beta$ if its probability density function is $$f_X(x) = \begin{cases} \dfrac{1}{B(\alpha,\beta)}\, x^{\alpha-1}(1-x)^{\beta-1} & \text{if } x \in \left[0,1\right] \\ 0 & \text{otherwise} \end{cases}$$ where $B(\alpha,\beta)$ is the Beta function.

A random variable having a Beta distribution is also called a Beta random variable.

The following is a proof that $f_X(x)$ is a legitimate probability density function.

Proof

Non-negativity descends from the facts that $x^{\alpha-1}(1-x)^{\beta-1}$ is non-negative when $x \in \left[0,1\right]$ and $\alpha, \beta > 0$, and that $1/B(\alpha,\beta)$ is strictly positive (it is a ratio of Gamma functions, which are strictly positive when their arguments are strictly positive - see the lecture entitled Gamma function). That the integral of $f_X(x)$ over $\mathbb{R}$ equals 1 is proved as follows: $$\int_{-\infty}^{\infty} f_X(x)\,dx = \int_0^1 \frac{1}{B(\alpha,\beta)}\, x^{\alpha-1}(1-x)^{\beta-1}\,dx = \frac{1}{B(\alpha,\beta)} \int_0^1 x^{\alpha-1}(1-x)^{\beta-1}\,dx = \frac{B(\alpha,\beta)}{B(\alpha,\beta)} = 1$$ where we have used the integral representation $$B(\alpha,\beta) = \int_0^1 x^{\alpha-1}(1-x)^{\beta-1}\,dx$$ a proof of which can be found in the lecture entitled Beta function.

Expected value

The expected value of a Beta random variable X is $$E[X] = \frac{\alpha}{\alpha+\beta}$$

Proof

It can be derived as follows: $$E[X] = \int_0^1 x\,\frac{1}{B(\alpha,\beta)}\,x^{\alpha-1}(1-x)^{\beta-1}\,dx = \frac{1}{B(\alpha,\beta)} \int_0^1 x^{\alpha}(1-x)^{\beta-1}\,dx = \frac{B(\alpha+1,\beta)}{B(\alpha,\beta)} = \frac{\Gamma(\alpha+1)\,\Gamma(\beta)}{\Gamma(\alpha+\beta+1)} \cdot \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\,\Gamma(\beta)} = \frac{\alpha\,\Gamma(\alpha)}{(\alpha+\beta)\,\Gamma(\alpha+\beta)} \cdot \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)} = \frac{\alpha}{\alpha+\beta}$$
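The central step of the proof, $E[X] = B(\alpha+1,\beta)/B(\alpha,\beta)$, can be verified numerically. A minimal sketch, with $\alpha = 2$ and $\beta = 5$ as arbitrary example parameters:

```python
import math

def beta_fn(a, b):
    # Beta function: B(a, b) = Gamma(a) Gamma(b) / Gamma(a + b)
    return math.gamma(a) * math.gamma(b) / math.gamma(a + b)

alpha, beta = 2.0, 5.0  # hypothetical shape parameters

# Key step of the proof: E[X] = B(alpha + 1, beta) / B(alpha, beta)
mean_from_ratio = beta_fn(alpha + 1, beta) / beta_fn(alpha, beta)

# Closed form stated in the text: alpha / (alpha + beta)
mean_closed_form = alpha / (alpha + beta)

print(mean_from_ratio, mean_closed_form)
```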

Variance

The variance of a Beta random variable X is $$\mathrm{Var}[X] = \frac{\alpha\beta}{(\alpha+\beta)^2(\alpha+\beta+1)}$$

Proof

It can be derived thanks to the usual variance formula ($\mathrm{Var}[X] = E[X^2] - E[X]^2$). First, $$E[X^2] = \frac{B(\alpha+2,\beta)}{B(\alpha,\beta)} = \frac{\alpha(\alpha+1)}{(\alpha+\beta)(\alpha+\beta+1)}$$ so that $$\mathrm{Var}[X] = \frac{\alpha(\alpha+1)}{(\alpha+\beta)(\alpha+\beta+1)} - \frac{\alpha^2}{(\alpha+\beta)^2} = \frac{\alpha(\alpha+1)(\alpha+\beta) - \alpha^2(\alpha+\beta+1)}{(\alpha+\beta)^2(\alpha+\beta+1)} = \frac{\alpha\beta}{(\alpha+\beta)^2(\alpha+\beta+1)}$$
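The variance formula can likewise be checked against the raw moments computed from Beta-function ratios. A sketch, again with the arbitrary example parameters $\alpha = 2$, $\beta = 5$:

```python
import math

def beta_fn(a, b):
    # Beta function: B(a, b) = Gamma(a) Gamma(b) / Gamma(a + b)
    return math.gamma(a) * math.gamma(b) / math.gamma(a + b)

alpha, beta = 2.0, 5.0  # hypothetical shape parameters

# Raw moments via the ratio E[X^k] = B(alpha + k, beta) / B(alpha, beta)
ex1 = beta_fn(alpha + 1, beta) / beta_fn(alpha, beta)
ex2 = beta_fn(alpha + 2, beta) / beta_fn(alpha, beta)

# Usual variance formula versus the closed form in the text
var_from_moments = ex2 - ex1**2
var_closed_form = alpha * beta / ((alpha + beta) ** 2 * (alpha + beta + 1))

print(var_from_moments, var_closed_form)
```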

Higher moments

The $k$-th moment of a Beta random variable X is $$E[X^k] = \prod_{i=0}^{k-1} \frac{\alpha+i}{\alpha+\beta+i}$$

Proof

By the definition of moment, we have $$E[X^k] = \int_0^1 x^k\,\frac{1}{B(\alpha,\beta)}\,x^{\alpha-1}(1-x)^{\beta-1}\,dx = \frac{B(\alpha+k,\beta)}{B(\alpha,\beta)} = \frac{\Gamma(\alpha+k)\,\Gamma(\beta)}{\Gamma(\alpha+\beta+k)} \cdot \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\,\Gamma(\beta)} \overset{A}{=} \prod_{i=0}^{k-1} \frac{\alpha+i}{\alpha+\beta+i}$$ where in step $A$ we have used recursively the fact that $\Gamma(z+1) = z\,\Gamma(z)$.

Moment generating function

The moment generating function of a Beta random variable X is defined for any $t$ and it is $$M_X(t) = 1 + \sum_{k=1}^{\infty} \left( \prod_{i=0}^{k-1} \frac{\alpha+i}{\alpha+\beta+i} \right) \frac{t^k}{k!}$$

Proof

By using the definition of moment generating function, we obtain $$M_X(t) = E[e^{tX}] = \int_0^1 e^{tx} f_X(x)\,dx = \int_0^1 \left( \sum_{k=0}^{\infty} \frac{t^k x^k}{k!} \right) f_X(x)\,dx = \sum_{k=0}^{\infty} \frac{t^k}{k!}\, E[X^k] = 1 + \sum_{k=1}^{\infty} \left( \prod_{i=0}^{k-1} \frac{\alpha+i}{\alpha+\beta+i} \right) \frac{t^k}{k!}$$ Note that the moment generating function exists and is well defined for any $t$ because the integral $$\int_0^1 e^{tx} f_X(x)\,dx$$ is guaranteed to exist and be finite, since the integrand $$e^{tx} f_X(x)$$ is continuous in $x$ over the bounded interval $\left[0,1\right]$.

The above formula for the moment generating function might seem impractical to compute, because it involves an infinite sum as well as products whose number of terms increases indefinitely. However, the function $${}_1F_1(\alpha;\alpha+\beta;t) = 1 + \sum_{k=1}^{\infty} \left( \prod_{i=0}^{k-1} \frac{\alpha+i}{\alpha+\beta+i} \right) \frac{t^k}{k!}$$ is known as the confluent hypergeometric function of the first kind and has been extensively studied in many branches of mathematics. Its properties are well known, and efficient algorithms for its computation are available in most software packages for scientific computation.
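In practice the series converges fast. A sketch that compares a truncated series for ${}_1F_1(\alpha;\alpha+\beta;t)$ with a direct numerical integration of $E[e^{tX}]$, using the hypothetical values $\alpha = 2$, $\beta = 5$, $t = 1.5$:

```python
import math

def beta_fn(a, b):
    # Beta function: B(a, b) = Gamma(a) Gamma(b) / Gamma(a + b)
    return math.gamma(a) * math.gamma(b) / math.gamma(a + b)

alpha, beta, t = 2.0, 5.0, 1.5  # hypothetical parameters

# Truncated series: 60 terms are far more than needed at this t, since
# each term is bounded by t^k / k!.
mgf_series, term_moment = 1.0, 1.0
for k in range(1, 60):
    term_moment *= (alpha + k - 1) / (alpha + beta + k - 1)
    mgf_series += term_moment * t**k / math.factorial(k)

# Direct midpoint-rule integration of E[exp(tX)] over [0, 1].
m = 200_000
total = 0.0
for i in range(m):
    x = (i + 0.5) / m
    total += math.exp(t * x) * x ** (alpha - 1) * (1 - x) ** (beta - 1)
mgf_numeric = total / (m * beta_fn(alpha, beta))

print(mgf_series, mgf_numeric)
```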

Characteristic function

The characteristic function of a Beta random variable X is $$\varphi_X(t) = {}_1F_1(\alpha;\alpha+\beta;it)$$

Proof

The derivation of the characteristic function is almost identical to the derivation of the moment generating function (just replace $t$ with $it$ in that proof).

The comments made about the moment generating function, including those about the computation of the confluent hypergeometric function, also apply to the characteristic function.

Distribution function

The distribution function of a Beta random variable X is $$F_X(x) = \begin{cases} 0 & \text{if } x < 0 \\ I_x(\alpha,\beta) & \text{if } 0 \le x \le 1 \\ 1 & \text{if } x > 1 \end{cases}$$ where the function $$I_x(\alpha,\beta) = \frac{1}{B(\alpha,\beta)} \int_0^x t^{\alpha-1}(1-t)^{\beta-1}\,dt$$ is called the regularized incomplete Beta function and is usually computed by means of specialized computer algorithms.

Proof

For $x<0$, $F_X(x) = 0$, because X cannot be smaller than 0. For $x>1$, $F_X(x) = 1$, because X is always smaller than or equal to 1. For $0 \le x \le 1$, $$F_X(x) = \int_0^x f_X(t)\,dt = \frac{1}{B(\alpha,\beta)} \int_0^x t^{\alpha-1}(1-t)^{\beta-1}\,dt = I_x(\alpha,\beta)$$
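When the parameters are integers, the distribution function also equals a binomial tail probability, $I_x(a,b) = P\left(\mathrm{Bin}(a+b-1, x) \ge a\right)$, a standard identity not covered in this lecture. A sketch comparing that identity with direct numerical integration of the density, using the hypothetical values $a = 3$, $b = 5$, $x = 0.4$:

```python
import math

def beta_cdf_numeric(x, a, b, m=100_000):
    # Midpoint-rule integration of the Beta(a, b) density over [0, x].
    norm = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
    total = 0.0
    for i in range(m):
        t = (i + 0.5) * x / m
        total += t ** (a - 1) * (1 - t) ** (b - 1)
    return total * (x / m) / norm

def beta_cdf_binomial(x, a, b):
    # For integer a, b:  I_x(a, b) = P(Binomial(a + b - 1, x) >= a)
    n = a + b - 1
    return sum(math.comb(n, j) * x**j * (1 - x) ** (n - j)
               for j in range(a, n + 1))

a, b, x = 3, 5, 0.4  # hypothetical integer parameters and evaluation point
print(beta_cdf_numeric(x, a, b), beta_cdf_binomial(x, a, b))
```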

More details

In the following subsections you can find more details about the Beta distribution.

Relation to the uniform distribution

The following proposition states the relation between the Beta and the uniform distributions.

Proposition A Beta distribution with parameters $\alpha = 1$ and $\beta = 1$ is a uniform distribution on the interval $\left[0,1\right]$.

Proof

When $\alpha = 1$ and $\beta = 1$, we have that $$B(1,1) = \frac{\Gamma(1)\,\Gamma(1)}{\Gamma(2)} = 1$$ Therefore, the probability density function of a Beta distribution with parameters $\alpha = 1$ and $\beta = 1$ can be written as $$f_X(x) = \begin{cases} 1 & \text{if } x \in \left[0,1\right] \\ 0 & \text{otherwise} \end{cases}$$ But the latter is the probability density function of a uniform distribution on the interval $\left[0,1\right]$.
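A quick sketch confirming that the Beta density with $\alpha = \beta = 1$ is constant and equal to 1 on the interior of the unit interval:

```python
import math

def beta_pdf(x, a, b):
    # Beta(a, b) density on [0, 1]
    norm = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
    return x ** (a - 1) * (1 - x) ** (b - 1) / norm

# With alpha = beta = 1 the density reduces to the uniform density on [0, 1].
values = [beta_pdf(x / 10, 1, 1) for x in range(1, 10)]
print(values)
```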

Relation to the binomial distribution

The following proposition states the relation between the Beta and the binomial distributions.

Proposition Suppose X is a random variable having a Beta distribution with parameters $\alpha$ and $\beta$. Let Y be another random variable such that its distribution conditional on X is a binomial distribution with parameters $n$ and X. Then, the conditional distribution of X given $Y=y$ is a Beta distribution with parameters $\alpha+y$ and $\beta+n-y$.

Proof

We are dealing with one continuous random variable X and one discrete random variable Y (together, they form what is called a random vector with mixed coordinates). With a slight abuse of notation, we will proceed as if Y were also continuous, treating its probability mass function as if it were a probability density function. Rest assured that this can be made fully rigorous (by defining a probability density function with respect to a counting measure on the support of Y). By assumption, Y has a binomial distribution conditional on X, so that its conditional probability mass function is $$f_{Y\mid X}(y\mid x) = \binom{n}{y}\, x^y (1-x)^{n-y}$$ where $\binom{n}{y}$ is a binomial coefficient. Also, by assumption X has a Beta distribution, so that its probability density function is $$f_X(x) = \frac{1}{B(\alpha,\beta)}\, x^{\alpha-1}(1-x)^{\beta-1}$$ for $x \in \left[0,1\right]$. Therefore, the joint probability density function of X and Y is $$f_{XY}(x,y) = f_{Y\mid X}(y\mid x)\, f_X(x) = \binom{n}{y}\, \frac{1}{B(\alpha,\beta)}\, x^{\alpha+y-1}(1-x)^{\beta+n-y-1}$$ Thus, we have factored the joint probability density function as $$f_{XY}(x,y) = g(x,y)\, h(y)$$ where $$g(x,y) = \frac{1}{B(\alpha+y,\beta+n-y)}\, x^{\alpha+y-1}(1-x)^{\beta+n-y-1}$$ is the probability density function of a Beta distribution with parameters $\alpha+y$ and $\beta+n-y$, and the function $$h(y) = \binom{n}{y}\, \frac{B(\alpha+y,\beta+n-y)}{B(\alpha,\beta)}$$ does not depend on $x$. By a result proved in the lecture entitled Factorization of joint probability density functions, this implies that the probability density function of X given $Y=y$ is $$f_{X\mid Y}(x\mid y) = g(x,y)$$ Thus, as we wanted to demonstrate, the conditional distribution of X given $Y=y$ is a Beta distribution with parameters $\alpha+y$ and $\beta+n-y$.
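The factorization argument can be checked numerically: multiplying the prior density by the binomial likelihood and renormalizing should reproduce the claimed Beta posterior pointwise. A sketch with the hypothetical values $\alpha = 2$, $\beta = 5$, $n = 20$, $y = 6$:

```python
import math

def beta_pdf(x, a, b):
    # Beta(a, b) density on [0, 1]
    norm = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
    return x ** (a - 1) * (1 - x) ** (b - 1) / norm

alpha, beta, n, y = 2.0, 5.0, 20, 6  # hypothetical example values

# Un-normalized posterior: prior times binomial likelihood (the binomial
# coefficient cancels in the normalization, so it is omitted).
m = 50_000
xs = [(i + 0.5) / m for i in range(m)]
post = [beta_pdf(x, alpha, beta) * x**y * (1 - x) ** (n - y) for x in xs]
z = sum(post) / m                    # midpoint-rule normalizing constant
post = [p / z for p in post]

# Maximum pointwise gap versus the claimed Beta(alpha + y, beta + n - y) pdf
gap = max(abs(p - beta_pdf(x, alpha + y, beta + n - y))
          for x, p in zip(xs, post))
print(gap)
```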

By combining this proposition and the previous one, we obtain the following corollary.

Proposition Suppose X is a random variable having a uniform distribution. Let Y be another random variable such that its distribution conditional on X is a binomial distribution with parameters n and X. Then, the conditional distribution of X given Y=y is a Beta distribution with parameters $1+y$ and $1+n-y$.

This proposition constitutes a formal statement of what we said in the introduction of this lecture in order to motivate the Beta distribution. Remember that the number of successes obtained in n independent repetitions of a random experiment having probability of success X is a binomial random variable with parameters X and n. According to the proposition above, when the probability of success X is a priori unknown and all possible values of X are deemed equally likely (they have a uniform distribution), observing the outcome of the n experiments leads us to revise the distribution assigned to X, and the result of this revision is a Beta distribution.

Solved exercises

Below you can find some exercises with explained solutions.

Exercise 1

A production plant produces items that have a probability X of being defective. The plant manager does not know X, but from past experience she expects this probability to be equal to 4%. Furthermore, she quantifies her uncertainty about X by attaching a standard deviation of 2% to her 4% estimate. After consulting with an expert in statistics, the manager decides to use a Beta distribution to model her uncertainty about X. How should she set the two parameters of the distribution in order to match her priors about the expected value and the standard deviation of X?

Solution

We know that the expected value of a Beta random variable with parameters $\alpha$ and $\beta$ is $$E[X] = \frac{\alpha}{\alpha+\beta}$$ while its variance is $$\mathrm{Var}[X] = \frac{\alpha\beta}{(\alpha+\beta)^2(\alpha+\beta+1)}$$ The two parameters need to be set in such a way that $$E[X] = \mu, \qquad \mathrm{Var}[X] = \sigma^2$$ where for notational convenience we have set $\mu = 0.04$ and $\sigma^2 = 0.0004$. This is accomplished by finding a solution to the following system of two equations in two unknowns: $$\frac{\alpha}{\alpha+\beta} = \mu, \qquad \frac{\alpha\beta}{(\alpha+\beta)^2(\alpha+\beta+1)} = \sigma^2$$ The first equation gives $$\alpha = \mu(\alpha+\beta)$$ or $$\beta = \frac{1-\mu}{\mu}\,\alpha$$ which also implies $\alpha+\beta = \alpha/\mu$. By substituting this into the second equation, we get $$\frac{\alpha^2(1-\mu)/\mu}{(\alpha/\mu)^2\left(\alpha/\mu + 1\right)} = \sigma^2$$ Then we divide the numerator and the denominator on the left-hand side by $\alpha^2$ and simplify: $$\frac{(1-\mu)/\mu}{(\alpha+\mu)/\mu^3} = \sigma^2 \qquad \text{or} \qquad \frac{\mu^2(1-\mu)}{\alpha+\mu} = \sigma^2$$ By taking the reciprocals of both sides and multiplying by $\mu^2(1-\mu)$, we obtain $$\alpha + \mu = \frac{\mu^2(1-\mu)}{\sigma^2}$$ Thus the value of $\alpha$ is $$\alpha = \frac{\mu^2(1-\mu)}{\sigma^2} - \mu$$ and the value of $\beta$ is $$\beta = \frac{1-\mu}{\mu}\,\alpha$$ By plugging our numerical values into the two formulae, we obtain $$\alpha = \frac{0.04^2 \times 0.96}{0.0004} - 0.04 = 3.8, \qquad \beta = \frac{0.96}{0.04} \times 3.8 = 91.2$$
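The moment-matching solution is easy to verify in code: compute $\alpha$ and $\beta$ from the two closed-form expressions and check that they reproduce the manager's prior mean and variance.

```python
# Moment matching for the exercise: recover alpha and beta from the prior
# mean mu = 0.04 and variance sigma2 = 0.0004, then check both moments.
mu, sigma2 = 0.04, 0.0004

alpha = mu**2 * (1 - mu) / sigma2 - mu
beta = (1 - mu) / mu * alpha

mean = alpha / (alpha + beta)
var = alpha * beta / ((alpha + beta) ** 2 * (alpha + beta + 1))
print(alpha, beta, mean, var)
```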

Exercise 2

After choosing the parameters of the Beta distribution so as to represent her priors about the probability of producing a defective item (see previous exercise), the plant manager now wants to update her priors by observing new data. She decides to inspect a production lot of 100 items, and she finds that 3 of the items in the lot are defective. How should she change the parameters of the Beta distribution in order to take this new information into account?

Solution

Under the hypothesis that the items are produced independently of each other, the result of the inspection is a binomial random variable with parameters $n = 100$ and $p = X$. But updating a Beta distribution based on the outcome of a binomial random variable gives as a result another Beta distribution. Moreover, the two parameters $\alpha_1$ and $\beta_1$ of the updated Beta distribution are $$\alpha_1 = \alpha + 3 = 3.8 + 3 = 6.8, \qquad \beta_1 = \beta + 100 - 3 = 91.2 + 97 = 188.2$$
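As a sketch, the conjugate update amounts to two additions: the observed defective items go into the first parameter and the non-defective items into the second.

```python
# Conjugate Beta-binomial update for the exercise: add the number of
# defective items to alpha and the number of non-defective items to beta.
alpha0, beta0 = 3.8, 91.2      # prior parameters found in Exercise 1
n, defective = 100, 3          # inspection outcome

alpha1 = alpha0 + defective
beta1 = beta0 + (n - defective)
print(alpha1, beta1)
```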

Exercise 3

After updating the parameters of the Beta distribution (see previous exercise), the plant manager wants to compute again the expected value and the standard deviation of the probability of finding a defective item. Can you help her?

Solution

We just need to use the formulae for the expected value and the variance of a Beta distribution: $$E[X] = \frac{\alpha}{\alpha+\beta}, \qquad \mathrm{Var}[X] = \frac{\alpha\beta}{(\alpha+\beta)^2(\alpha+\beta+1)}$$ and plug in the new values we have found for $\alpha$ and $\beta$, that is, $\alpha_1 = 6.8$ and $\beta_1 = 188.2$. The result is $$E[X] = \frac{6.8}{6.8+188.2} \approx 0.0349, \qquad \mathrm{Var}[X] = \frac{6.8 \times 188.2}{195^2 \times 196} \approx 0.000172, \qquad \sqrt{\mathrm{Var}[X]} \approx 0.0131$$
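The same computation as a short sketch:

```python
import math

# Posterior mean and standard deviation with the parameters updated in
# the previous exercise.
alpha1, beta1 = 6.8, 188.2

mean = alpha1 / (alpha1 + beta1)
var = alpha1 * beta1 / ((alpha1 + beta1) ** 2 * (alpha1 + beta1 + 1))
sd = math.sqrt(var)
print(round(mean, 6), round(sd, 6))
```

The inspection has pulled the expected defect rate down from 4% to roughly 3.5% and sharpened the manager's uncertainty from 2% to roughly 1.3%.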
