Consider an experiment having two possible outcomes: either success or failure. Suppose the experiment is repeated several times and the repetitions are independent of each other. The total number of experiments where the outcome turns out to be a success is a random variable whose distribution is called binomial distribution. The distribution has two parameters: the number of repetitions of the experiment and the probability of success of an individual experiment.

A binomial distribution can be seen as a sum of mutually independent Bernoulli random variables that take value 1 in case of success of the experiment and value 0 otherwise. This connection between the binomial and Bernoulli distributions will be illustrated in detail in the remainder of this lecture and will be used to prove several properties of the binomial distribution.

The binomial distribution is characterized as follows.

Definition
Let $X$ be a discrete random variable. Let $n \in \mathbb{N}$ and $p \in (0,1)$. Let the support of $X$ be $$R_X = \{0, 1, \ldots, n\}.$$ We say that $X$ has a **binomial distribution** with parameters $n$ and $p$ if its probability mass function is $$p_X(x) = \begin{cases} \binom{n}{x} p^x (1-p)^{n-x} & \text{if } x \in R_X \\ 0 & \text{otherwise} \end{cases}$$ where $\binom{n}{x}$ is a binomial coefficient.

The following is a proof that $p_X(x)$ is a legitimate probability mass function.

Proof

Non-negativity is obvious. We need to prove that the sum of $p_X(x)$ over its support equals $1$. This is proved as follows: $$\sum_{x=0}^{n} \binom{n}{x} p^x (1-p)^{n-x} = (p + 1 - p)^n = 1^n = 1$$ where we have used the usual formula for binomial expansions: $$(a+b)^n = \sum_{x=0}^{n} \binom{n}{x} a^x b^{n-x}.$$
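As an illustration, the probability mass function can be coded with Python's standard library (a minimal sketch; the function name is my own) and checked to sum to $1$ over the support:

```python
from math import comb

def binomial_pmf(x: int, n: int, p: float) -> float:
    """Probability mass function of the binomial distribution."""
    if 0 <= x <= n:
        return comb(n, x) * p**x * (1 - p)**(n - x)
    return 0.0  # zero outside the support {0, 1, ..., n}

# The pmf sums to 1 over the support, as the proof shows:
n, p = 10, 0.3
total = sum(binomial_pmf(x, n, p) for x in range(n + 1))
```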

The binomial distribution is intimately related to the Bernoulli distribution. The following propositions show how.

Proposition If a random variable $X$ has a binomial distribution with parameters $n$ and $p$, with $n = 1$, then $X$ has a Bernoulli distribution with parameter $p$.

Proof

The probability mass function of $X$ is $$p_X(x) = \begin{cases} \binom{1}{x} p^x (1-p)^{1-x} & \text{if } x \in \{0,1\} \\ 0 & \text{otherwise} \end{cases}$$ but $$\binom{1}{0} = \frac{1!}{0!\,1!} = 1$$ and $$\binom{1}{1} = \frac{1!}{1!\,0!} = 1.$$ Therefore, the probability mass function can be written as $$p_X(x) = \begin{cases} p & \text{if } x = 1 \\ 1 - p & \text{if } x = 0 \\ 0 & \text{otherwise} \end{cases}$$ which is the probability mass function of a Bernoulli random variable.

Proposition If a random variable $X$ has a binomial distribution with parameters $n$ and $p$, then $X$ is a sum of $n$ jointly independent Bernoulli random variables with parameter $p$.

Proof

We prove it by induction. So, we have to prove that it is true for $n = 1$ and for a generic $n$, given that it is true for $n - 1$. For $n = 1$, it has been proved in the proposition above (the binomial distribution with parameter $n = 1$ is a Bernoulli distribution). Now, suppose the claim is true for a generic $n - 1$. We have to verify that $$X = \sum_{i=1}^{n} Y_i$$ is a binomial random variable, where $Y_1$, $Y_2$, ..., $Y_n$ are independent Bernoulli random variables with parameter $p$. Since the claim is true for $n - 1$, this is tantamount to verifying that $$X = W + Y_n$$ is a binomial random variable, where $$W = \sum_{i=1}^{n-1} Y_i$$ has a binomial distribution with parameters $n - 1$ and $p$. Using the convolution formula, we can compute the probability mass function of $X$: $$p_X(x) = \sum_{y=0}^{1} p_{Y_n}(y)\, p_W(x - y) = p\, p_W(x - 1) + (1 - p)\, p_W(x).$$ If $0 < x < n$, then $$p_X(x) = p \binom{n-1}{x-1} p^{x-1} (1-p)^{(n-1)-(x-1)} + (1-p) \binom{n-1}{x} p^{x} (1-p)^{(n-1)-x} = \left[ \binom{n-1}{x-1} + \binom{n-1}{x} \right] p^x (1-p)^{n-x} = \binom{n}{x} p^x (1-p)^{n-x}$$ where the last equality is the recursive formula for binomial coefficients. If $x = 0$, then $$p_X(0) = (1-p)\, p_W(0) = (1-p)(1-p)^{n-1} = \binom{n}{0} p^0 (1-p)^{n}.$$ Finally, if $x = n$, then $$p_X(n) = p\, p_W(n-1) = p \cdot p^{n-1} = \binom{n}{n} p^n (1-p)^{0}.$$ Therefore, for $x \in \{0, 1, \ldots, n\}$ we have $$p_X(x) = \binom{n}{x} p^x (1-p)^{n-x}$$ and $p_X(x) = 0$ otherwise, which is the probability mass function of a binomial random variable with parameters $n$ and $p$. This completes the proof.
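The induction step can be mirrored numerically: repeatedly convolving a Bernoulli probability mass function with itself, exactly as in the step $X = W + Y_n$, reproduces the closed-form binomial probabilities. A sketch in Python (function and variable names are my own):

```python
from math import comb

def pmf_by_convolution(n: int, p: float) -> list:
    """Build the binomial pmf by convolving n Bernoulli pmfs,
    mirroring the induction step X = W + Y_n in the proof."""
    pmf = [1.0]  # pmf of the empty sum: all mass at 0
    for _ in range(n):
        new = [0.0] * (len(pmf) + 1)
        for w, mass in enumerate(pmf):
            new[w] += (1 - p) * mass   # next Bernoulli draw equals 0
            new[w + 1] += p * mass     # next Bernoulli draw equals 1
        pmf = new
    return pmf

n, p = 6, 0.4
convolved = pmf_by_convolution(n, p)
closed_form = [comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]
```

The two lists agree term by term, which is what the proposition asserts.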

The expected value of a binomial random variable $X$ is $$\mathrm{E}[X] = np.$$

Proof

It can be derived as follows, representing $X$ as a sum of $n$ jointly independent Bernoulli random variables $Y_1$, ..., $Y_n$ with parameter $p$: $$\mathrm{E}[X] = \mathrm{E}\!\left[ \sum_{i=1}^{n} Y_i \right] = \sum_{i=1}^{n} \mathrm{E}[Y_i] = \sum_{i=1}^{n} p = np.$$

The variance of a binomial random variable $X$ is $$\mathrm{Var}[X] = np(1-p).$$

Proof

Representing $X$ as a sum of $n$ jointly independent Bernoulli random variables $Y_1$, ..., $Y_n$, we get $$\mathrm{Var}[X] = \mathrm{Var}\!\left[ \sum_{i=1}^{n} Y_i \right] = \sum_{i=1}^{n} \mathrm{Var}[Y_i] = \sum_{i=1}^{n} p(1-p) = np(1-p)$$ where the variances add because the $Y_i$ are independent.
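Both moments can be checked exactly from the probability mass function, without simulation. A short sketch in Python (illustrative parameter values):

```python
from math import comb

n, p = 12, 0.25
pmf = [comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]

# Mean and variance computed directly from the pmf:
mean = sum(x * m for x, m in enumerate(pmf))                # should equal n*p = 3
var = sum((x - mean) ** 2 * m for x, m in enumerate(pmf))   # should equal n*p*(1-p) = 2.25
```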

The moment generating function of a binomial random variable $X$ is defined for any $t \in \mathbb{R}$: $$M_X(t) = (1 - p + p e^{t})^{n}.$$

Proof

This is proved as follows: $$M_X(t) = \mathrm{E}[e^{tX}] = \mathrm{E}\!\left[ \exp\!\left( t \sum_{i=1}^{n} Y_i \right) \right] = \mathrm{E}\!\left[ \prod_{i=1}^{n} e^{t Y_i} \right] = \prod_{i=1}^{n} \mathrm{E}[e^{t Y_i}] = (1 - p + p e^{t})^{n}$$ where we have used the independence of the Bernoulli random variables $Y_1$, ..., $Y_n$ and the fact that the moment generating function of a Bernoulli random variable is $1 - p + p e^{t}$. Since the moment generating function of a Bernoulli random variable exists for any $t \in \mathbb{R}$, also the moment generating function of a binomial random variable exists for any $t \in \mathbb{R}$.
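The closed form can be verified numerically: computing $\mathrm{E}[e^{tX}]$ directly from the probability mass function gives the same value as $(1 - p + pe^t)^n$. A sketch (function name is my own):

```python
from math import comb, exp

def binomial_mgf(t: float, n: int, p: float) -> float:
    """Closed-form moment generating function: (1 - p + p*e^t)^n."""
    return (1 - p + p * exp(t)) ** n

n, p, t = 5, 0.6, 0.7
# Direct computation of E[exp(t*X)] as a sum over the support:
direct = sum(comb(n, x) * p**x * (1 - p)**(n - x) * exp(t * x)
             for x in range(n + 1))
```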

The characteristic function of a binomial random variable $X$ is $$\varphi_X(t) = (1 - p + p e^{it})^{n}.$$

Proof

Again, we are going to use the fact that a binomial random variable with parameters $n$ and $p$ is a sum of $n$ independent Bernoulli random variables: $$\varphi_X(t) = \mathrm{E}[e^{itX}] = \prod_{i=1}^{n} \mathrm{E}[e^{it Y_i}] = (1 - p + p e^{it})^{n}$$ where $1 - p + p e^{it}$ is the characteristic function of each Bernoulli random variable $Y_i$.
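The same numerical cross-check works for the characteristic function, using complex arithmetic from Python's `cmath` module (function name is my own):

```python
from cmath import exp as cexp
from math import comb

def binomial_cf(t: float, n: int, p: float) -> complex:
    """Closed-form characteristic function: (1 - p + p*e^{it})^n."""
    return (1 - p + p * cexp(1j * t)) ** n

n, p, t = 8, 0.35, 1.3
# Direct computation of E[exp(i*t*X)] as a sum over the support:
direct = sum(comb(n, x) * p**x * (1 - p)**(n - x) * cexp(1j * t * x)
             for x in range(n + 1))
```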

The distribution function of a binomial random variable $X$ is $$F_X(x) = \begin{cases} 0 & \text{if } x < 0 \\ \sum_{s=0}^{\lfloor x \rfloor} \binom{n}{s} p^s (1-p)^{n-s} & \text{if } 0 \le x \le n \\ 1 & \text{if } x > n \end{cases}$$ where $\lfloor x \rfloor$ is the floor of $x$, i.e. the largest integer not greater than $x$.

Proof

For $x < 0$, $F_X(x) = 0$, because $X$ cannot be smaller than $0$. For $x > n$, $F_X(x) = 1$, because $X$ is always smaller than or equal to $n$. For $0 \le x \le n$: $$F_X(x) = \mathrm{P}(X \le x) = \sum_{s=0}^{\lfloor x \rfloor} p_X(s) = \sum_{s=0}^{\lfloor x \rfloor} \binom{n}{s} p^s (1-p)^{n-s}.$$

Values of $F_X(x)$ are usually computed by computer algorithms. For example, the MATLAB command

`binocdf(x,n,p)`

returns the value of the distribution function at the point `x` when the parameters of the distribution are `n` and `p`.
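For readers without MATLAB, an equivalent can be written with Python's standard library (a minimal sketch; the function name deliberately mirrors MATLAB's, but this implementation is my own):

```python
from math import comb, floor

def binocdf(x: float, n: int, p: float) -> float:
    """Distribution function of the binomial distribution,
    an analogue of MATLAB's binocdf(x, n, p)."""
    if x < 0:
        return 0.0   # X cannot be smaller than 0
    if x >= n:
        return 1.0   # X is always smaller than or equal to n
    # Sum the pmf up to floor(x):
    return sum(comb(n, s) * p**s * (1 - p)**(n - s)
               for s in range(floor(x) + 1))
```

Note that the argument `x` need not be an integer; the floor in the definition handles non-integer points.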

Below you can find some exercises with explained solutions.

Suppose you independently flip a coin $4$ times and the outcome of each toss can be either head (with probability $1/2$) or tails (also with probability $1/2$). What is the probability of obtaining exactly $2$ tails?

Solution

Denote by $X$ the number of times the outcome is tails (out of the $4$ tosses). $X$ has a binomial distribution with parameters $n = 4$ and $p = 1/2$. The probability of obtaining exactly $2$ tails can be computed from the probability mass function of $X$ as follows: $$\mathrm{P}(X = 2) = p_X(2) = \binom{4}{2} \left(\frac{1}{2}\right)^{2} \left(1 - \frac{1}{2}\right)^{4-2} = 6 \cdot \frac{1}{16} = \frac{3}{8}.$$
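This calculation is a one-liner in Python, taking $4$ tosses with tail probability $1/2$ and exactly $2$ tails as illustrative values:

```python
from math import comb

n, p, k = 4, 0.5, 2  # illustrative values: 4 tosses, tail probability 1/2, exactly 2 tails
prob = comb(n, k) * p**k * (1 - p)**(n - k)  # C(4,2) * (1/2)^4 = 6/16 = 3/8
```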

Suppose you independently throw a dart $10$ times. Each time you throw a dart, the probability of hitting the target is $3/4$. What is the probability of hitting the target less than $5$ times (out of the $10$ total times you throw a dart)?

Solution

Denote by $X$ the number of times you hit the target. $X$ has a binomial distribution with parameters $n = 10$ and $p = 3/4$. The probability of hitting the target less than $5$ times can be computed from the distribution function of $X$ as follows: $$\mathrm{P}(X < 5) = \mathrm{P}(X \le 4) = F_X(4) = \sum_{s=0}^{4} \binom{10}{s} \left(\frac{3}{4}\right)^{s} \left(\frac{1}{4}\right)^{10-s}$$ and the value of $F_X(4)$ can be calculated with a computer algorithm, for example with the MATLAB command `binocdf(4,10,3/4)`.
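The same value can be computed directly in Python, taking $10$ throws with hit probability $3/4$ and fewer than $5$ hits as illustrative values:

```python
from math import comb

n, p = 10, 0.75   # illustrative values: 10 throws, hit probability 3/4
threshold = 5     # we want P(X < 5), i.e. the cdf evaluated at 4
prob = sum(comb(n, s) * p**s * (1 - p)**(n - s) for s in range(threshold))
```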
