# Gamma distribution

The Gamma distribution is a generalization of the Chi-square distribution.

It plays a fundamental role in statistics because estimators of variance often have a Gamma distribution.

## Caveat

There are several equivalent parametrizations of the Gamma distribution.

We present one that is particularly convenient in Bayesian applications, and we discuss how it maps to alternative parametrizations.

In our presentation, a Gamma random variable $X$ has two parameters:

• the mean parameter $h$, which determines the expected value of the distribution: $\operatorname{E}[X]=h$;

• the degrees-of-freedom parameter $n$, which determines the variance of the distribution together with $h$: $\operatorname{Var}[X]=\dfrac{2h^{2}}{n}$.
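As a quick numerical illustration of this parametrization, the sketch below (parameter values are arbitrary) samples from the distribution with Python's standard library by translating $(n, h)$ into the usual shape/scale convention, shape $n/2$ and scale $2h/n$, and checks the sample mean and variance against $h$ and $2h^{2}/n$:

```python
import random
import statistics

# Draw from a Gamma distribution in the (n, h) parametrization used here,
# via the standard shape/scale parametrization: shape = n/2, scale = 2h/n.
# (The values of n and h below are illustrative.)
n, h = 4, 2.0
random.seed(0)
samples = [random.gammavariate(n / 2, 2 * h / n) for _ in range(200_000)]

mean = statistics.fmean(samples)
var = statistics.pvariance(samples)
print(mean)  # should be close to h = 2
print(var)   # should be close to 2*h**2/n = 2
```

Note that `random.gammavariate` takes the *scale* as its second argument, which is why $2h/n$ appears there.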

## How it arises

Let $Q_1,\ldots,Q_n$ be independent normal random variables with zero mean and unit variance.

The variable $$Z=\sum_{i=1}^{n}Q_i^{2}$$ has a Chi-square distribution with $n$ degrees of freedom.

If $c$ is a strictly positive constant, then the random variable $X$ defined as $$X=cZ$$ has a Gamma distribution with parameters $n$ and $h=cn$.

Therefore, a Gamma variable $X$ with parameters $n$ and $h$ can also be written as the sum of the squares of $n$ independent normals having zero mean and variance equal to $h/n$: $$X=\sum_{i=1}^{n}Y_i^{2},\qquad Y_i\sim N\!\left(0,\frac{h}{n}\right)$$

In general, the sum of $n$ independent squared normal variables that have zero mean and arbitrary variance $\sigma^{2}$ has a Gamma distribution with parameters $n$ and $h=n\sigma^{2}$.

Yet another way to see $X$ is as the sample variance of $n$ normal variables with zero mean and variance $h$: $$X=\frac{1}{n}\sum_{i=1}^{n}W_i^{2},\qquad W_i\sim N(0,h)$$
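The representation above is easy to check by simulation. A minimal sketch (with arbitrary illustrative values of $n$ and $h$) sums $n$ squared normals with variance $h/n$ and verifies that the sample mean and variance are close to $h$ and $2h^{2}/n$:

```python
import random
import statistics

# Monte Carlo check: a sum of n squared N(0, h/n) variables should have
# mean h and variance 2*h**2/n, i.e. a Gamma(n, h) distribution.
# (The values of n and h below are illustrative.)
n, h = 6, 3.0
random.seed(1)
sd = (h / n) ** 0.5  # each normal has variance h/n
draws = [sum(random.gauss(0, sd) ** 2 for _ in range(n)) for _ in range(100_000)]

mean = statistics.fmean(draws)
var = statistics.pvariance(draws)
print(mean)  # should be close to h = 3
print(var)   # should be close to 2*h**2/n = 3
```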

## Definition

Gamma random variables are characterized as follows.

Definition. Let $X$ be a continuous random variable. Let its support be the set of positive real numbers: $$R_X=(0,\infty)$$ Let $n,h>0$. We say that $X$ has a Gamma distribution with parameters $n$ and $h$ if and only if its probability density function is $$f_X(x)=\begin{cases}c\,x^{n/2-1}\exp\!\left(-\dfrac{n}{2h}x\right)&\text{if }x\in R_X\\0&\text{otherwise}\end{cases}$$ where $c$ is a constant: $$c=\frac{\left(n/(2h)\right)^{n/2}}{\Gamma(n/2)}$$ and $\Gamma(\cdot)$ is the Gamma function.
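As a sanity check on the definition, the following sketch codes the density with the constant $c$ given above and verifies numerically that it integrates to (approximately) one. The parameter values are illustrative:

```python
import math

# The density from the definition: f(x) = c * x**(n/2 - 1) * exp(-n*x/(2h)),
# with c = (n/(2h))**(n/2) / Gamma(n/2).  Values of n and h are illustrative.
n, h = 5, 2.0

def gamma_pdf(x, n, h):
    c = (n / (2 * h)) ** (n / 2) / math.gamma(n / 2)
    return c * x ** (n / 2 - 1) * math.exp(-n * x / (2 * h))

# Crude Riemann-sum check that the density integrates to 1 over (0, 40).
dx = 0.001
total = sum(gamma_pdf(i * dx, n, h) * dx for i in range(1, 40_000))
print(total)  # should be close to 1
```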

To better understand the Gamma distribution, you can have a look at its density plots.

## Alternative parametrizations

Here we discuss two alternative parametrizations reported on Wikipedia. You can safely skip this section on a first reading.

The first alternative parametrization is obtained by setting $k=n/2$ (the shape) and $\theta=2h/n$ (the scale), under which:

• the density on the support is $$f_X(x)=\frac{1}{\Gamma(k)\,\theta^{k}}\,x^{k-1}e^{-x/\theta}$$

• the mean is $$\operatorname{E}[X]=k\theta$$

• the variance is $$\operatorname{Var}[X]=k\theta^{2}$$

The second alternative parametrization is obtained by setting $\alpha=n/2$ (the shape) and $\beta=n/(2h)$ (the rate), under which:

• the density on the support is $$f_X(x)=\frac{\beta^{\alpha}}{\Gamma(\alpha)}\,x^{\alpha-1}e^{-\beta x}$$

• the mean is $$\operatorname{E}[X]=\frac{\alpha}{\beta}$$

• the variance is $$\operatorname{Var}[X]=\frac{\alpha}{\beta^{2}}$$
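The mappings between parametrizations can be verified numerically. The sketch below (illustrative parameter values) evaluates the three densities at the same point and confirms that they coincide:

```python
import math

# Check that the three parametrizations give the same density at a test point.
# Our parametrization: n (degrees of freedom), h (mean); values illustrative.
n, h = 4, 2.0
k, theta = n / 2, 2 * h / n       # shape/scale
alpha, beta = n / 2, n / (2 * h)  # shape/rate

def pdf_ours(x):
    c = (n / (2 * h)) ** (n / 2) / math.gamma(n / 2)
    return c * x ** (n / 2 - 1) * math.exp(-n * x / (2 * h))

def pdf_shape_scale(x):
    return x ** (k - 1) * math.exp(-x / theta) / (math.gamma(k) * theta ** k)

def pdf_shape_rate(x):
    return beta ** alpha * x ** (alpha - 1) * math.exp(-beta * x) / math.gamma(alpha)

x = 1.7
print(pdf_ours(x), pdf_shape_scale(x), pdf_shape_rate(x))  # all three agree
# Means: h, k*theta and alpha/beta all equal 2 here.
```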

Although these two parametrizations yield more compact expressions for the density, the one we present often generates more readable results when it is used in Bayesian statistics and in variance estimation.

## Expected value

The expected value of a Gamma random variable $X$ is $$\operatorname{E}[X]=h$$

Proof

The mean can be derived as follows: $$\operatorname{E}[X]=\int_{0}^{\infty}x\,c\,x^{n/2-1}\exp\!\left(-\frac{n}{2h}x\right)dx=c\,\frac{\Gamma(n/2+1)}{(n/(2h))^{n/2+1}}=\frac{\Gamma(n/2+1)}{\Gamma(n/2)}\cdot\frac{2h}{n}=\frac{n}{2}\cdot\frac{2h}{n}=h$$ where we have used the integral $\int_{0}^{\infty}x^{s-1}e^{-\beta x}dx=\Gamma(s)/\beta^{s}$ and the property $\Gamma(s+1)=s\,\Gamma(s)$.

## Variance

The variance of a Gamma random variable $X$ is $$\operatorname{Var}[X]=\frac{2h^{2}}{n}$$

Proof

It can be derived thanks to the usual variance formula ($\operatorname{Var}[X]=\operatorname{E}[X^{2}]-\operatorname{E}[X]^{2}$): $$\operatorname{E}[X^{2}]=\int_{0}^{\infty}x^{2}\,c\,x^{n/2-1}\exp\!\left(-\frac{n}{2h}x\right)dx=c\,\frac{\Gamma(n/2+2)}{(n/(2h))^{n/2+2}}=\frac{n}{2}\left(\frac{n}{2}+1\right)\left(\frac{2h}{n}\right)^{2}=\frac{(n+2)h^{2}}{n}$$ so that $$\operatorname{Var}[X]=\frac{(n+2)h^{2}}{n}-h^{2}=\frac{2h^{2}}{n}$$
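Both moments can be double-checked by numerical quadrature against the closed forms $\operatorname{E}[X]=h$ and $\operatorname{Var}[X]=2h^{2}/n$ (parameter values are illustrative):

```python
import math

# Numerical check of E[X] = h and Var[X] = 2*h**2/n by a crude Riemann sum.
# The values of n and h below are illustrative.
n, h = 6, 1.5

def pdf(x):
    c = (n / (2 * h)) ** (n / 2) / math.gamma(n / 2)
    return c * x ** (n / 2 - 1) * math.exp(-n * x / (2 * h))

dx = 0.0005
xs = [i * dx for i in range(1, 60_000)]   # integrate over (0, 30)
mean = sum(x * pdf(x) * dx for x in xs)
second = sum(x * x * pdf(x) * dx for x in xs)
var = second - mean ** 2
print(mean)  # should be close to h = 1.5
print(var)   # should be close to 2*h**2/n = 0.75
```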

## Moment generating function

The moment generating function of a Gamma random variable $X$ is defined for any $t<\dfrac{n}{2h}$: $$M_X(t)=\left(1-\frac{2h}{n}t\right)^{-n/2}$$

Proof

By using the definition of moment generating function, we obtain $$M_X(t)=\operatorname{E}\!\left[e^{tX}\right]=\int_{0}^{\infty}e^{tx}\,c\,x^{n/2-1}\exp\!\left(-\frac{n}{2h}x\right)dx=c\int_{0}^{\infty}x^{n/2-1}\exp\!\left(-\frac{n}{2h'}x\right)dx$$ where $$h'=\frac{h}{1-\dfrac{2h}{n}t}$$ The last integral equals $1/c'$, where $$c'=\frac{(n/(2h'))^{n/2}}{\Gamma(n/2)}$$ because it is the integral of the probability density function of a Gamma random variable with parameters $n$ and $h'$. Thus, $$M_X(t)=\frac{c}{c'}=\left(\frac{h'}{h}\right)^{n/2}=\left(1-\frac{2h}{n}t\right)^{-n/2}$$ Of course, the above integrals converge only if $\dfrac{n}{2h}-t>0$, i.e. only if $t<\dfrac{n}{2h}$. Therefore, the moment generating function of a Gamma random variable exists for all $t<\dfrac{n}{2h}$.
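The closed form can be compared with a direct numerical evaluation of $\operatorname{E}[e^{tX}]$ for an admissible $t<n/(2h)$ (the values of $n$, $h$ and $t$ below are illustrative):

```python
import math

# Check M(t) = (1 - 2*h*t/n)**(-n/2) for t < n/(2h) against a numerical
# integral of exp(t*x) times the density.  Illustrative n, h, t.
n, h, t = 4, 2.0, 0.2  # here n/(2h) = 1, so t = 0.2 is admissible

def pdf(x):
    c = (n / (2 * h)) ** (n / 2) / math.gamma(n / 2)
    return c * x ** (n / 2 - 1) * math.exp(-n * x / (2 * h))

dx = 0.0005
mgf_numeric = sum(math.exp(t * i * dx) * pdf(i * dx) * dx
                  for i in range(1, 120_000))  # integrate over (0, 60)
mgf_closed = (1 - 2 * h * t / n) ** (-n / 2)
print(mgf_numeric, mgf_closed)  # the two values agree
```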

## Characteristic function

The characteristic function of a Gamma random variable $X$ is $$\varphi_X(t)=\left(1-\frac{2h}{n}it\right)^{-n/2}$$

Proof

It can be derived by using the definition of characteristic function and a Taylor series expansion; since the moment generating function exists in a neighborhood of zero, the characteristic function can be obtained by evaluating it at imaginary arguments: $$\varphi_X(t)=\operatorname{E}\!\left[e^{itX}\right]=M_X(it)=\left(1-\frac{2h}{n}it\right)^{-n/2}$$

## Distribution function

The distribution function of a Gamma random variable $X$ is $$F_X(x)=\begin{cases}0&\text{if }x\le 0\\\dfrac{\gamma(n/2,\,nx/(2h))}{\Gamma(n/2)}&\text{if }x>0\end{cases}$$ where the function $$\gamma(s,x)=\int_{0}^{x}t^{s-1}e^{-t}\,dt$$ is called lower incomplete Gamma function and is usually evaluated using specialized computer algorithms.
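In practice the lower incomplete Gamma function is computed by specialized library routines; the sketch below uses a simple (not production-grade) series expansion to evaluate the cdf, and checks it against the special case $n=2$, $h=1$, for which the distribution is Exponential with rate $1$ and $F(x)=1-e^{-x}$:

```python
import math

# F(x) = gamma_lower(n/2, n*x/(2h)) / Gamma(n/2) for x > 0.  A simple series
# for the lower incomplete Gamma function, adequate for moderate arguments:
# gamma_lower(s, x) = x**s * exp(-x) * sum_{k>=0} x**k / (s*(s+1)*...*(s+k)).
def gamma_lower(s, x, terms=200):
    total, term = 0.0, 1.0 / s
    for k in range(terms):
        total += term
        term *= x / (s + k + 1)
    return x ** s * math.exp(-x) * total

def gamma_cdf(x, n, h):
    # valid for x > 0
    return gamma_lower(n / 2, n * x / (2 * h)) / math.gamma(n / 2)

n, h = 2, 1.0  # illustrative; with n = 2, h = 1 this is an Exponential(1)
print(gamma_cdf(1.0, n, h))  # should equal 1 - exp(-1)
```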

Proof

This is proved as follows: $$F_X(x)=\int_{0}^{x}c\,t^{n/2-1}\exp\!\left(-\frac{n}{2h}t\right)dt=\frac{1}{\Gamma(n/2)}\int_{0}^{nx/(2h)}s^{n/2-1}e^{-s}\,ds=\frac{\gamma(n/2,\,nx/(2h))}{\Gamma(n/2)}$$ where the second equality follows from the change of variable $s=\dfrac{n}{2h}t$.

## More details

In the following subsections you can find more details about the Gamma distribution.

### The Gamma distribution is a scaled Chi-square distribution

If a variable $X$ has the Gamma distribution with parameters $n$ and $h$, then $$X=\frac{h}{n}Z$$ where $Z$ has a Chi-square distribution with $n$ degrees of freedom.

Proof

For notational simplicity, denote $\frac{h}{n}Z$ by $X$ in what follows. Note that $X=g(Z)=\frac{h}{n}Z$ is a strictly increasing function of $Z$, since $\frac{h}{n}$ is strictly positive. Therefore, we can use the formula for the density of an increasing function of a continuous variable: $$f_X(x)=f_Z\!\left(g^{-1}(x)\right)\frac{dg^{-1}(x)}{dx}=f_Z\!\left(\frac{n}{h}x\right)\frac{n}{h}$$ The density function of a Chi-square random variable with $n$ degrees of freedom is $$f_Z(z)=c_Z\,z^{n/2-1}e^{-z/2}\qquad(z>0)$$ where $$c_Z=\frac{1}{2^{n/2}\,\Gamma(n/2)}$$ Therefore, $$f_X(x)=\frac{n}{h}\cdot\frac{1}{2^{n/2}\,\Gamma(n/2)}\left(\frac{n}{h}x\right)^{n/2-1}\exp\!\left(-\frac{n}{2h}x\right)=\frac{(n/(2h))^{n/2}}{\Gamma(n/2)}\,x^{n/2-1}\exp\!\left(-\frac{n}{2h}x\right)$$ which is the density of a Gamma distribution with parameters $n$ and $h$.

Thus, the Chi-square distribution is a special case of the Gamma distribution because, when $h=n$, we have $$X=\frac{h}{n}Z=Z$$

In other words, a Gamma distribution with parameters $n$ and $h=n$ is just a Chi-square distribution with $n$ degrees of freedom.
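This special case is easy to verify numerically: with $h=n$, the Gamma density above coincides with the Chi-square density $\frac{1}{2^{n/2}\Gamma(n/2)}x^{n/2-1}e^{-x/2}$ (the value of $n$ below is illustrative):

```python
import math

# With h = n, the Gamma(n, h) density reduces to the Chi-square density
# with n degrees of freedom.  The value of n is illustrative.
n = 5

def gamma_pdf(x, n, h):
    c = (n / (2 * h)) ** (n / 2) / math.gamma(n / 2)
    return c * x ** (n / 2 - 1) * math.exp(-n * x / (2 * h))

def chi2_pdf(x, n):
    return x ** (n / 2 - 1) * math.exp(-x / 2) / (2 ** (n / 2) * math.gamma(n / 2))

x = 3.3
print(gamma_pdf(x, n, n), chi2_pdf(x, n))  # the two values are identical
```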

### A Gamma random variable times a strictly positive constant is a Gamma random variable

By multiplying a Gamma random variable by a strictly positive constant, one obtains another Gamma random variable.

If $X$ is a Gamma random variable with parameters $n$ and $h$, and $c$ is a strictly positive constant, then the random variable $Y$ defined as $$Y=cX$$ has a Gamma distribution with parameters $n$ and $ch$.

Proof

This can be easily seen using the result from the previous subsection: $$X=\frac{h}{n}Z$$ where $Z$ has a Chi-square distribution with $n$ degrees of freedom. Therefore, $$Y=cX=\frac{ch}{n}Z$$ In other words, $Y$ is equal to a Chi-square random variable with $n$ degrees of freedom, divided by $n$ and multiplied by $ch$. Therefore, it has a Gamma distribution with parameters $n$ and $ch$.
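A quick simulation check of the scaling property (the values of $n$, $h$ and $c$ below are illustrative): the scaled samples should have mean $ch$ and variance $2(ch)^{2}/n$, exactly as a Gamma variable with parameters $n$ and $ch$:

```python
import random
import statistics

# If X ~ Gamma(n, h), then c*X should behave like Gamma(n, c*h):
# mean c*h and variance 2*(c*h)**2/n.  Illustrative parameter values.
n, h, c = 4, 2.0, 3.0
random.seed(2)
scaled = [c * random.gammavariate(n / 2, 2 * h / n) for _ in range(200_000)]

print(statistics.fmean(scaled))      # should be close to c*h = 6
print(statistics.pvariance(scaled))  # should be close to 2*(c*h)**2/n = 18
```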

### A Gamma random variable is a sum of squared normal random variables

In the lecture on the Chi-square distribution, we have explained that a Chi-square random variable $Z$ with $n$ degrees of freedom ($n$ integer) can be written as a sum of squares of $n$ independent normal random variables $Q_1$, ..., $Q_n$ having mean $0$ and variance $1$: $$Z=\sum_{i=1}^{n}Q_i^{2}$$

In the previous subsections we have seen that a variable $X$ having a Gamma distribution with parameters $n$ and $h$ can be written as $$X=\frac{h}{n}Z$$ where $Z$ has a Chi-square distribution with $n$ degrees of freedom.

Putting these two things together, we obtain $$X=\frac{h}{n}\sum_{i=1}^{n}Q_i^{2}=\sum_{i=1}^{n}Y_i^{2}$$ where we have defined $$Y_i=\sqrt{\frac{h}{n}}\,Q_i$$

But the variables $Y_i$ are normal random variables with mean $0$ and variance $\frac{h}{n}$.

Therefore, a Gamma random variable with parameters $n$ and $h$ can be seen as a sum of the squares of $n$ independent normal random variables having zero mean and variance $\frac{h}{n}$.

## Density plots

We now present some plots that help us to understand how the shape of the Gamma distribution changes when its parameters are changed.

### Plot 1 - Same mean but different degrees of freedom

The following plot contains two lines:

• the first one (red) is the pdf of a Gamma random variable with degrees of freedom $n=n_1$ and mean $h$;

• the second one (blue) is obtained by setting $n=n_2>n_1$ and keeping the mean equal to $h$.

Because $\operatorname{E}[X]=h$ in both cases, the two distributions have the same mean.

However, by increasing $n$ from $n_1$ to $n_2$, the shape of the distribution changes. The more we increase the degrees of freedom, the more the pdf resembles that of a normal distribution.

The thin vertical lines indicate the means of the two distributions.

### Plot 2 - Different means but same number of degrees of freedom

In this plot:

• the first line (red) is the pdf of a Gamma random variable with degrees of freedom $n$ and mean $h=h_1$;

• the second one (blue) is obtained by setting $h=h_2>h_1$ and keeping the degrees of freedom equal to $n$.

Increasing the parameter $h$ changes the mean of the distribution from $h_1$ to $h_2$.

However, the two distributions have the same number of degrees of freedom ($n$). Therefore, they have the same shape: one is the "stretched" version of the other, and it would look exactly the same on a different scale.

## Solved exercises

Below you can find some exercises with explained solutions.

### Exercise 1

Let $X_1$ and $X_2$ be two independent Chi-square random variables having $n_1$ and $n_2$ degrees of freedom, respectively.

Consider the following random variables, where $c$ is a strictly positive constant: $$Y_1=cX_1\qquad Y_2=cX_2\qquad Y_3=c\,(X_1+X_2)$$

What distribution do they have?

Solution

Being multiples of Chi-square random variables, the variables $Y_1$, $Y_2$ and $Y_3$ all have a Gamma distribution.

The random variable $X_1$ has $n_1$ degrees of freedom, and the random variable $Y_1$ can be written as $$Y_1=\frac{h_1}{n_1}X_1$$ where $h_1=cn_1$. Therefore, $Y_1$ has a Gamma distribution with parameters $n_1$ and $cn_1$.

The random variable $X_2$ has $n_2$ degrees of freedom, and the random variable $Y_2$ can be written as $$Y_2=\frac{h_2}{n_2}X_2$$ where $h_2=cn_2$. Therefore, $Y_2$ has a Gamma distribution with parameters $n_2$ and $cn_2$.

The random variable $X_1+X_2$ has a Chi-square distribution with $n_1+n_2$ degrees of freedom, because $X_1$ and $X_2$ are independent (see the lecture on the Chi-square distribution), and the random variable $Y_3$ can be written as $$Y_3=\frac{h_3}{n_1+n_2}(X_1+X_2)$$ where $h_3=c\,(n_1+n_2)$. Therefore, $Y_3$ has a Gamma distribution with parameters $n_1+n_2$ and $c\,(n_1+n_2)$.

### Exercise 2

Let $X$ be a random variable having a Gamma distribution with parameters $n$ and $h$.

Define the following random variables, where $c_1$ and $c_2$ are strictly positive constants: $$Y_1=c_1X\qquad Y_2=c_2X\qquad Y_3=\frac{n}{h}X$$

What distribution do these variables have?

Solution

Multiplying a Gamma random variable by a strictly positive constant, one still obtains a Gamma random variable. In particular, the random variable $Y_1=c_1X$ is a Gamma random variable with parameters $n$ and $c_1h$. The random variable $Y_2=c_2X$ is a Gamma random variable with parameters $n$ and $c_2h$. The random variable $Y_3=\frac{n}{h}X$ is a Gamma random variable with parameters $n$ and $n$. The random variable $Y_3$ is also a Chi-square random variable with $n$ degrees of freedom (remember that a Gamma random variable with parameters $n$ and $h$ is also a Chi-square random variable when $h=n$).

### Exercise 3

Let $X_1$, $X_2$ and $X_3$ be mutually independent normal random variables having mean $0$ and variance $\sigma^{2}$.

Consider the random variable $$Z=X_1^{2}+X_2^{2}+X_3^{2}$$

What distribution does $Z$ have?

Solution

The random variable $Z$ can be written as $$Z=\sigma^{2}\sum_{i=1}^{3}Q_i^{2}$$ where $Q_i=X_i/\sigma$, and $Q_1$, $Q_2$ and $Q_3$ are mutually independent standard normal random variables. The sum $\sum_{i=1}^{3}Q_i^{2}$ has a Chi-square distribution with $3$ degrees of freedom (see the lecture entitled Chi-square distribution). Since $$Z=\frac{h}{3}\sum_{i=1}^{3}Q_i^{2}$$ with $h=3\sigma^{2}$, the variable $Z$ has a Gamma distribution with parameters $n=3$ and $h=3\sigma^{2}$.
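The conclusion of this exercise can be checked by simulation (the value of $\sigma$ below is illustrative): with $n=3$ and $h=3\sigma^{2}$, the sample mean and variance of $Z$ should be close to $h$ and $2h^{2}/3$:

```python
import random
import statistics

# Exercise 3 check: with sigma = 2 (illustrative), Z = X1^2 + X2^2 + X3^2
# should have a Gamma distribution with n = 3 and mean h = 3*sigma**2 = 12.
sigma = 2.0
random.seed(3)
zs = [sum(random.gauss(0, sigma) ** 2 for _ in range(3)) for _ in range(100_000)]

print(statistics.fmean(zs))      # should be close to h = 12
print(statistics.pvariance(zs))  # should be close to 2*h**2/3 = 96
```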