
Cumulant generating function

by Marco Taboga, PhD

The cumulant generating function of a random variable is the natural logarithm of its moment generating function.

The cumulant generating function is often used because it facilitates some calculations. In particular, its derivatives at zero, called cumulants, have interesting relations with moments and central moments.


Review of mgf

Remember that the moment generating function (mgf) of a random variable X is defined as $M_{X}(t)=\mathrm{E}\left[e^{tX}\right]$, provided that the expected value exists and is finite for all $t$ belonging to a closed interval $[-h,h]$, with $h>0$.

The mgf has the property that its derivatives at zero are equal to the moments of X: $\left.\frac{d^{n}M_{X}(t)}{dt^{n}}\right|_{t=0}=\mathrm{E}\left[X^{n}\right]$.

The existence of the mgf guarantees that the moments (hence the derivatives at zero) exist and are finite for every $n$.
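As an illustration (not part of the original text), this property can be checked symbolically for an exponential distribution, whose mgf is $M_{X}(t)=\lambda /(\lambda -t)$ for $t<\lambda$ and whose moments are $\mathrm{E}\left[X^{n}\right]=n!/\lambda ^{n}$; a minimal sketch using sympy:

```python
import sympy as sp

t = sp.symbols("t", real=True)
lam = sp.symbols("lambda", positive=True)

# mgf of an Exponential(lambda) random variable: M(t) = lambda / (lambda - t), valid for t < lambda
M = lam / (lam - t)

# the n-th derivative at t = 0 should equal the n-th moment E[X^n] = n! / lambda^n
for n in range(1, 5):
    nth_moment = sp.diff(M, t, n).subs(t, 0)
    assert sp.simplify(nth_moment - sp.factorial(n) / lam**n) == 0
```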

Definition

The cumulant generating function (cgf) is defined as follows.

Definition Suppose that a random variable X possesses a moment generating function $M_{X}(t)$. Then, the function $K_{X}(t)=\ln \left[ M_{X}(t)\right]$ is the cumulant generating function of X.

Note that the cgf is well-defined since $M_{X}(t)$ is strictly positive for any $t$.
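As an example (added for illustration), if X is normal with mean $\mu$ and variance $\sigma ^{2}$, then $M_{X}(t)=\exp \left( \mu t+\frac{1}{2}\sigma ^{2}t^{2}\right)$, so that $K_{X}(t)=\mu t+\frac{1}{2}\sigma ^{2}t^{2}$. Taking the logarithm turns the exponential into a simple polynomial, which is one reason the cgf facilitates calculations.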

Since the mgf completely characterizes the distribution of a random variable and the natural logarithm is a one-to-one function, the cumulant generating function also completely characterizes the distribution.

Cumulants

The derivatives of the cgf at zero are called cumulants.

The n-th cumulant is $\kappa _{n}=\left.\frac{d^{n}K_{X}(t)}{dt^{n}}\right|_{t=0}$.
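For example (an illustrative sketch, not from the original article), a Poisson distribution with parameter $\lambda$ has mgf $M_{X}(t)=\exp \left[ \lambda \left( e^{t}-1\right) \right]$, so its cgf is $K_{X}(t)=\lambda \left( e^{t}-1\right)$ and every cumulant equals $\lambda$. This can be verified with sympy:

```python
import sympy as sp

t = sp.symbols("t", real=True)
lam = sp.symbols("lambda", positive=True)

# cgf of a Poisson(lambda): K(t) = ln E[e^(t X)] = lambda * (exp(t) - 1)
K = sp.log(sp.exp(lam * (sp.exp(t) - 1)))

# every cumulant of the Poisson distribution equals lambda
cumulants = [sp.diff(K, t, n).subs(t, 0) for n in range(1, 5)]
assert all(sp.simplify(c - lam) == 0 for c in cumulants)
```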

First three cumulants

The first cumulant is equal to the expected value: $\kappa _{1}=\mathrm{E}\left[ X\right]$.

Proof

The first derivative of the cgf is $\frac{dK_{X}(t)}{dt}=\frac{M_{X}^{\prime }(t)}{M_{X}(t)}$. Since $M_{X}(0)=\mathrm{E}\left[ e^{0}\right] =1$ and $M_{X}^{\prime }(0)=\mathrm{E}\left[ X\right]$, we have $\kappa _{1}=\left.\frac{dK_{X}(t)}{dt}\right|_{t=0}=\mathrm{E}\left[ X\right]$.

The second cumulant is equal to the variance: $\kappa _{2}=\mathrm{Var}\left[ X\right]$.

Proof

The second derivative of the cgf is $\frac{d^{2}K_{X}(t)}{dt^{2}}=\frac{M_{X}^{\prime \prime }(t)M_{X}(t)-\left[ M_{X}^{\prime }(t)\right] ^{2}}{\left[ M_{X}(t)\right] ^{2}}$. When we evaluate it at $t=0$, we get $\kappa _{2}=\mathrm{E}\left[ X^{2}\right] -\left( \mathrm{E}\left[ X\right] \right) ^{2}=\mathrm{Var}\left[ X\right]$.

The third cumulant is equal to the third central moment: $\kappa _{3}=\mathrm{E}\left[ \left( X-\mathrm{E}\left[ X\right] \right) ^{3}\right]$.

Proof

The third derivative of the cgf is $\frac{d^{3}K_{X}(t)}{dt^{3}}=\frac{M_{X}^{\prime \prime \prime }(t)}{M_{X}(t)}-3\frac{M_{X}^{\prime \prime }(t)M_{X}^{\prime }(t)}{\left[ M_{X}(t)\right] ^{2}}+2\frac{\left[ M_{X}^{\prime }(t)\right] ^{3}}{\left[ M_{X}(t)\right] ^{3}}$. When we evaluate it at $t=0$, we get $\kappa _{3}=\mathrm{E}\left[ X^{3}\right] -3\,\mathrm{E}\left[ X^{2}\right] \mathrm{E}\left[ X\right] +2\left( \mathrm{E}\left[ X\right] \right) ^{3}$. But $\mathrm{E}\left[ \left( X-\mathrm{E}\left[ X\right] \right) ^{3}\right] =\mathrm{E}\left[ X^{3}\right] -3\,\mathrm{E}\left[ X^{2}\right] \mathrm{E}\left[ X\right] +2\left( \mathrm{E}\left[ X\right] \right) ^{3}$. Therefore, $\kappa _{3}=\mathrm{E}\left[ \left( X-\mathrm{E}\left[ X\right] \right) ^{3}\right]$.
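As a sanity check (an illustrative sketch, not from the original), the identity $\kappa _{3}=\mathrm{E}\left[ X^{3}\right] -3\,\mathrm{E}\left[ X^{2}\right] \mathrm{E}\left[ X\right] +2\left( \mathrm{E}\left[ X\right] \right) ^{3}$ can be verified symbolically for a Gamma distribution with shape $\alpha$ and rate $\beta$, whose mgf is $\left( 1-t/\beta \right) ^{-\alpha }$:

```python
import sympy as sp

t = sp.symbols("t", real=True)
a, b = sp.symbols("alpha beta", positive=True)

# mgf and cgf of a Gamma(alpha, rate beta) random variable, valid for t < beta
M = (1 - t / b) ** (-a)
K = sp.log(M)

# raw moments E[X], E[X^2], E[X^3] from the mgf
m1, m2, m3 = [sp.diff(M, t, n).subs(t, 0) for n in (1, 2, 3)]

# third cumulant from the cgf
k3 = sp.diff(K, t, 3).subs(t, 0)

# k3 equals the third central moment E[X^3] - 3*E[X^2]*E[X] + 2*E[X]^3
assert sp.simplify(k3 - (m3 - 3 * m1 * m2 + 2 * m1**3)) == 0
```

For the Gamma distribution both sides reduce to $2\alpha /\beta ^{3}$, which makes the symbolic comparison straightforward.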

The relation of higher cumulants to moments and central moments is more complicated.

Multivariate version

When X is a $K\times 1$ random vector, the joint moment generating function of X is defined as $M_{X}(t)=\mathrm{E}\left[ e^{t^{\intercal }X}\right]$, provided that the expected value exists and is finite for all $K\times 1$ real vectors $t$ belonging to a closed rectangle $H=\left[ -h_{1},h_{1}\right] \times \cdots \times \left[ -h_{K},h_{K}\right]$, with $h_{i}>0$ for all $i=1,\ldots ,K$.

The joint mgf has the property that $\left.\frac{\partial ^{n_{1}+\cdots +n_{K}}M_{X}(t)}{\partial t_{1}^{n_{1}}\cdots \partial t_{K}^{n_{K}}}\right|_{t=0}=\mathrm{E}\left[ X_{1}^{n_{1}}\cdots X_{K}^{n_{K}}\right]$.

The existence of the mgf guarantees the existence and finiteness of the cross-moments on the left-hand side of the equation.

Having reviewed the basic properties of the joint mgf, we are ready to define the joint cumulant generating function.

Definition Suppose that a random vector X possesses a joint moment generating function $M_{X}(t)$. Then, the function $K_{X}(t)=\ln \left[ M_{X}(t)\right]$ is the joint cumulant generating function of X.

The definition is essentially the same as that given for random variables.

Cross-cumulants

The partial derivatives of the joint cgf at zero are called cross-cumulants (or joint cumulants).

Cross-cumulants are denoted as follows: $\kappa _{n_{1},\ldots ,n_{K}}=\left.\frac{\partial ^{n_{1}+\cdots +n_{K}}K_{X}(t)}{\partial t_{1}^{n_{1}}\cdots \partial t_{K}^{n_{K}}}\right|_{t=0}$.

Important cross-cumulants

First-order cross-cumulants are equal to the expected values of the entries of X: $\left.\frac{\partial K_{X}(t)}{\partial t_{k}}\right|_{t=0}=\mathrm{E}\left[ X_{k}\right]$.

Proof

The first partial derivative of the joint cgf with respect to $t_{k}$ is $\frac{\partial K_{X}(t)}{\partial t_{k}}=\frac{1}{M_{X}(t)}\frac{\partial M_{X}(t)}{\partial t_{k}}$. Since $M_{X}(0)=1$ and $\left.\frac{\partial M_{X}(t)}{\partial t_{k}}\right|_{t=0}=\mathrm{E}\left[ X_{k}\right]$, we have $\left.\frac{\partial K_{X}(t)}{\partial t_{k}}\right|_{t=0}=\mathrm{E}\left[ X_{k}\right]$.

Second-order joint cumulants are equal to the covariances between the entries of X: $\left.\frac{\partial ^{2}K_{X}(t)}{\partial t_{m}\partial t_{k}}\right|_{t=0}=\mathrm{Cov}\left[ X_{m},X_{k}\right]$.

Proof

The first partial derivative of the joint cgf with respect to $t_{m}$ is $\frac{\partial K_{X}(t)}{\partial t_{m}}=\frac{1}{M_{X}(t)}\frac{\partial M_{X}(t)}{\partial t_{m}}$. By taking the derivative with respect to $t_{k}$, we obtain $\frac{\partial ^{2}K_{X}(t)}{\partial t_{k}\partial t_{m}}=\frac{1}{M_{X}(t)}\frac{\partial ^{2}M_{X}(t)}{\partial t_{k}\partial t_{m}}-\frac{1}{\left[ M_{X}(t)\right] ^{2}}\frac{\partial M_{X}(t)}{\partial t_{m}}\frac{\partial M_{X}(t)}{\partial t_{k}}$. Since $M_{X}(0)=1$, evaluating at $t=0$ we have $\left.\frac{\partial ^{2}K_{X}(t)}{\partial t_{k}\partial t_{m}}\right|_{t=0}=\mathrm{E}\left[ X_{k}X_{m}\right] -\mathrm{E}\left[ X_{k}\right] \mathrm{E}\left[ X_{m}\right] =\mathrm{Cov}\left[ X_{k},X_{m}\right]$.
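To illustrate (a sketch under assumed notation, not part of the original article), for a bivariate normal vector the joint cgf is the polynomial $K_{X}(t)=\mu ^{\intercal }t+\frac{1}{2}t^{\intercal }\Sigma t$, so the partial derivatives at zero recover the means and the covariance directly:

```python
import sympy as sp

t1, t2 = sp.symbols("t1 t2", real=True)
mu1, mu2, s11, s12, s22 = sp.symbols("mu1 mu2 sigma11 sigma12 sigma22", real=True)

# joint cgf of a bivariate normal: K(t) = mu't + (1/2) t' Sigma t
K = mu1 * t1 + mu2 * t2 + sp.Rational(1, 2) * (s11 * t1**2 + 2 * s12 * t1 * t2 + s22 * t2**2)

# first-order cross-cumulants: the means
assert sp.diff(K, t1).subs({t1: 0, t2: 0}) == mu1
assert sp.diff(K, t2).subs({t1: 0, t2: 0}) == mu2

# second-order cross-cumulant: the covariance sigma12
assert sp.diff(K, t1, t2).subs({t1: 0, t2: 0}) == s12
```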

How to cite

Please cite as:

Taboga, Marco (2021). "Cumulant generating function", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/fundamentals-of-probability/cumulant-generating-function.
