The characteristic function (cf) is a complex-valued function that completely characterizes the distribution of a random variable.
The use of the characteristic function is almost identical to that of the moment generating function:
it can be used to easily derive the moments of a random variable;
it uniquely determines the associated probability distribution and is therefore often used to prove that two distributions are equal.
The cf has an important advantage over the moment generating function: while some random variables do not possess the latter, all random variables have a characteristic function.
We start this lecture with the definition of the characteristic function.
Definition Let $X$ be a random variable. Let $i=\sqrt{-1}$ be the imaginary unit. The function $\varphi_X:\mathbb{R}\to\mathbb{C}$ defined by
$$\varphi_X(t)=\mathrm{E}\left[e^{itX}\right]$$
is called the characteristic function of $X$.
The first thing to be noted is that $\varphi_X(t)$ exists for any $t\in\mathbb{R}$. This can be proved as follows:
$$\varphi_X(t)=\mathrm{E}\left[e^{itX}\right]=\mathrm{E}\left[\cos(tX)+i\sin(tX)\right]=\mathrm{E}\left[\cos(tX)\right]+i\,\mathrm{E}\left[\sin(tX)\right]$$
and the last two expected values are well-defined, because the sine and cosine functions are bounded in the interval $[-1,1]$.
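To make the existence argument concrete, here is a minimal Python sketch (an addition to the text, not part of the lecture) that estimates the two expected values by Monte Carlo for a standard normal variable, whose cf is known to be $e^{-t^2/2}$; the choice of distribution, the sample size, and the point $t=1.5$ are assumptions made purely for this illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)  # draws from N(0, 1), chosen for illustration

t = 1.5
# phi_X(t) = E[cos(tX)] + i E[sin(tX)]; both expectations exist because
# cos and sin are bounded in [-1, 1]
phi_estimate = np.mean(np.cos(t * x)) + 1j * np.mean(np.sin(t * x))
phi_exact = np.exp(-t**2 / 2)  # known cf of the standard normal

print(phi_estimate)  # close to phi_exact, with imaginary part near 0
print(phi_exact)
```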
Like the moment generating function of a random variable, the characteristic function can be used to derive the moments of $X$, as stated in the following proposition.
Proposition Let $X$ be a random variable and $\varphi_X(t)$ its cf. Let $n\in\mathbb{N}$. If the $n$-th moment of $X$, denoted by $\mu_n=\mathrm{E}\left[X^n\right]$, exists and is finite, then $\varphi_X(t)$ is $n$ times continuously differentiable and
$$\mu_n=\frac{1}{i^n}\left.\frac{d^n\varphi_X(t)}{dt^n}\right|_{t=0}$$
where $\left.\frac{d^n\varphi_X(t)}{dt^n}\right|_{t=0}$ is the $n$-th derivative of $\varphi_X(t)$ with respect to $t$, evaluated at the point $t=0$.
The proof of this proposition is quite complex (see, e.g., Resnick 2013) and we give here only a sketch, without taking technical details into consideration. By virtue of the linearity of the expected value and of the derivative operator, the derivative can be brought inside the expected value, as follows:
$$\frac{d^n\varphi_X(t)}{dt^n}=\frac{d^n}{dt^n}\mathrm{E}\left[e^{itX}\right]=\mathrm{E}\left[\frac{d^n}{dt^n}e^{itX}\right]=\mathrm{E}\left[(iX)^n e^{itX}\right]=i^n\,\mathrm{E}\left[X^n e^{itX}\right]$$
When $t=0$, the latter becomes
$$\left.\frac{d^n\varphi_X(t)}{dt^n}\right|_{t=0}=i^n\,\mathrm{E}\left[X^n\right]$$
In practice, the proposition above is not very useful when one wants to compute a moment, because it requires knowing in advance whether the moment exists.
A much more useful proposition is the following.
Proposition Let $X$ be a random variable and $\varphi_X(t)$ its characteristic function. If $\varphi_X(t)$ is $n$ times differentiable at the point $t=0$, then
if $n$ is even, the $k$-th moment of $X$ exists and is finite for any $k\leq n$;
if $n$ is odd, the $k$-th moment of $X$ exists and is finite for any $k\leq n-1$.
In both cases,
$$\mu_k=\frac{1}{i^k}\left.\frac{d^k\varphi_X(t)}{dt^k}\right|_{t=0}$$
where $\left.\frac{d^k\varphi_X(t)}{dt^k}\right|_{t=0}$ is the $k$-th derivative of $\varphi_X(t)$ with respect to $t$, evaluated at the point $t=0$.
See, e.g., Ushakov (1999).
The next example shows how this proposition can be used to compute the second moment of an exponential random variable.
Example Let $X$ be an exponential random variable with parameter $\lambda\in\mathbb{R}_{++}$. Its support is the set of positive real numbers:
$$R_X=[0,\infty)$$
and its probability density function is
$$f_X(x)=\begin{cases}\lambda e^{-\lambda x} & \text{if } x\in R_X\\ 0 & \text{otherwise}\end{cases}$$
Its cf is
$$\varphi_X(t)=\frac{\lambda}{\lambda-it}$$
which is proved in the lecture entitled Exponential distribution. Note that the division above does not pose any division-by-zero problem, because the denominator is different from $0$ also when $t=0$ (because $\lambda>0$). The first derivative of the cf is
$$\frac{d\varphi_X(t)}{dt}=\frac{i\lambda}{(\lambda-it)^2}$$
The second derivative of the cf is
$$\frac{d^2\varphi_X(t)}{dt^2}=-\frac{2\lambda}{(\lambda-it)^3}$$
Evaluating it at $t=0$, we obtain
$$\left.\frac{d^2\varphi_X(t)}{dt^2}\right|_{t=0}=-\frac{2\lambda}{\lambda^3}=-\frac{2}{\lambda^2}$$
Therefore, the second moment of $X$ exists and is finite. Furthermore, it can be computed as
$$\mathrm{E}\left[X^2\right]=\frac{1}{i^2}\left.\frac{d^2\varphi_X(t)}{dt^2}\right|_{t=0}=\frac{1}{-1}\left(-\frac{2}{\lambda^2}\right)=\frac{2}{\lambda^2}$$
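The computation in this example can be reproduced with a computer algebra system. The following sketch (an illustration added here, not part of the original derivation) uses sympy to differentiate the exponential cf twice at $t=0$ and recover $\mathrm{E}\left[X^2\right]=2/\lambda^2$.

```python
import sympy as sp

t = sp.symbols('t', real=True)
lam = sp.symbols('lambda', positive=True)

# cf of an exponential random variable with parameter lambda
phi = lam / (lam - sp.I * t)

# second moment: (1 / i^2) times the second derivative of the cf at t = 0
second_moment = sp.simplify(sp.diff(phi, t, 2).subs(t, 0) / sp.I**2)
print(second_moment)  # 2/lambda**2
```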
Characteristic functions, like moment generating functions, can also be used to characterize the distribution of a random variable.
Proposition Let $X$ and $Y$ be two random variables. Denote by $F_X(x)$ and $F_Y(y)$ their distribution functions and by $\varphi_X(t)$ and $\varphi_Y(t)$ their cfs. Then, $X$ and $Y$ have the same distribution, i.e., $F_X(x)=F_Y(x)$ for any $x$, if and only if they have the same cf, i.e., $\varphi_X(t)=\varphi_Y(t)$ for any $t$.
See, e.g., Resnick (2013).
In applications, this proposition is often used to prove that two distributions are equal, especially when it is too difficult to directly prove the equality of the two distribution functions $F_X$ and $F_Y$.
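As a rough numerical counterpart of this idea, the sketch below (an addition for illustration) compares the empirical cfs of two samples that should follow the same exponential distribution, one drawn directly and one obtained by the inverse-transform method; the distribution, parameter, grid of $t$ values, and sample size are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, n = 2.0, 200_000

# two constructions that should yield the same Exp(lambda) distribution
x = rng.exponential(scale=1 / lam, size=n)   # direct sampling
y = -np.log(rng.uniform(size=n)) / lam       # inverse-transform method

# empirical cfs on a grid of t values: (1/n) * sum_j exp(i t x_j)
ts = np.linspace(-5, 5, 11)
phi_x = np.array([np.mean(np.exp(1j * t * x)) for t in ts])
phi_y = np.array([np.mean(np.exp(1j * t * y)) for t in ts])

# the two empirical cfs should agree up to Monte Carlo error
print(np.max(np.abs(phi_x - phi_y)))
```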
The following sections contain more details about the characteristic function.
Let $X$ be a random variable with cf $\varphi_X(t)$.
Define
$$Y=a+bX$$
where $a,b\in\mathbb{R}$ are two constants and $b\neq 0$.
Then, the cf of $Y$ is
$$\varphi_Y(t)=e^{iat}\varphi_X(bt)$$
Using the definition of cf, we obtain
$$\varphi_Y(t)=\mathrm{E}\left[e^{itY}\right]=\mathrm{E}\left[e^{it(a+bX)}\right]=\mathrm{E}\left[e^{iat}e^{ibtX}\right]=e^{iat}\,\mathrm{E}\left[e^{i(bt)X}\right]=e^{iat}\varphi_X(bt)$$
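A quick Monte Carlo check of this rule is sketched below; the values $\lambda=2$, $a=1$, $b=3$ and the point $t=0.7$ are illustrative assumptions, and the exact cf of the exponential distribution is used on the formula side.

```python
import numpy as np

rng = np.random.default_rng(2)
lam, a, b = 2.0, 1.0, 3.0  # illustrative parameter choices
x = rng.exponential(scale=1 / lam, size=200_000)
y = a + b * x              # linear transformation of X

t = 0.7
phi_y_empirical = np.mean(np.exp(1j * t * y))
# formula: exp(iat) * phi_X(bt), with the exact cf of Exp(lambda)
phi_y_formula = np.exp(1j * a * t) * lam / (lam - 1j * b * t)

print(phi_y_empirical)  # agrees with phi_y_formula up to Monte Carlo error
print(phi_y_formula)
```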
Let $X_1$, ..., $X_n$ be $n$ mutually independent random variables.
Let $Z$ be their sum:
$$Z=\sum_{j=1}^{n}X_j$$
Then, the cf of $Z$ is the product of the cfs of $X_1$, ..., $X_n$:
$$\varphi_Z(t)=\prod_{j=1}^{n}\varphi_{X_j}(t)$$
It can be demonstrated as follows:
$$\varphi_Z(t)=\mathrm{E}\left[e^{itZ}\right]=\mathrm{E}\left[e^{it\sum_{j=1}^{n}X_j}\right]=\mathrm{E}\left[\prod_{j=1}^{n}e^{itX_j}\right]=\prod_{j=1}^{n}\mathrm{E}\left[e^{itX_j}\right]=\prod_{j=1}^{n}\varphi_{X_j}(t)$$
where the expectation of the product equals the product of the expectations because the random variables $e^{itX_1}$, ..., $e^{itX_n}$ are mutually independent.
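The product rule can also be checked numerically. The following sketch sums two independent exponential variables with parameters $1$ and $2$ (an assumed example, not taken from the lecture) and compares the empirical cf of the sum with the product of the two exact cfs.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
x1 = rng.exponential(scale=1.0, size=n)  # Exp(1)
x2 = rng.exponential(scale=0.5, size=n)  # Exp(2), independent of x1
z = x1 + x2

t = 0.9
phi_z_empirical = np.mean(np.exp(1j * t * z))
# product of the exact cfs lambda / (lambda - it), with lambda = 1 and 2
phi_z_product = (1 / (1 - 1j * t)) * (2 / (2 - 1j * t))

print(phi_z_empirical)  # agrees with phi_z_product up to Monte Carlo error
print(phi_z_product)
```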
When $X$ is a discrete random variable with support $R_X$ and probability mass function $p_X(x)$, its cf is
$$\varphi_X(t)=\mathrm{E}\left[e^{itX}\right]=\sum_{x\in R_X}e^{itx}p_X(x)$$
Thus, the computation of the characteristic function is pretty straightforward: all we need to do is to sum the complex numbers $e^{itx}p_X(x)$ over all values of $x$ belonging to the support of $X$.
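For instance, here is a minimal Python sketch that carries out this sum for an assumed illustrative distribution, a fair six-sided die:

```python
import numpy as np

# assumed illustrative pmf: a fair six-sided die
support = np.arange(1, 7)
pmf = np.full(6, 1 / 6)

def cf_discrete(t):
    # sum the complex numbers exp(i*t*x) * p_X(x) over the support
    return np.sum(np.exp(1j * t * support) * pmf)

print(cf_discrete(0.0))  # equals 1 at t = 0 for any distribution
print(cf_discrete(1.0))
```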
When $X$ is a continuous random variable with probability density function $f_X(x)$, its cf is
$$\varphi_X(t)=\mathrm{E}\left[e^{itX}\right]=\int_{-\infty}^{\infty}e^{itx}f_X(x)\,dx$$
The right-hand side integral is a contour integral of a complex function along the real axis.
As people reading these lecture notes are usually not familiar with contour integration (a topic in complex analysis), we avoid it altogether and instead exploit the fact that
$$e^{itx}=\cos(tx)+i\sin(tx)$$
to rewrite the contour integral as the complex sum of two ordinary integrals:
$$\int_{-\infty}^{\infty}e^{itx}f_X(x)\,dx=\int_{-\infty}^{\infty}\cos(tx)f_X(x)\,dx+i\int_{-\infty}^{\infty}\sin(tx)f_X(x)\,dx$$
and to compute the two integrals separately.
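The following sketch implements this decomposition numerically for an assumed exponential density with $\lambda=2$, computing the two ordinary integrals with scipy and comparing the result with the closed form $\lambda/(\lambda-it)$.

```python
import numpy as np
from scipy.integrate import quad

lam = 2.0  # assumed Exp(lambda) example

def pdf(x):
    return lam * np.exp(-lam * x)

def cf_continuous(t):
    # exp(itx) = cos(tx) + i*sin(tx): integrate the two parts separately
    # over the support [0, inf) of the exponential density
    real_part, _ = quad(lambda x: np.cos(t * x) * pdf(x), 0, np.inf)
    imag_part, _ = quad(lambda x: np.sin(t * x) * pdf(x), 0, np.inf)
    return real_part + 1j * imag_part

t = 1.0
print(cf_continuous(t))
print(lam / (lam - 1j * t))  # closed form, for comparison
```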
The multivariate generalization of the cf is presented in the lecture on the joint characteristic function.
Below you can find some exercises with explained solutions.
Let $X$ be a discrete random variable having support $R_X$ and probability mass function $p_X(x)$.
Derive the characteristic function of $X$.
By using the definition of characteristic function, we get
$$\varphi_X(t)=\mathrm{E}\left[e^{itX}\right]=\sum_{x\in R_X}e^{itx}p_X(x)$$
Use the characteristic function found in the previous exercise to derive the variance of $X$.
We can use the following formula for computing the variance:
$$\mathrm{Var}[X]=\mathrm{E}\left[X^2\right]-\left(\mathrm{E}[X]\right)^2$$
The expected value of $X$ is computed by taking the first derivative of the characteristic function, evaluating it at $t=0$, and dividing it by $i$:
$$\mathrm{E}[X]=\frac{1}{i}\left.\frac{d\varphi_X(t)}{dt}\right|_{t=0}$$
The second moment of $X$ is computed by taking the second derivative of the characteristic function, evaluating it at $t=0$, and dividing it by $i^2$:
$$\mathrm{E}\left[X^2\right]=\frac{1}{i^2}\left.\frac{d^2\varphi_X(t)}{dt^2}\right|_{t=0}$$
Therefore,
$$\mathrm{Var}[X]=\frac{1}{i^2}\left.\frac{d^2\varphi_X(t)}{dt^2}\right|_{t=0}-\left(\frac{1}{i}\left.\frac{d\varphi_X(t)}{dt}\right|_{t=0}\right)^2$$
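The same steps can be automated with a computer algebra system. Since the exercise's pmf is kept generic above, the sketch below uses an assumed stand-in pmf, $P(X=0)=2/3$ and $P(X=1)=1/3$, purely to illustrate the mechanics.

```python
import sympy as sp

t = sp.symbols('t', real=True)

# assumed stand-in pmf: P(X = 0) = 2/3, P(X = 1) = 1/3
phi = sp.Rational(2, 3) + sp.Rational(1, 3) * sp.exp(sp.I * t)

mean = sp.simplify(sp.diff(phi, t).subs(t, 0) / sp.I)                  # E[X]
second_moment = sp.simplify(sp.diff(phi, t, 2).subs(t, 0) / sp.I**2)   # E[X^2]
variance = sp.simplify(second_moment - mean**2)

print(mean, second_moment, variance)  # 1/3, 1/3, 2/9
```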
Read and try to understand how the characteristic functions of the uniform and of the exponential distributions are derived in the lectures entitled Uniform distribution and Exponential distribution.
Resnick, S. I. (2013) A Probability Path, Birkhäuser.
Ushakov, N. G. (1999) Selected Topics in Characteristic Functions, VSP.
Please cite as:
Taboga, Marco (2021). "Characteristic function", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/fundamentals-of-probability/characteristic-function.