In the lecture entitled Characteristic function, we introduced the concept of the characteristic function (cf) of a random variable. This lecture deals with the joint cf, the analogous concept for random vectors.
Definition Let $X$ be a $K \times 1$ random vector. The joint characteristic function of $X$ is a function $\varphi_X : \mathbb{R}^K \rightarrow \mathbb{C}$ defined by
$$\varphi_X(t) = \operatorname{E}\!\left[\exp\!\left(i\,t^\top X\right)\right]$$
where $i$ is the imaginary unit.
Observe that $\varphi_X(t)$ exists for any $t \in \mathbb{R}^K$ because
$$\operatorname{E}\!\left[\exp\!\left(i\,t^\top X\right)\right] = \operatorname{E}\!\left[\cos\!\left(t^\top X\right)\right] + i\,\operatorname{E}\!\left[\sin\!\left(t^\top X\right)\right]$$
and the expected values appearing in the last line are well-defined, because both the sine and the cosine are bounded (they take values in the interval $[-1,1]$).
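As a rough numerical illustration, the expected value in the definition can be approximated by a sample mean. The following sketch (assuming NumPy; the mean vector, covariance matrix and evaluation point are arbitrary illustrative choices) compares a Monte Carlo estimate of the joint cf of a bivariate normal vector with its known closed form $\exp(i\,t^\top \mu - \frac{1}{2} t^\top \Sigma t)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw a large sample from a bivariate normal with known mean and covariance.
mu = np.array([1.0, -0.5])
Sigma = np.array([[2.0, 0.3],
                  [0.3, 1.0]])
X = rng.multivariate_normal(mu, Sigma, size=200_000)

t = np.array([0.4, -0.7])

# Monte Carlo estimate of the joint cf: the sample mean of exp(i t'X).
cf_mc = np.mean(np.exp(1j * X @ t))

# Closed-form joint cf of a multivariate normal: exp(i t'mu - t'Sigma t / 2).
cf_exact = np.exp(1j * t @ mu - 0.5 * t @ Sigma @ t)

print(cf_mc)     # approximately equal to cf_exact
print(cf_exact)
```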
Proposition Let $X$ be a $K \times 1$ random vector and $\varphi_X(t)$ its joint characteristic function. Let $n \in \mathbb{N}$. Define a cross-moment of order $n$ as follows:
$$\mu(n_1, \ldots, n_K) = \operatorname{E}\!\left[X_1^{n_1} X_2^{n_2} \cdots X_K^{n_K}\right]$$
where $n_1, \ldots, n_K \in \mathbb{Z}_+$ and $n_1 + n_2 + \cdots + n_K = n$. If all cross-moments of order $n$ exist and are finite, then all the $n$-th order partial derivatives of $\varphi_X(t)$ exist and
$$\mu(n_1, \ldots, n_K) = \frac{1}{i^n} \left. \frac{\partial^n \varphi_X(t)}{\partial t_1^{n_1} \partial t_2^{n_2} \cdots \partial t_K^{n_K}} \right|_{t=0}$$
where the partial derivative on the right-hand side of the equation is evaluated at the point $t_1 = 0$, $t_2 = 0$, ..., $t_K = 0$. For a proof, see Ushakov (1999).
The practical usefulness of this proposition is somewhat limited when we need to derive a cross-moment of a random vector, because it is seldom known a priori whether the cross-moments of a given order exist. The following proposition, instead, does not require such a priori knowledge.
Proposition Let $X$ be a $K \times 1$ random vector and $\varphi_X(t)$ its joint cf. If all the $n$-th order partial derivatives of $\varphi_X(t)$ exist, then
if $n$ is even, for any $m \leq n$, all the $m$-th cross-moments of $X$ exist and are finite;
if $n$ is odd, for any $m \leq n-1$, all the $m$-th cross-moments of $X$ exist and are finite.
In both cases, we have that
$$\mu(n_1, \ldots, n_K) = \frac{1}{i^m} \left. \frac{\partial^m \varphi_X(t)}{\partial t_1^{n_1} \partial t_2^{n_2} \cdots \partial t_K^{n_K}} \right|_{t=0}$$
where $n_1 + n_2 + \cdots + n_K = m$ and the partial derivatives on the right-hand side of the equation are evaluated at the point $t_1 = 0$, $t_2 = 0$, ..., $t_K = 0$.
Again, see Ushakov (1999).
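As a rough symbolic check of the derivative formula, the following sketch (assuming SymPy; the joint cf of two independent standard normal variables, $\exp(-(t_1^2+t_2^2)/2)$, is used as an illustrative example, and cross_moment is a hypothetical helper) recovers some cross-moments by differentiation:

```python
import sympy as sp

t1, t2 = sp.symbols('t1 t2', real=True)
i = sp.I

# Joint cf of two independent standard normal variables.
phi = sp.exp(-(t1**2 + t2**2) / 2)

def cross_moment(n1, n2):
    """Cross-moment E[X1^n1 * X2^n2] via the derivative formula."""
    d = sp.diff(phi, t1, n1, t2, n2)
    return sp.simplify(d.subs({t1: 0, t2: 0}) / i**(n1 + n2))

print(cross_moment(2, 0))  # E[X1^2] = 1
print(cross_moment(1, 1))  # E[X1 X2] = 0
print(cross_moment(4, 0))  # E[X1^4] = 3
```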
The joint cf can also be used to check whether two random vectors have the same distribution.
Proposition Let $X$ and $Y$ be two $K \times 1$ random vectors. Denote by $F_X(x)$ and $F_Y(y)$ their joint distribution functions and by $\varphi_X(t)$ and $\varphi_Y(t)$ their joint cfs. Then,
$$F_X(x) = F_Y(x) \ \text{for all } x \in \mathbb{R}^K \iff \varphi_X(t) = \varphi_Y(t) \ \text{for all } t \in \mathbb{R}^K$$
Stated differently, two random vectors have the same distribution if and only if they have the same joint cf. This result is frequently used in applications, because demonstrating equality of two joint cfs is often much easier than demonstrating equality of two joint distribution functions.
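A minimal numerical illustration of this result, assuming NumPy: the sketch below builds the same bivariate normal distribution in two different ways (a linear transformation of standard normals, and a direct draw with covariance $AA^\top$, where $A$ is an arbitrary illustrative matrix) and checks that the empirical joint cfs agree up to sampling noise:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Two different constructions of the same bivariate normal distribution:
# (1) transform independent standard normals with a matrix A;
# (2) sample directly with covariance Sigma = A A'.
A = np.array([[1.0, 0.0],
              [0.5, 1.2]])
X = rng.standard_normal((n, 2)) @ A.T
Y = rng.multivariate_normal(np.zeros(2), A @ A.T, size=n)

# Empirical joint cfs agree (up to sampling noise) at any point t,
# consistent with the two vectors having the same distribution.
for t in [np.array([0.3, -0.8]), np.array([-1.0, 0.4])]:
    print(np.mean(np.exp(1j * X @ t)), np.mean(np.exp(1j * Y @ t)))
```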
The following sections contain more detail about the joint characteristic function.
Let $X$ be a $K \times 1$ random vector with joint characteristic function $\varphi_X(t)$. Define
$$Y = a + BX$$
where $a$ is an $L \times 1$ constant vector and $B$ is an $L \times K$ constant matrix. Then, the joint cf of $Y$ is
$$\varphi_Y(t) = \exp\!\left(i\,t^\top a\right) \varphi_X\!\left(B^\top t\right)$$
This is proved as follows:
$$\varphi_Y(t) = \operatorname{E}\!\left[\exp\!\left(i\,t^\top Y\right)\right] = \operatorname{E}\!\left[\exp\!\left(i\,t^\top (a + BX)\right)\right] = \exp\!\left(i\,t^\top a\right) \operatorname{E}\!\left[\exp\!\left(i\,(B^\top t)^\top X\right)\right] = \exp\!\left(i\,t^\top a\right) \varphi_X\!\left(B^\top t\right)$$
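A minimal numerical check of this property, assuming NumPy (the vector $a$, matrix $B$ and point $t$ are illustrative choices): the entries of $X$ are taken to be independent standard normals, so $\varphi_X(s) = \exp(-\frac{1}{2} s^\top s)$ is available in closed form:

```python
import numpy as np

rng = np.random.default_rng(2)

# X has independent standard normal entries, so phi_X(s) = exp(-s's/2).
X = rng.standard_normal((200_000, 3))
a = np.array([0.5, -1.0])
B = np.array([[1.0, 0.2, -0.3],
              [0.0, 1.5, 0.7]])
Y = a + X @ B.T

t = np.array([0.6, -0.4])

# Left-hand side: Monte Carlo estimate of phi_Y(t).
lhs = np.mean(np.exp(1j * Y @ t))

# Right-hand side: exp(i t'a) * phi_X(B't).
s = B.T @ t
rhs = np.exp(1j * t @ a) * np.exp(-0.5 * s @ s)

print(lhs, rhs)  # approximately equal
```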
Let $X$ be a $K \times 1$ random vector. Let its entries $X_1$, ..., $X_K$ be mutually independent random variables. Denote the cf of the $j$-th entry of $X$ by $\varphi_{X_j}(t_j)$.
Then, the joint cf of $X$ is
$$\varphi_X(t) = \prod_{j=1}^{K} \varphi_{X_j}(t_j)$$
This is demonstrated as follows:
$$\varphi_X(t) = \operatorname{E}\!\left[\exp\!\left(i\,t^\top X\right)\right] = \operatorname{E}\!\left[\exp\!\left(i \sum_{j=1}^{K} t_j X_j\right)\right] = \operatorname{E}\!\left[\prod_{j=1}^{K} \exp\!\left(i\,t_j X_j\right)\right] = \prod_{j=1}^{K} \operatorname{E}\!\left[\exp\!\left(i\,t_j X_j\right)\right] = \prod_{j=1}^{K} \varphi_{X_j}(t_j)$$
where the expected value of the product equals the product of the expected values because $X_1$, ..., $X_K$ are mutually independent.
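A minimal numerical check, assuming NumPy (the marginal distributions, an exponential with rate 2 and a standard normal, are illustrative choices whose cfs are known in closed form):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

# X1 ~ Exponential(rate=2), X2 ~ standard normal, independent.
X1 = rng.exponential(scale=0.5, size=n)   # scale = 1/rate
X2 = rng.standard_normal(n)

t1, t2 = 0.7, -1.1

# Empirical joint cf at (t1, t2).
joint = np.mean(np.exp(1j * (t1 * X1 + t2 * X2)))

# Product of the marginal cfs:
# Exponential(rate): rate / (rate - i t); standard normal: exp(-t^2 / 2).
product = (2 / (2 - 1j * t1)) * np.exp(-t2**2 / 2)

print(joint, product)  # approximately equal
```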
Let $X_1$, ..., $X_n$ be $K \times 1$ mutually independent random vectors. Let $Z$ be their sum:
$$Z = \sum_{j=1}^{n} X_j$$
Then, the joint cf of $Z$ is the product of the joint cfs of $X_1$, ..., $X_n$:
$$\varphi_Z(t) = \prod_{j=1}^{n} \varphi_{X_j}(t)$$
The proof is similar to the previous one:
$$\varphi_Z(t) = \operatorname{E}\!\left[\exp\!\left(i\,t^\top Z\right)\right] = \operatorname{E}\!\left[\exp\!\left(i\,t^\top \sum_{j=1}^{n} X_j\right)\right] = \operatorname{E}\!\left[\prod_{j=1}^{n} \exp\!\left(i\,t^\top X_j\right)\right] = \prod_{j=1}^{n} \operatorname{E}\!\left[\exp\!\left(i\,t^\top X_j\right)\right] = \prod_{j=1}^{n} \varphi_{X_j}(t)$$
where, again, the factorization of the expected value follows from mutual independence.
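A minimal numerical check, assuming NumPy (the two covariance matrices and the point $t$ are illustrative choices): for zero-mean multivariate normal vectors, $\varphi_{X_j}(t) = \exp(-\frac{1}{2} t^\top \Sigma_j t)$, so both sides of the equality can be computed:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200_000

# Two independent bivariate normal vectors with known covariances.
S1 = np.array([[1.0, 0.2], [0.2, 0.5]])
S2 = np.array([[0.8, -0.1], [-0.1, 1.2]])
X1 = rng.multivariate_normal(np.zeros(2), S1, size=n)
X2 = rng.multivariate_normal(np.zeros(2), S2, size=n)
Z = X1 + X2

t = np.array([0.5, -0.9])

# Empirical joint cf of the sum ...
lhs = np.mean(np.exp(1j * Z @ t))
# ... equals the product of the two joint cfs, exp(-t'S1 t/2) * exp(-t'S2 t/2).
rhs = np.exp(-0.5 * t @ S1 @ t) * np.exp(-0.5 * t @ S2 @ t)

print(lhs, rhs)  # approximately equal
```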
Some solved exercises on joint characteristic functions can be found below.
Let $X$ and $Y$ be two independent standard normal random variables. Let $Z$ be a $2 \times 1$ random vector whose components are defined as follows:
$$Z_1 = X^2, \qquad Z_2 = Y^2$$
Derive the joint characteristic function of $Z$.
Hint: use the fact that $X^2$ and $Y^2$ are two independent Chi-square random variables (with one degree of freedom) having characteristic function
$$\varphi(t) = (1 - 2it)^{-1/2}$$
By using the definition of joint characteristic function, we get
$$\varphi_Z(t_1, t_2) = \operatorname{E}\!\left[\exp\!\left(i\,(t_1 Z_1 + t_2 Z_2)\right)\right] = \operatorname{E}\!\left[\exp\!\left(i\,t_1 X^2\right) \exp\!\left(i\,t_2 Y^2\right)\right] = \operatorname{E}\!\left[\exp\!\left(i\,t_1 X^2\right)\right] \operatorname{E}\!\left[\exp\!\left(i\,t_2 Y^2\right)\right] = (1 - 2it_1)^{-1/2} (1 - 2it_2)^{-1/2}$$
where the third equality follows from the independence of $X$ and $Y$.
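A minimal Monte Carlo check of this result, assuming NumPy (the evaluation point is an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 500_000

X = rng.standard_normal(n)
Y = rng.standard_normal(n)
Z1, Z2 = X**2, Y**2

t1, t2 = 0.3, -0.6

# Empirical joint cf of (Z1, Z2).
empirical = np.mean(np.exp(1j * (t1 * Z1 + t2 * Z2)))

# Closed form derived above; complex powers use the principal branch.
exact = (1 - 2j * t1)**-0.5 * (1 - 2j * t2)**-0.5

print(empirical, exact)  # approximately equal
```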
Use the joint characteristic function found in the previous exercise to derive the expected value and the covariance matrix of $Z$.
We need to compute the partial derivatives of the joint characteristic function:
$$\frac{\partial \varphi_Z}{\partial t_1} = i (1-2it_1)^{-3/2} (1-2it_2)^{-1/2}$$
$$\frac{\partial \varphi_Z}{\partial t_2} = i (1-2it_1)^{-1/2} (1-2it_2)^{-3/2}$$
$$\frac{\partial^2 \varphi_Z}{\partial t_1^2} = -3 (1-2it_1)^{-5/2} (1-2it_2)^{-1/2}$$
$$\frac{\partial^2 \varphi_Z}{\partial t_2^2} = -3 (1-2it_1)^{-1/2} (1-2it_2)^{-5/2}$$
$$\frac{\partial^2 \varphi_Z}{\partial t_1 \partial t_2} = -(1-2it_1)^{-3/2} (1-2it_2)^{-3/2}$$
All partial derivatives up to the second order exist and are well defined. As a consequence, all cross-moments up to the second order exist and are finite, and they can be computed from the above partial derivatives:
$$\operatorname{E}[Z_1] = \frac{1}{i}\left.\frac{\partial \varphi_Z}{\partial t_1}\right|_{t=0} = \frac{i}{i} = 1, \qquad \operatorname{E}[Z_2] = 1$$
$$\operatorname{E}[Z_1^2] = \frac{1}{i^2}\left.\frac{\partial^2 \varphi_Z}{\partial t_1^2}\right|_{t=0} = \frac{-3}{-1} = 3, \qquad \operatorname{E}[Z_2^2] = 3$$
$$\operatorname{E}[Z_1 Z_2] = \frac{1}{i^2}\left.\frac{\partial^2 \varphi_Z}{\partial t_1 \partial t_2}\right|_{t=0} = \frac{-1}{-1} = 1$$
The covariances are derived as follows:
$$\operatorname{Var}[Z_1] = \operatorname{E}[Z_1^2] - \operatorname{E}[Z_1]^2 = 3 - 1 = 2, \qquad \operatorname{Var}[Z_2] = 2$$
$$\operatorname{Cov}[Z_1, Z_2] = \operatorname{E}[Z_1 Z_2] - \operatorname{E}[Z_1]\operatorname{E}[Z_2] = 1 - 1 = 0$$
So, summing up, we get
$$\operatorname{E}[Z] = \begin{bmatrix} 1 \\ 1 \end{bmatrix}, \qquad \operatorname{Var}[Z] = \begin{bmatrix} 2 & 0 \\ 0 & 2 \end{bmatrix}$$
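These computations can be reproduced symbolically. A minimal sketch, assuming SymPy (moment is a hypothetical helper implementing the derivative formula):

```python
import sympy as sp

t1, t2 = sp.symbols('t1 t2', real=True)
i = sp.I

# Joint cf derived in the previous exercise.
phi = (1 - 2*i*t1)**sp.Rational(-1, 2) * (1 - 2*i*t2)**sp.Rational(-1, 2)

def moment(n1, n2):
    """Cross-moment E[Z1^n1 * Z2^n2] from the derivative formula."""
    d = sp.diff(phi, t1, n1, t2, n2)
    return sp.simplify(d.subs({t1: 0, t2: 0}) / i**(n1 + n2))

EZ1, EZ2 = moment(1, 0), moment(0, 1)      # both equal 1
EZ1sq, EZ1Z2 = moment(2, 0), moment(1, 1)  # 3 and 1

print(EZ1, EZ2)          # 1 1
print(EZ1sq - EZ1**2)    # Var[Z1] = 2
print(EZ1Z2 - EZ1*EZ2)   # Cov[Z1, Z2] = 0
```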
Read and try to understand how the joint characteristic function of the multinomial distribution is derived in the lecture entitled Multinomial distribution.
Ushakov, N. G. (1999) Selected topics in characteristic functions, VSP.