 StatLect

Jensen's inequality

Jensen's inequality concerns the expected value of convex and concave transformations of a random variable.

Statement

The following is a formal statement of the inequality.

Proposition Let $X$ be an integrable random variable. Let $g$ be a convex function such that $g(X)$ is also integrable. Then, the following inequality, called Jensen's inequality, holds:
$$\mathrm{E}\left[g(X)\right] \geq g\left(\mathrm{E}\left[X\right]\right)$$

Proof

A function $g$ is convex if, for any point $x_0$, the graph of $g$ lies entirely above its tangent at the point $x_0$:
$$g(x) \geq g(x_0) + c\,(x - x_0)$$
where $c$ is the slope of the tangent. Setting $x = X$ and $x_0 = \mathrm{E}[X]$, the inequality becomes
$$g(X) \geq g\left(\mathrm{E}[X]\right) + c\,\left(X - \mathrm{E}[X]\right)$$
Taking the expected value of both sides of the inequality and using the fact that the expected value operator preserves inequalities, we obtain
$$\mathrm{E}\left[g(X)\right] \geq g\left(\mathrm{E}[X]\right) + c\,\left(\mathrm{E}[X] - \mathrm{E}[X]\right) = g\left(\mathrm{E}[X]\right)$$

If the function $g$ is strictly convex and $X$ is not almost surely constant, then we have a strict inequality:
$$\mathrm{E}\left[g(X)\right] > g\left(\mathrm{E}\left[X\right]\right)$$

Proof

A function $g$ is strictly convex if, for any point $x_0$, the graph of $g$ lies entirely above its tangent at the point $x_0$ (and strictly so for points different from $x_0$):
$$g(x) \geq g(x_0) + c\,(x - x_0)$$
where $c$ is the slope of the tangent. Setting $x = X$ and $x_0 = \mathrm{E}[X]$, the inequality becomes
$$g(X) \geq g\left(\mathrm{E}[X]\right) + c\,\left(X - \mathrm{E}[X]\right)$$
and, of course,
$$g(X) > g\left(\mathrm{E}[X]\right) + c\,\left(X - \mathrm{E}[X]\right)$$
when $X \neq \mathrm{E}[X]$. Taking the expected value of both sides of the inequality and using the fact that the expected value operator preserves inequalities, we obtain
$$\mathrm{E}\left[g(X)\right] > g\left(\mathrm{E}[X]\right) + c\,\left(\mathrm{E}[X] - \mathrm{E}[X]\right) = g\left(\mathrm{E}[X]\right)$$
where the first inequality is strict because we have assumed that $X$ is not almost surely constant and therefore the event $X = \mathrm{E}[X]$ does not have probability $1$.
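To see the inequality at work, the following sketch checks it numerically for the strictly convex function $g(x) = x^2$ on a small discrete distribution (the support and probabilities below are made up for illustration):

```python
# Numerical check of Jensen's inequality, E[g(X)] >= g(E[X]),
# for the strictly convex function g(x) = x**2.
# The discrete distribution below is a hypothetical example.

values = [1.0, 2.0, 5.0]   # support of X (illustrative)
probs  = [0.5, 0.3, 0.2]   # P(X = value); probabilities sum to 1

def expectation(f):
    """E[f(X)] for the discrete distribution defined above."""
    return sum(p * f(v) for v, p in zip(values, probs))

g = lambda x: x ** 2              # strictly convex transformation

lhs = expectation(g)              # E[X^2] = 6.7
rhs = g(expectation(lambda x: x)) # (E[X])^2 = 2.1^2 = 4.41

# X is not almost surely constant, so the inequality is strict.
assert lhs > rhs
```

Because $X$ takes three distinct values with positive probability, the strict version of the inequality applies, and indeed $6.7 > 4.41$.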

If the function $g$ is concave, then
$$\mathrm{E}\left[g(X)\right] \leq g\left(\mathrm{E}\left[X\right]\right)$$

Proof

If $g$ is concave, then $-g$ is convex, and by Jensen's inequality:
$$\mathrm{E}\left[-g(X)\right] \geq -g\left(\mathrm{E}\left[X\right]\right)$$
Multiplying both sides by $-1$ and using the linearity of the expected value, we obtain the result.

If the function $g$ is strictly concave and $X$ is not almost surely constant, then
$$\mathrm{E}\left[g(X)\right] < g\left(\mathrm{E}\left[X\right]\right)$$

Proof

Similar to the previous proof.

Example

Suppose a strictly positive random variable $X$ has expected value
$$\mathrm{E}\left[X\right] = 1$$
and it is not constant with probability one. What can we say about the expected value of $\ln(X)$, by using Jensen's inequality?

The natural logarithm is a strictly concave function, because its second derivative
$$\frac{d^{2}}{dx^{2}}\ln(x) = -\frac{1}{x^{2}}$$
is strictly negative on its domain of definition.

As a consequence, by Jensen's inequality, we have
$$\mathrm{E}\left[\ln(X)\right] < \ln\left(\mathrm{E}\left[X\right]\right) = \ln(1) = 0$$
Therefore, $\ln(X)$ has a strictly negative expected value.
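The conclusion can be verified on a concrete distribution. The sketch below uses a hypothetical two-point distribution with mean $1$ and checks that $\mathrm{E}[\ln(X)]$ comes out strictly negative:

```python
import math

# X takes the values 0.5 and 1.5 with equal probability, so E[X] = 1.
# This two-point distribution is a made-up example matching the
# assumptions above (strictly positive, mean 1, not a.s. constant).
values = [0.5, 1.5]
probs  = [0.5, 0.5]

mean  = sum(p * v for v, p in zip(values, probs))            # E[X] = 1
e_log = sum(p * math.log(v) for v, p in zip(values, probs))  # E[ln(X)]

# Jensen's inequality for the strictly concave logarithm:
# E[ln(X)] < ln(E[X]) = ln(1) = 0.
assert mean == 1.0
assert e_log < 0.0
```

Here $\mathrm{E}[\ln(X)] = \tfrac{1}{2}\ln(0.5 \cdot 1.5) = \tfrac{1}{2}\ln(0.75) \approx -0.144$, strictly below $\ln(\mathrm{E}[X]) = 0$.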

Exercise 1

Let $X$ be a strictly positive random variable, such that
$$\mathrm{Var}\left[X\right] > 0$$
What can you infer, using Jensen's inequality, about the following expected value?
$$\mathrm{E}\left[\sqrt{X}\right]$$

Solution

The function $g(x) = \sqrt{x}$ has first derivative
$$g'(x) = \frac{1}{2\sqrt{x}}$$
and second derivative
$$g''(x) = -\frac{1}{4\,x^{3/2}}$$
The second derivative is strictly negative on the domain of definition of the function. Therefore, the function is strictly concave. Furthermore, $X$ is not almost surely constant, because it has strictly positive variance. Hence, by Jensen's inequality:
$$\mathrm{E}\left[\sqrt{X}\right] < \sqrt{\mathrm{E}\left[X\right]}$$
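A quick numerical check of this conclusion, assuming the concave transformation in the exercise is the square root, on a made-up positive distribution with nonzero variance:

```python
import math

# Check E[sqrt(X)] < sqrt(E[X]) for a strictly positive X with
# positive variance. The transformation (square root) and the
# discrete distribution below are illustrative assumptions.
values = [1.0, 4.0, 9.0]
probs  = [0.2, 0.5, 0.3]

mean   = sum(p * v for v, p in zip(values, probs))             # E[X] = 4.9
e_sqrt = sum(p * math.sqrt(v) for v, p in zip(values, probs))  # E[sqrt(X)] = 2.1

# Strict Jensen's inequality for the strictly concave square root.
assert e_sqrt < math.sqrt(mean)
```

Here $\mathrm{E}[\sqrt{X}] = 2.1$ while $\sqrt{\mathrm{E}[X]} = \sqrt{4.9} \approx 2.214$, so the strict inequality holds as predicted.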
