We start this lecture with a definition of variance.
Definition Let $X$ be a random variable. Denote by $\operatorname{E}[\cdot]$ the expected value operator. The variance of $X$, denoted by $\operatorname{Var}[X]$, is defined as follows:
$$\operatorname{Var}[X]=\operatorname{E}\!\left[\left(X-\operatorname{E}[X]\right)^{2}\right]$$
provided the above expected value exists and is well-defined.
The variance of $X$ is also called the second central moment of $X$.
Variance is a measure of the dispersion of a random variable around its mean. Being the expected value of a squared quantity, variance is always non-negative. When a random variable is constant (whatever happens, it always takes on the same value), its variance is zero, because $X$ is always equal to its expected value $\operatorname{E}[X]$. Conversely, the larger the possible deviations of $X$ from its expected value $\operatorname{E}[X]$, the larger the variance of $X$.
To better understand how variance is computed, you can break up its computation into several steps (a numerical sketch follows the list):
compute $\operatorname{E}[X]$, the expected value of $X$;
construct a new random variable that measures how much the realizations of $X$ deviate from their expected value: $$Y=X-\operatorname{E}[X];$$
take the square of $Y$, so that positive and negative deviations from the mean having the same magnitude yield the same measure of distance from $\operatorname{E}[X]$: $$Y^{2}=\left(X-\operatorname{E}[X]\right)^{2};$$
finally, compute the expected value of the squared deviation $Y^{2}$ to know how much $X$ deviates on average from $\operatorname{E}[X]$: $$\operatorname{Var}[X]=\operatorname{E}\!\left[Y^{2}\right]=\operatorname{E}\!\left[\left(X-\operatorname{E}[X]\right)^{2}\right].$$
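The four steps can be mirrored in a few lines of code. The following is a minimal sketch, assuming a hypothetical discrete random variable with support $\{1,2,3\}$ and probabilities $1/4$, $1/2$, $1/4$ (these values are illustrative and do not come from the lecture):

```python
# Hypothetical discrete random variable (illustrative values only)
support = [1.0, 2.0, 3.0]
probs = [0.25, 0.50, 0.25]

# Step 1: expected value E[X]
mean = sum(x * p for x, p in zip(support, probs))

# Step 2: deviations of the realizations from E[X]
deviations = [x - mean for x in support]

# Step 3: squared deviations, so that opposite-sign deviations
# of equal magnitude contribute equally
squared_deviations = [d ** 2 for d in deviations]

# Step 4: expected value of the squared deviation, i.e. Var[X]
variance = sum(d2 * p for d2, p in zip(squared_deviations, probs))

print(mean)      # 2.0
print(variance)  # 0.5
```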
The following is a very important formula for computing variance:
$$\operatorname{Var}[X]=\operatorname{E}\!\left[X^{2}\right]-\operatorname{E}[X]^{2}$$
The above variance formula also makes clear that variance exists and is well-defined only as long as $\operatorname{E}[X]$ and $\operatorname{E}\!\left[X^{2}\right]$ exist and are well-defined.
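The formula follows from the definition by expanding the square and using the linearity of the expected value (a short derivation, with $\operatorname{E}[X]$ treated as a constant):
$$\begin{aligned}
\operatorname{Var}[X] &= \operatorname{E}\!\left[\left(X-\operatorname{E}[X]\right)^{2}\right] \\
&= \operatorname{E}\!\left[X^{2}-2X\operatorname{E}[X]+\operatorname{E}[X]^{2}\right] \\
&= \operatorname{E}\!\left[X^{2}\right]-2\operatorname{E}[X]\operatorname{E}[X]+\operatorname{E}[X]^{2} \\
&= \operatorname{E}\!\left[X^{2}\right]-\operatorname{E}[X]^{2}.
\end{aligned}$$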
The following example shows how to compute the variance of a discrete random variable using both the definition and the variance formula above.
Example Let $X$ be a discrete random variable with support $R_X$ and probability mass function $p_X(x)$. Its expected value is $$\operatorname{E}[X]=\sum_{x\in R_X}x\,p_X(x).$$ The expected value of its square is $$\operatorname{E}\!\left[X^{2}\right]=\sum_{x\in R_X}x^{2}\,p_X(x).$$ Its variance, by the formula above, is $$\operatorname{Var}[X]=\operatorname{E}\!\left[X^{2}\right]-\operatorname{E}[X]^{2}.$$ Alternatively, we can compute the variance of $X$ using the definition. Define a new random variable, the squared deviation of $X$ from $\operatorname{E}[X]$, as $$Y=\left(X-\operatorname{E}[X]\right)^{2}.$$ The support of $Y$ and its probability mass function $p_Y(y)$ are obtained from those of $X$, and the variance of $X$ equals the expected value of $Y$: $$\operatorname{Var}[X]=\operatorname{E}[Y]=\sum_{y\in R_Y}y\,p_Y(y).$$
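The two routes of the example, the definition and the shortcut formula, can also be compared numerically. A minimal sketch, assuming a hypothetical probability mass function (the values below are illustrative, not those of the example):

```python
from fractions import Fraction

# Hypothetical probability mass function (illustrative values only)
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

mean = sum(x * p for x, p in pmf.items())                 # E[X]
mean_of_square = sum(x ** 2 * p for x, p in pmf.items())  # E[X^2]

# Variance via the definition: E[(X - E[X])^2]
var_definition = sum((x - mean) ** 2 * p for x, p in pmf.items())

# Variance via the formula: E[X^2] - E[X]^2
var_formula = mean_of_square - mean ** 2

assert var_definition == var_formula
print(var_definition)  # 1/2
```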
The exercises at the bottom of this page provide more examples of how variance is computed.
The following subsections contain more details on variance.
The square root of variance is called standard deviation. The standard deviation of a random variable $X$ is usually denoted by $\sigma_X$ or by $\operatorname{std}[X]$: $$\operatorname{std}[X]=\sigma_X=\sqrt{\operatorname{Var}[X]}$$
Let $a\in\mathbb{R}$ be a constant and let $X$ be a random variable. Then, $$\operatorname{Var}[a+X]=\operatorname{Var}[X].$$
Thanks to the fact that $\operatorname{E}[a+X]=a+\operatorname{E}[X]$ (by linearity of the expected value), we have $$\operatorname{Var}[a+X]=\operatorname{E}\!\left[\left(a+X-\operatorname{E}[a+X]\right)^{2}\right]=\operatorname{E}\!\left[\left(a+X-a-\operatorname{E}[X]\right)^{2}\right]=\operatorname{E}\!\left[\left(X-\operatorname{E}[X]\right)^{2}\right]=\operatorname{Var}[X].$$
Let $b\in\mathbb{R}$ be a constant and let $X$ be a random variable. Then, $$\operatorname{Var}[bX]=b^{2}\operatorname{Var}[X].$$
Thanks to the fact that $\operatorname{E}[bX]=b\operatorname{E}[X]$ (by linearity of the expected value), we obtain $$\operatorname{Var}[bX]=\operatorname{E}\!\left[\left(bX-\operatorname{E}[bX]\right)^{2}\right]=\operatorname{E}\!\left[b^{2}\left(X-\operatorname{E}[X]\right)^{2}\right]=b^{2}\operatorname{E}\!\left[\left(X-\operatorname{E}[X]\right)^{2}\right]=b^{2}\operatorname{Var}[X].$$
Let $a,b\in\mathbb{R}$ be two constants and let $X$ be a random variable. Then, combining the two properties above, one obtains $$\operatorname{Var}[a+bX]=\operatorname{Var}[bX]=b^{2}\operatorname{Var}[X].$$
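These properties are easy to verify numerically. A minimal sketch, assuming an arbitrary finite discrete distribution and arbitrary constants $a$ and $b$ (the values below are illustrative):

```python
# Illustrative discrete distribution and constants
support = [0.0, 1.0, 2.0, 3.0]
probs = [0.1, 0.2, 0.3, 0.4]
a, b = 5.0, -2.0

def variance(values, probs):
    """Var[X] = E[(X - E[X])^2] for a finite discrete distribution."""
    mean = sum(v * p for v, p in zip(values, probs))
    return sum((v - mean) ** 2 * p for v, p in zip(values, probs))

var_x = variance(support, probs)

# Addition of a constant leaves the variance unchanged: Var[a + X] = Var[X]
assert abs(variance([a + x for x in support], probs) - var_x) < 1e-12

# Multiplication by a constant scales it by the square: Var[bX] = b^2 Var[X]
assert abs(variance([b * x for x in support], probs) - b ** 2 * var_x) < 1e-12

# Linear transformation: Var[a + bX] = b^2 Var[X]
assert abs(variance([a + b * x for x in support], probs) - b ** 2 * var_x) < 1e-12
```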
If $\operatorname{E}\!\left[X^{2}\right]$ exists and is finite, we say that $X$ is a square integrable random variable, or just that $X$ is square integrable. It can easily be proved that, if $X$ is square integrable, then $X$ is also integrable, that is, $\operatorname{E}[X]$ exists and is finite. Therefore, if $X$ is square integrable, then its variance exists and is finite.
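One way to see this (a sketch, using Jensen's inequality applied to the convex function $x\mapsto x^{2}$): $$\operatorname{E}\!\left[\left|X\right|\right]^{2}\le\operatorname{E}\!\left[\left|X\right|^{2}\right]=\operatorname{E}\!\left[X^{2}\right]<\infty,$$ so $\operatorname{E}\!\left[\left|X\right|\right]$ is finite and, as a consequence, $\operatorname{E}[X]$ exists and is finite.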
Below you can find some exercises with explained solutions.
Let $X$ be a discrete random variable with support $R_X$ and probability mass function $p_X(x)$. Compute its variance.
The expected value of $X$ is $$\operatorname{E}[X]=\sum_{x\in R_X}x\,p_X(x).$$ The expected value of $X^{2}$ is $$\operatorname{E}\!\left[X^{2}\right]=\sum_{x\in R_X}x^{2}\,p_X(x).$$ The variance of $X$ is $$\operatorname{Var}[X]=\operatorname{E}\!\left[X^{2}\right]-\operatorname{E}[X]^{2}.$$
Let $X$ be a discrete random variable with support $R_X$ and probability mass function $p_X(x)$. Compute its variance.
The expected value of $X$ is $$\operatorname{E}[X]=\sum_{x\in R_X}x\,p_X(x).$$ The expected value of $X^{2}$ is $$\operatorname{E}\!\left[X^{2}\right]=\sum_{x\in R_X}x^{2}\,p_X(x).$$ The variance of $X$ is $$\operatorname{Var}[X]=\operatorname{E}\!\left[X^{2}\right]-\operatorname{E}[X]^{2}.$$
Read and try to understand how the variance of a Poisson random variable is derived in the lecture entitled Poisson distribution.
Let $X$ be an absolutely continuous random variable with support $R_X$ and probability density function $f_X(x)$. Compute its variance.
The expected value of $X$ is $$\operatorname{E}[X]=\int_{R_X}x\,f_X(x)\,dx.$$ The expected value of $X^{2}$ is $$\operatorname{E}\!\left[X^{2}\right]=\int_{R_X}x^{2}\,f_X(x)\,dx.$$ The variance of $X$ is $$\operatorname{Var}[X]=\operatorname{E}\!\left[X^{2}\right]-\operatorname{E}[X]^{2}.$$
Let $X$ be an absolutely continuous random variable with support $R_X$ and probability density function $f_X(x)$. Compute its variance.
The expected value of $X$ is $$\operatorname{E}[X]=\int_{R_X}x\,f_X(x)\,dx.$$ The expected value of $X^{2}$ is $$\operatorname{E}\!\left[X^{2}\right]=\int_{R_X}x^{2}\,f_X(x)\,dx.$$ The variance of $X$ is $$\operatorname{Var}[X]=\operatorname{E}\!\left[X^{2}\right]-\operatorname{E}[X]^{2}.$$
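For absolutely continuous variables the two expectations are integrals rather than sums, and they can be checked by numerical integration. A minimal sketch, assuming a hypothetical density $f(x)=2x$ on the support $[0,1]$ (not the density of the exercises above):

```python
from scipy.integrate import quad

# Hypothetical density f(x) = 2x on [0, 1]; it integrates to 1 and is
# used only to illustrate the computation.
f = lambda x: 2.0 * x
a, b = 0.0, 1.0

mean, _ = quad(lambda x: x * f(x), a, b)                 # E[X]   = 2/3
mean_of_square, _ = quad(lambda x: x ** 2 * f(x), a, b)  # E[X^2] = 1/2

variance = mean_of_square - mean ** 2                    # Var[X] = 1/18
print(mean, variance)
```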
Read and try to understand how the variance of a Chi-square random variable is derived in the lecture entitled Chi-square distribution.