
Conditional probability

Let $\Omega$ be a sample space and let $P(E)$ denote the probability assigned to an event $E \subseteq \Omega$. Suppose that, after assigning the probabilities $P(E)$ to the events in $\Omega$, we receive new information about the things that will happen (the possible outcomes). In particular, suppose that we are told that the realized outcome will belong to a set $I \subseteq \Omega$. How should we revise the probabilities assigned to the events in $\Omega$ to properly take the new information into account?

Denote by $P(E \mid I)$ the revised probability assigned to an event $E \subseteq \Omega$ after learning that the realized outcome will be an element of $I$. $P(E \mid I)$ is called the conditional probability of $E$ given $I$.

Despite being an intuitive concept, conditional probability is quite difficult to define in a rigorous way. We take a gradual approach in this lecture. We first discuss conditional probability for the very special case in which all the sample points are equally likely. We then give a more general definition. Finally, we refer the reader to other lectures where conditional probability is defined in even more abstract ways.

The case of equally likely sample points

Suppose a sample space $\Omega$ has a finite number $n$ of sample points $\omega_1, \ldots, \omega_n$, i.e.:
$$\Omega = \{\omega_1, \ldots, \omega_n\}$$
Suppose also that each sample point is assigned the same probability:
$$P(\{\omega_i\}) = \frac{1}{n}, \quad i = 1, \ldots, n$$
In such a simple space, the probability of a generic event $E$ is obtained as:
$$P(E) = \frac{\mathrm{card}(E)}{\mathrm{card}(\Omega)}$$
where $\mathrm{card}$ denotes the cardinality of a set, i.e. the number of its elements. In other words, the probability of an event $E$ is obtained in two steps:

  1. counting the number of 'cases that are favorable to the event $E$', i.e. the number of elements $\omega_i$ belonging to $E$;

  2. dividing the number thus obtained by the number of 'all possible cases', i.e. the number of elements $\omega_i$ belonging to $\Omega$.

For example, if $\Omega = \{\omega_1, \omega_2, \omega_3\}$ and $E = \{\omega_1, \omega_2\}$, then:
$$P(E) = \frac{\mathrm{card}(E)}{\mathrm{card}(\Omega)} = \frac{2}{3}$$
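In code, the counting rule amounts to dividing two set cardinalities. Here is a minimal Python sketch of the example above (the sample-point labels are arbitrary illustrations):

```python
from fractions import Fraction

# A three-point sample space with equally likely outcomes (illustrative).
omega = {"w1", "w2", "w3"}
E = {"w1", "w2"}

# P(E) = card(E) / card(Omega)
p_E = Fraction(len(E), len(omega))
print(p_E)  # 2/3
```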

When we learn that the realized outcome will belong to a set $I \subseteq \Omega$, we still apply the rule:
$$P = \frac{\text{number of favorable cases}}{\text{number of all possible cases}}$$

However, the number of all possible cases is now equal to the number of elements of $I$, because only the outcomes belonging to $I$ are still possible. Furthermore, the number of favorable cases is now equal to the number of elements of $E \cap I$, because the outcomes in $E \cap I^c$ are no longer possible. As a consequence:
$$P(E \mid I) = \frac{\mathrm{card}(E \cap I)}{\mathrm{card}(I)}$$

Dividing numerator and denominator by $\mathrm{card}(\Omega)$, one obtains:
$$P(E \mid I) = \frac{\mathrm{card}(E \cap I) / \mathrm{card}(\Omega)}{\mathrm{card}(I) / \mathrm{card}(\Omega)} = \frac{P(E \cap I)}{P(I)}$$

Therefore, when all sample points are equally likely, conditional probabilities are computed as:
$$P(E \mid I) = \frac{P(E \cap I)}{P(I)}$$

Example Suppose that we toss a die. Six numbers (from $1$ to $6$) can appear face up, but we do not yet know which one of them will appear. The sample space is:
$$\Omega = \{1, 2, 3, 4, 5, 6\}$$
Each of the six numbers is a sample point and is assigned probability $\frac{1}{6}$. Define the event $E$ as follows:
$$E = \{1, 3, 5\}$$
where the event $E$ could be described as 'an odd number appears face up'. Now define the event $I$ as follows:
$$I = \{4, 5, 6\}$$
where the event $I$ could be described as 'a number greater than three appears face up'. The probability of $I$ is:
$$P(I) = \frac{\mathrm{card}(I)}{\mathrm{card}(\Omega)} = \frac{3}{6} = \frac{1}{2}$$
Suppose we are told that the realized outcome will belong to $I$. How do we have to revise our assessment of the probability of the event $E$, according to the rules of conditional probability? First of all, we need to compute the probability of the event $E \cap I$:
$$P(E \cap I) = P(\{5\}) = \frac{1}{6}$$
Then, the conditional probability of $E$ given $I$ is:
$$P(E \mid I) = \frac{P(E \cap I)}{P(I)} = \frac{1/6}{1/2} = \frac{1}{3}$$
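The computations in this example are easy to replicate with a few lines of Python. The sketch below verifies that the counting rule and the formula $P(E \mid I) = P(E \cap I)/P(I)$ give the same answer:

```python
from fractions import Fraction

# Die-toss example: all six outcomes are equally likely.
omega = {1, 2, 3, 4, 5, 6}
E = {1, 3, 5}  # 'an odd number appears face up'
I = {4, 5, 6}  # 'a number greater than three appears face up'

def p(A):
    """P(A) = card(A) / card(Omega) under equally likely sample points."""
    return Fraction(len(A), len(omega))

# Counting rule: P(E | I) = card(E intersect I) / card(I) ...
print(Fraction(len(E & I), len(I)))  # 1/3

# ... which coincides with P(E intersect I) / P(I).
print(p(E & I) / p(I))               # 1/3
```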

In the next section, we will show that the conditional probability formula
$$P(E \mid I) = \frac{P(E \cap I)}{P(I)}$$
is also valid in more general cases (i.e. when the sample points are not all equally likely). However, this formula already allows us to understand why defining conditional probability is a challenging task. In the conditional probability formula, a division by $P(I)$ is performed. This division is impossible when $I$ is a zero-probability event (i.e. $P(I) = 0$). If we want to be able to define $P(E \mid I)$ also when $P(I) = 0$, then we need to give a more complicated definition of conditional probability. We will return to this point later.

A more general approach

In this section we give a more general definition of conditional probability, by taking an axiomatic approach. First, we list the properties that we would like conditional probability to satisfy. Then, we prove that the conditional probability formula introduced above satisfies these properties. The discussion of the case in which the conditional probability formula cannot be used because $P(I) = 0$ is postponed to the next section.

The conditional probability $P(E \mid I)$ is required to satisfy the following properties:

  1. Probability measure. $P(\cdot \mid I)$ has to satisfy all the properties of a probability measure.

  2. Sure thing. $P(I \mid I) = 1$.

  3. Impossible events. If $E \subseteq I^c$ (where $I^c$, the complement of $I$ with respect to $\Omega$, is the set of all elements of $\Omega$ that do not belong to $I$), then $P(E \mid I) = 0$.

  4. Constant likelihood ratios on $I$. If $E \subseteq I$, $F \subseteq I$ and $P(F) > 0$, then:
$$\frac{P(E \mid I)}{P(F \mid I)} = \frac{P(E)}{P(F)}$$

These properties are very intuitive:

  1. Probability measure. This property requires that conditional probability measures also satisfy the fundamental properties that any other probability measure needs to satisfy.

  2. Sure thing. This property says that the probability of a sure thing must be 1: since we know that only outcomes belonging to the set $I$ can happen, the probability of $I$ must be 1.

  3. Impossible events. This property says that the probability of an impossible thing must be 0: since we know that outcomes not belonging to the set $I$ will not happen, the probability of the events that are disjoint from $I$ must be 0.

  4. Constant likelihood ratios on $I$. This property is a bit more complex: it says that if $F \subseteq I$ is, say, two times more likely than $E \subseteq I$ before receiving the information $I$, then $F$ remains two times more likely than $E$ after receiving the information, because all the outcomes in $E$ and $F$ remain possible (can still happen) and, hence, there is no reason to expect that the ratio of their likelihoods changes.
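These properties can also be sanity-checked numerically. The Python sketch below (a quick check on the die example above, not a proof) verifies properties 2), 3), and 4) for the candidate definition $P(E \mid I) = P(E \cap I)/P(I)$ by enumerating all events:

```python
from fractions import Fraction
from itertools import chain, combinations

omega = {1, 2, 3, 4, 5, 6}
I = {4, 5, 6}

def p(A):
    """Probability of event A under equally likely sample points."""
    return Fraction(len(A), len(omega))

def p_cond(E, I):
    """Candidate conditional probability P(E | I) = P(E intersect I) / P(I)."""
    return p(E & I) / p(I)

# All subsets of omega (every possible event).
events = [set(s) for s in chain.from_iterable(
    combinations(omega, r) for r in range(len(omega) + 1))]

# Property 2 (sure thing): P(I | I) = 1.
assert p_cond(I, I) == 1

# Property 3 (impossible events): P(E | I) = 0 whenever E is a subset of I^c.
Ic = omega - I
assert all(p_cond(E, I) == 0 for E in events if E <= Ic)

# Property 4 (constant likelihood ratios on I): for E, F subsets of I
# with P(F) > 0, P(E | I) / P(F | I) = P(E) / P(F).
for E in events:
    for F in events:
        if E <= I and F <= I and p(F) > 0:
            assert p_cond(E, I) / p_cond(F, I) == p(E) / p(F)

print("all property checks passed")
```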

It is possible to prove that:

Proposition (Conditional probability formula) Whenever $P(I) > 0$, $P(E \mid I)$ satisfies the four properties above if and only if:
$$P(E \mid I) = \frac{P(E \cap I)}{P(I)}$$

Proof

We first show that
$$P(E \mid I) = \frac{P(E \cap I)}{P(I)}$$
satisfies the four properties whenever $P(I) > 0$.

As far as property 1) is concerned, we have to check that the three requirements for a probability measure are satisfied. The first requirement is that $0 \leq P(E \mid I) \leq 1$. Since $E \cap I \subseteq I$, by the monotonicity of probability we have that:
$$P(E \cap I) \leq P(I)$$
hence:
$$P(E \mid I) = \frac{P(E \cap I)}{P(I)} \leq 1$$
Furthermore, since $P(E \cap I) \geq 0$ and $P(I) > 0$, also $P(E \mid I) \geq 0$. The second requirement is that $P(\Omega \mid I) = 1$. This is satisfied because:
$$P(\Omega \mid I) = \frac{P(\Omega \cap I)}{P(I)} = \frac{P(I)}{P(I)} = 1$$
The third requirement is that, for any sequence of disjoint sets $E_1, E_2, \ldots$, the following holds:
$$P\left(\left. \bigcup_{i=1}^{\infty} E_i \,\right\vert\, I\right) = \sum_{i=1}^{\infty} P(E_i \mid I)$$
But:
$$P\left(\left. \bigcup_{i=1}^{\infty} E_i \,\right\vert\, I\right) = \frac{P\left(\left(\bigcup_{i=1}^{\infty} E_i\right) \cap I\right)}{P(I)} = \frac{P\left(\bigcup_{i=1}^{\infty} \left(E_i \cap I\right)\right)}{P(I)} = \frac{\sum_{i=1}^{\infty} P(E_i \cap I)}{P(I)} = \sum_{i=1}^{\infty} P(E_i \mid I)$$
so that also the third requirement is satisfied (the sets $E_i \cap I$ are disjoint because the sets $E_i$ are disjoint).

Property 2) is trivially satisfied:
$$P(I \mid I) = \frac{P(I \cap I)}{P(I)} = \frac{P(I)}{P(I)} = 1$$

Property 3) is verified because, if $E \subseteq I^c$, then $E \cap I = \varnothing$ and:
$$P(E \mid I) = \frac{P(E \cap I)}{P(I)} = \frac{P(\varnothing)}{P(I)} = 0$$

Property 4) is verified because, if $E \subseteq I$, $F \subseteq I$ and $P(F) > 0$, then:
$$\frac{P(E \mid I)}{P(F \mid I)} = \frac{P(E \cap I)/P(I)}{P(F \cap I)/P(I)} = \frac{P(E \cap I)}{P(F \cap I)} = \frac{P(E)}{P(F)}$$
where the last equality holds because $E \cap I = E$ and $F \cap I = F$.

So, the 'if' part has been proved. Now we prove the 'only if' part, by contradiction. Suppose there exists another conditional probability $\bar{P}(E \mid I)$ that satisfies the four properties. Then, there exists an event $E$ such that:
$$\bar{P}(E \mid I) \neq \frac{P(E \cap I)}{P(I)}$$
It cannot be that $E \subseteq I$: otherwise, by property 2) and by property 4) (taking $F = I$), we would have:
$$\bar{P}(E \mid I) = \frac{\bar{P}(E \mid I)}{\bar{P}(I \mid I)} = \frac{P(E)}{P(I)} = \frac{P(E \cap I)}{P(I)}$$
which would be a contradiction. If $E$ is not a subset of $I$, then $\bar{P}(E \mid I) \neq \frac{P(E \cap I)}{P(I)}$ implies also:
$$\bar{P}(E \cap I \mid I) \neq \frac{P\left(\left(E \cap I\right) \cap I\right)}{P(I)}$$
because:
$$\bar{P}(E \mid I) = \bar{P}(E \cap I \mid I) + \bar{P}(E \cap I^c \mid I) = \bar{P}(E \cap I \mid I)$$
(by the additivity required by property 1) and by property 3), since $E \cap I^c \subseteq I^c$) and:
$$\frac{P(E \cap I)}{P(I)} = \frac{P\left(\left(E \cap I\right) \cap I\right)}{P(I)}$$
But this would also lead to a contradiction, because $E \cap I \subseteq I$, and we have just shown that no such inequality can hold for subsets of $I$.

Tackling division by zero

In the previous section we have generalized the concept of conditional probability. However, we have not been able to define the conditional probability $P(E \mid I)$ for the case in which $P(I) = 0$. This case is discussed in the lectures entitled Conditional probability as a random variable and Conditional probability distributions.

More details

The law of total probability

Let $I_1$, ..., $I_n$ be $n$ events having the following characteristics:

  1. they are mutually disjoint: $I_j \cap I_k = \varnothing$ whenever $j \neq k$;

  2. they cover all the sample space:
$$\bigcup_{j=1}^{n} I_j = \Omega$$

  3. they have strictly positive probability: $P(I_j) > 0$ for any $j$.

$I_1$, ..., $I_n$ is a partition of $\Omega$.

The law of total probability states that, for any event $E$, the following holds:
$$P(E) = \sum_{j=1}^{n} P(E \cap I_j)$$
which can, of course, also be written as:
$$P(E) = \sum_{j=1}^{n} P(E \mid I_j) P(I_j)$$

Proof

The law of total probability is proved as follows:
$$P(E) = P(E \cap \Omega) = P\left(E \cap \bigcup_{j=1}^{n} I_j\right) = P\left(\bigcup_{j=1}^{n} \left(E \cap I_j\right)\right) = \sum_{j=1}^{n} P(E \cap I_j) = \sum_{j=1}^{n} P(E \mid I_j) P(I_j)$$
where the fourth equality follows from additivity (the sets $E \cap I_j$ are disjoint because the sets $I_j$ are disjoint), and the last equality follows from the conditional probability formula, which can be applied because $P(I_j) > 0$ for all $j$.
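As a quick numerical illustration, the following Python sketch applies the law of total probability to a three-event partition (the probabilities are made up purely for the example):

```python
from fractions import Fraction

# Hypothetical partition I_1, I_2, I_3: P(I_j) are positive and sum to 1.
p_I = [Fraction(1, 2), Fraction(1, 3), Fraction(1, 6)]

# Illustrative conditional probabilities P(E | I_j).
p_E_given_I = [Fraction(1, 4), Fraction(1, 2), Fraction(1, 1)]

# Law of total probability: P(E) = sum_j P(E | I_j) * P(I_j)
p_E = sum(pe * pi for pe, pi in zip(p_E_given_I, p_I))
print(p_E)  # 11/24
```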

Solved exercises

Some solved exercises on conditional probability can be found below:

  1. Exercise set 1 (computation of conditional probability)
