Bayes' rule, named after the English mathematician Thomas Bayes, is a rule for computing conditional probabilities.
A formal statement of Bayes' rule follows.
Proposition
Let $A$ and $B$ be two events. Denote their probabilities by $\mathrm{P}(A)$ and $\mathrm{P}(B)$, and suppose that both $\mathrm{P}(A)>0$ and $\mathrm{P}(B)>0$. Denote by $\mathrm{P}(A\mid B)$ the conditional probability of $A$ given $B$ and by $\mathrm{P}(B\mid A)$ the conditional probability of $B$ given $A$. Bayes' rule states that
$$\mathrm{P}(A\mid B)=\frac{\mathrm{P}(B\mid A)\,\mathrm{P}(A)}{\mathrm{P}(B)}.$$
By the conditional probability formula, we have that
$$\mathrm{P}(A\mid B)=\frac{\mathrm{P}(A\cap B)}{\mathrm{P}(B)}$$
and
$$\mathrm{P}(B\mid A)=\frac{\mathrm{P}(A\cap B)}{\mathrm{P}(A)}.$$
The second formula can be re-arranged as follows:
$$\mathrm{P}(A\cap B)=\mathrm{P}(B\mid A)\,\mathrm{P}(A),$$
which is then plugged into the first formula, so as to obtain
$$\mathrm{P}(A\mid B)=\frac{\mathrm{P}(B\mid A)\,\mathrm{P}(A)}{\mathrm{P}(B)}.$$
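The rule translates directly into a one-line computation. The sketch below is a minimal Python helper (the function name bayes_rule and its argument names are ours, not from the text) that returns $\mathrm{P}(A\mid B)$ from the prior $\mathrm{P}(A)$, the likelihood $\mathrm{P}(B\mid A)$, and the marginal $\mathrm{P}(B)$:

```python
def bayes_rule(prior_a: float, likelihood_b_given_a: float, marginal_b: float) -> float:
    """Compute P(A|B) = P(B|A) * P(A) / P(B)."""
    if marginal_b <= 0:
        raise ValueError("P(B) must be strictly positive")
    return likelihood_b_given_a * prior_a / marginal_b

# Example: P(A) = 0.3, P(B|A) = 0.5, P(B) = 0.4  ==>  P(A|B) = 0.375
print(bayes_rule(0.3, 0.5, 0.4))
```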
The following example shows how Bayes' rule can be applied in a practical situation.
We built a robot that can detect defective items produced in our factory:
if an item is defective, it is spotted with 98% probability by the robot;
when an item is not defective, the robot will not signal any defect with 99% probability.
We draw an item at random from a production lot in which 0.1% of items are defective.
If the robot tells us that the drawn item is defective, what is the probability that the robot is right?
In probabilistic terms, what we know about this problem can be formalized as follows:
$$\mathrm{P}(\text{defective})=0.001,\qquad \mathrm{P}(\text{signal}\mid \text{defective})=0.98,\qquad \mathrm{P}(\text{signal}\mid \text{not defective})=1-0.99=0.01.$$
Furthermore, the unconditional probability that the robot signals a defective item can be derived using the law of total probability:
$$\begin{aligned}\mathrm{P}(\text{signal})&=\mathrm{P}(\text{signal}\mid \text{defective})\,\mathrm{P}(\text{defective})+\mathrm{P}(\text{signal}\mid \text{not defective})\,\mathrm{P}(\text{not defective})\\&=0.98\cdot 0.001+0.01\cdot 0.999=0.01097.\end{aligned}$$
Therefore, Bayes' rule gives
$$\mathrm{P}(\text{defective}\mid \text{signal})=\frac{\mathrm{P}(\text{signal}\mid \text{defective})\,\mathrm{P}(\text{defective})}{\mathrm{P}(\text{signal})}=\frac{0.98\cdot 0.001}{0.01097}\approx 0.0893.$$
Even if the robot is conditionally very accurate, the unconditional probability that it is right when it signals a defective item is less than 10 per cent!
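The arithmetic above can be checked with a few lines of Python. This is just a sketch reproducing the numbers of the example; the variable names are ours:

```python
# Data from the robot example
p_defective = 0.001               # P(defective) = 0.1%
p_signal_given_defective = 0.98   # P(signal | defective)
p_signal_given_ok = 1 - 0.99      # P(signal | not defective) = 0.01

# Law of total probability: P(signal)
p_signal = (p_signal_given_defective * p_defective
            + p_signal_given_ok * (1 - p_defective))

# Bayes' rule: P(defective | signal)
p_defective_given_signal = p_signal_given_defective * p_defective / p_signal

print(p_signal)                  # ~0.01097
print(p_defective_given_signal)  # ~0.0893, i.e. less than 10%
```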
The quantities involved in Bayes' rule often take the following names:

- $\mathrm{P}(A)$ is called prior probability or, simply, prior;

- $\mathrm{P}(B\mid A)$ is called conditional probability or likelihood;

- $\mathrm{P}(B)$ is called marginal probability;

- $\mathrm{P}(A\mid B)$ is called posterior probability or, simply, posterior.
Below you can find some problems with explained solutions.
There are two urns containing colored balls. The first urn contains 50 red balls and 50 blue balls. The second urn contains 30 red balls and 70 blue balls. One of the two urns is randomly chosen (both urns have probability 50% of being chosen) and then a ball is drawn at random from one of the two urns. If a red ball is drawn, what is the probability that it comes from the first urn?
In probabilistic terms, what we know about this problem can be formalized as follows:
$$\mathrm{P}(\text{urn 1})=\mathrm{P}(\text{urn 2})=\frac{1}{2},\qquad \mathrm{P}(\text{red}\mid \text{urn 1})=\frac{50}{100}=\frac{1}{2},\qquad \mathrm{P}(\text{red}\mid \text{urn 2})=\frac{30}{100}=\frac{3}{10}.$$
The unconditional probability of drawing a red ball can be derived using the law of total probability:
$$\mathrm{P}(\text{red})=\mathrm{P}(\text{red}\mid \text{urn 1})\,\mathrm{P}(\text{urn 1})+\mathrm{P}(\text{red}\mid \text{urn 2})\,\mathrm{P}(\text{urn 2})=\frac{1}{2}\cdot\frac{1}{2}+\frac{3}{10}\cdot\frac{1}{2}=\frac{2}{5}.$$
By using Bayes' rule, we obtain
$$\mathrm{P}(\text{urn 1}\mid \text{red})=\frac{\mathrm{P}(\text{red}\mid \text{urn 1})\,\mathrm{P}(\text{urn 1})}{\mathrm{P}(\text{red})}=\frac{\frac{1}{2}\cdot\frac{1}{2}}{\frac{2}{5}}=\frac{5}{8}.$$
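As with the previous example, the fractions can be verified numerically. A small sketch using Python's fractions module (variable names are ours) keeps the arithmetic exact:

```python
from fractions import Fraction

# Data from the two-urn exercise
p_urn1 = Fraction(1, 2)
p_urn2 = Fraction(1, 2)
p_red_given_urn1 = Fraction(50, 100)
p_red_given_urn2 = Fraction(30, 100)

# Law of total probability: P(red)
p_red = p_red_given_urn1 * p_urn1 + p_red_given_urn2 * p_urn2

# Bayes' rule: P(urn 1 | red)
p_urn1_given_red = p_red_given_urn1 * p_urn1 / p_red

print(p_red)             # 2/5
print(p_urn1_given_red)  # 5/8
```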
An economics consulting firm has created a model to predict recessions. The model predicts a recession with probability 80% when a recession is indeed coming and with probability 10% when no recession is coming. The unconditional probability of falling into a recession is 20%. If the model predicts a recession, what is the probability that a recession will indeed come?
What we know about this problem can be formalized as follows:
$$\mathrm{P}(\text{recession})=0.2,\qquad \mathrm{P}(\text{prediction}\mid \text{recession})=0.8,\qquad \mathrm{P}(\text{prediction}\mid \text{no recession})=0.1.$$
The unconditional probability of predicting a recession can be derived by using the law of total probability:
$$\mathrm{P}(\text{prediction})=\mathrm{P}(\text{prediction}\mid \text{recession})\,\mathrm{P}(\text{recession})+\mathrm{P}(\text{prediction}\mid \text{no recession})\,\mathrm{P}(\text{no recession})=0.8\cdot 0.2+0.1\cdot 0.8=0.24.$$
Bayes' rule implies
$$\mathrm{P}(\text{recession}\mid \text{prediction})=\frac{\mathrm{P}(\text{prediction}\mid \text{recession})\,\mathrm{P}(\text{recession})}{\mathrm{P}(\text{prediction})}=\frac{0.8\cdot 0.2}{0.24}=\frac{2}{3}\approx 0.67.$$
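Again, a short Python check of the numbers (a sketch; the variable names are ours):

```python
# Data from the recession-model exercise
p_recession = 0.2
p_pred_given_recession = 0.8
p_pred_given_no_recession = 0.1

# Law of total probability: P(prediction)
p_pred = (p_pred_given_recession * p_recession
          + p_pred_given_no_recession * (1 - p_recession))

# Bayes' rule: P(recession | prediction)
p_recession_given_pred = p_pred_given_recession * p_recession / p_pred

print(p_pred)                  # ~0.24
print(p_recession_given_pred)  # ~0.667
```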
Alice has two coins in her pocket, a fair coin (head on one side and tail on the other side) and a two-headed coin. She picks one at random from her pocket, tosses it and obtains head. What is the probability that she flipped the fair coin?
What we know about this problem can be formalized as follows:
$$\mathrm{P}(\text{fair})=\mathrm{P}(\text{two-headed})=\frac{1}{2},\qquad \mathrm{P}(\text{head}\mid \text{fair})=\frac{1}{2},\qquad \mathrm{P}(\text{head}\mid \text{two-headed})=1.$$
The unconditional probability of obtaining head can be derived by using the law of total probability:
$$\mathrm{P}(\text{head})=\mathrm{P}(\text{head}\mid \text{fair})\,\mathrm{P}(\text{fair})+\mathrm{P}(\text{head}\mid \text{two-headed})\,\mathrm{P}(\text{two-headed})=\frac{1}{2}\cdot\frac{1}{2}+1\cdot\frac{1}{2}=\frac{3}{4}.$$
With Bayes' rule, we obtain
$$\mathrm{P}(\text{fair}\mid \text{head})=\frac{\mathrm{P}(\text{head}\mid \text{fair})\,\mathrm{P}(\text{fair})}{\mathrm{P}(\text{head})}=\frac{\frac{1}{2}\cdot\frac{1}{2}}{\frac{3}{4}}=\frac{1}{3}.$$
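A final exact check with Python's fractions module (again a sketch, with variable names of our choosing):

```python
from fractions import Fraction

# Data from the two-coin exercise
p_fair = Fraction(1, 2)
p_two_headed = Fraction(1, 2)
p_head_given_fair = Fraction(1, 2)
p_head_given_two_headed = Fraction(1)

# Law of total probability: P(head)
p_head = p_head_given_fair * p_fair + p_head_given_two_headed * p_two_headed

# Bayes' rule: P(fair | head)
p_fair_given_head = p_head_given_fair * p_fair / p_head

print(p_head)             # 3/4
print(p_fair_given_head)  # 1/3
```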
If you want to know more about Bayes' rule and how it is used, you can read the following pages:
prior probability, where you can find another example and an introduction to the concept of prior distribution;
posterior probability, which also contains an example;
Bayesian inference, which introduces an important branch of statistics entirely based on Bayes' rule.
Please cite as:
Taboga, Marco (2021). "Bayes' rule", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/fundamentals-of-probability/Bayes-rule.