This module introduces conditional probabilities and Bayes' rule.

If A and B are two separate but possibly dependent random events, then:

  • Probability of A and B occurring together = $P(A, B)$
  • The conditional probability of A, given that B occurs = $P(A|B)$
  • The conditional probability of B, given that A occurs = $P(B|A)$
From elementary rules of probability (Venn diagrams):
$$P(A, B) = P(A|B)\,P(B) = P(B|A)\,P(A)$$
Dividing the right-hand pair of expressions by $P(B)$ gives Bayes' rule:
$$P(A|B) = \frac{P(B|A)\,P(A)}{P(B)}$$
In problems of probabilistic inference, we are often trying to estimate the most probable underlying model for a random process, based on some observed data or evidence. If $A$ represents a given set of model parameters, and $B$ represents the set of observed data values, then the terms in Bayes' rule are given the following terminology:
  • $P(A)$ is the prior probability of the model $A$ (in the absence of any evidence);
  • $P(B)$ is the probability of the evidence $B$;
  • $P(B|A)$ is the likelihood that the evidence $B$ was produced, given that the model was $A$;
  • $P(A|B)$ is the posterior probability of the model being $A$, given that the evidence is $B$.
Quite often, we try to find the model $A$ which maximizes the posterior $P(A|B)$. This is known as maximum a posteriori or MAP model selection.
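As a quick numerical illustration (a minimal Python sketch, not part of the original module; the two model names and all probability values below are invented), Bayes' rule and MAP selection over a discrete set of candidate models can be computed directly:

```python
# Toy numerical check of Bayes' rule and MAP selection.
# All values here are invented for illustration only.
priors = {"model_1": 0.7, "model_2": 0.3}        # P(A) for each candidate model
likelihoods = {"model_1": 0.2, "model_2": 0.9}   # P(B|A): probability of the evidence under each model

# P(B) = sum over models of P(B|A) P(A)
p_evidence = sum(likelihoods[m] * priors[m] for m in priors)

# Bayes' rule: P(A|B) = P(B|A) P(A) / P(B)
posteriors = {m: likelihoods[m] * priors[m] / p_evidence for m in priors}

# MAP model selection: pick the model with the largest posterior.
map_model = max(posteriors, key=posteriors.get)
print(posteriors)   # {'model_1': 0.341..., 'model_2': 0.658...}
print(map_model)    # model_2
```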

The following example illustrates the concepts of Bayesian model selection.

Loaded dice

Problem:

Given a tub containing 100 six-sided dice, in which one die is known to be loaded towards the six to a specified extent, derive an expression for the probability that, after a given set of throws, an arbitrarily chosen die is the loaded one. Assume the other 99 dice are all fair (not loaded in any way). The loaded die is known to have the following pmf:
$$p_L(1) = 0.05, \qquad p_L(2) = \dots = p_L(5) = 0.15, \qquad p_L(6) = 0.35$$
Hence derive a good strategy for finding the loaded die from the tub.

Solution:

The pmfs of the fair dice may be assumed to be:
$$p_F(i) = \frac{1}{6} \quad \text{for all } i \in \{1, \dots, 6\}$$
Let each die have one of two states: $S_L$ if it is loaded and $S_F$ if it is fair. These are our two possible models for the random process, and they have underlying pmfs given by $\{p_L(1), \dots, p_L(6)\}$ and $\{p_F(1), \dots, p_F(6)\}$ respectively.
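For use in the sketches below, the two model pmfs can be written down directly in code (the dictionary names p_L and p_F simply mirror the notation above):

```python
# The two model pmfs from the problem statement.
p_L = {1: 0.05, 2: 0.15, 3: 0.15, 4: 0.15, 5: 0.15, 6: 0.35}  # loaded die
p_F = {i: 1 / 6 for i in range(1, 7)}                          # fair die

# Sanity check: both pmfs must sum to unity.
assert abs(sum(p_L.values()) - 1.0) < 1e-12
assert abs(sum(p_F.values()) - 1.0) < 1e-12
```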

After $N$ throws of the chosen die, let the sequence of throws be $\Theta_N = \{\theta_1, \dots, \theta_N\}$, where each $\theta_i \in \{1, \dots, 6\}$. This is our evidence.

We shall now calculate the probability that this die is the loaded one. We therefore wish to find the posterior $P(S_L|\Theta_N)$.

We cannot evaluate this directly, but we can evaluate the likelihoods, $P(\Theta_N|S_L)$ and $P(\Theta_N|S_F)$, since we know the expected pmfs in each case. We also know the prior probabilities $P(S_L)$ and $P(S_F)$ before we have carried out any throws, and these are 0.01 and 0.99, since only one die in the tub of 100 is loaded. Hence we can use Bayes' rule:

$$P(S_L|\Theta_N) = \frac{P(\Theta_N|S_L)\,P(S_L)}{P(\Theta_N)}$$
The denominator term $P(\Theta_N)$ is there to ensure that $P(S_L|\Theta_N)$ and $P(S_F|\Theta_N)$ sum to unity (as they must). It can most easily be calculated from:
$$P(\Theta_N) = P(\Theta_N, S_L) + P(\Theta_N, S_F) = P(\Theta_N|S_L)\,P(S_L) + P(\Theta_N|S_F)\,P(S_F)$$
so that
$$P(S_L|\Theta_N) = \frac{P(\Theta_N|S_L)\,P(S_L)}{P(\Theta_N|S_L)\,P(S_L) + P(\Theta_N|S_F)\,P(S_F)} = \frac{1}{1 + R_N}$$
where
$$R_N = \frac{P(\Theta_N|S_F)\,P(S_F)}{P(\Theta_N|S_L)\,P(S_L)}$$
To calculate the likelihoods, $P(\Theta_N|S_L)$ and $P(\Theta_N|S_F)$, we simply take the product of the probabilities of each throw occurring in the sequence of throws $\Theta_N$, given each of the two models respectively (since each new throw is independent of all previous throws, given the model). So, after $N$ throws, these likelihoods will be given by:
$$P(\Theta_N|S_L) = \prod_{i=1}^{N} p_L(\theta_i)$$
and
$$P(\Theta_N|S_F) = \prod_{i=1}^{N} p_F(\theta_i)$$
We can now substitute these probabilities into the above expression for $R_N$ and include $P(S_L) = 0.01$ and $P(S_F) = 0.99$ to get the desired a posteriori probability $P(S_L|\Theta_N)$ after $N$ throws.
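A direct, non-iterative computation of this posterior might look like the following sketch (the function name posterior_loaded and the example throw sequence are illustrative assumptions, not from the original module):

```python
# Direct computation of P(S_L | Theta_N) via R_N.
p_L = {1: 0.05, 2: 0.15, 3: 0.15, 4: 0.15, 5: 0.15, 6: 0.35}
p_F = {i: 1 / 6 for i in range(1, 7)}

def posterior_loaded(throws, prior_L=0.01, prior_F=0.99):
    lik_L = lik_F = 1.0
    for theta in throws:               # likelihoods are products over throws
        lik_L *= p_L[theta]
        lik_F *= p_F[theta]
    r_n = (lik_F * prior_F) / (lik_L * prior_L)   # R_N
    return 1.0 / (1.0 + r_n)           # P(S_L | Theta_N) = 1 / (1 + R_N)

# Example: a short, suspiciously six-heavy sequence of throws.
print(posterior_loaded([6, 6, 3, 6, 6, 1, 6]))
```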

We may calculate this iteratively by noting that

$$P(\Theta_N|S_L) = P(\Theta_{N-1}|S_L)\,p_L(\theta_N)$$
and
$$P(\Theta_N|S_F) = P(\Theta_{N-1}|S_F)\,p_F(\theta_N)$$
so that
$$R_N = R_{N-1}\,\frac{p_F(\theta_N)}{p_L(\theta_N)}$$
where $R_0 = P(S_F)/P(S_L) = 99$. If we calculate this after every throw of the current die being tested (i.e. as $N$ increases), then we can either move on to test the next die from the tub if $P(S_L|\Theta_N)$ becomes sufficiently small (say $< 10^{-4}$), or accept the current die as the loaded one when $P(S_L|\Theta_N)$ becomes large enough (say $> 0.995$). (These thresholds correspond approximately to $R_N > 10^4$ and $R_N < 5 \times 10^{-3}$ respectively.)
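The complete search strategy can be sketched as a simulation (a hedged illustration: the helper names test_die and make_sampler are invented here, and the choice of die 20 as the loaded one matches the figure described below):

```python
import random

p_L = {1: 0.05, 2: 0.15, 3: 0.15, 4: 0.15, 5: 0.15, 6: 0.35}
p_F = {i: 1 / 6 for i in range(1, 7)}

def make_sampler(loaded):
    """Return a function that throws one (fair or loaded) die."""
    faces, weights = zip(*(p_L if loaded else p_F).items())
    return lambda: random.choices(faces, weights)[0]

def test_die(throw, p1=1e-4, p2=0.995):
    """Throw one die until P(S_L | Theta_N) crosses a threshold.
    Returns True to accept the die as loaded, False to discard it."""
    r = 0.99 / 0.01                    # R_0 = P(S_F) / P(S_L) = 99
    while True:
        theta = throw()
        r *= p_F[theta] / p_L[theta]   # R_N = R_{N-1} p_F(theta_N) / p_L(theta_N)
        posterior = 1.0 / (1.0 + r)
        if posterior < p1:
            return False               # almost certainly fair: move on
        if posterior > p2:
            return True                # almost certainly loaded: accept

for die in range(1, 101):              # die 20 is the loaded one in this simulation
    if test_die(make_sampler(loaded=(die == 20))):
        print("Accepted die", die, "as the loaded one")
        break
```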

The choice of these thresholds for $P(S_L|\Theta_N)$ is a function of the desired tradeoff between speed of searching versus the probability of failure to find the loaded die, either by moving on to the next die even when the current one is loaded, or by selecting a fair die as the loaded one.

The lower threshold, $p_1 = 10^{-4}$, is the more critical, because it affects how long we spend before discarding each fair die. The probability of correctly detecting all the fair dice before the loaded die is reached is $(1 - p_1)^n \approx 1 - n p_1$, where $n = 50$ is the expected number of fair dice tested before the loaded one is found. So the failure probability due to incorrectly assuming the loaded die to be fair is approximately $n p_1 = 0.005$.

The upper threshold, $p_2 = 0.995$, is much less critical on search speed, since the loaded result only occurs once, so it is a good idea to set it very close to unity. The failure probability caused by selecting a fair die to be the loaded one is just $1 - p_2 = 0.005$. Hence the overall failure probability is approximately $0.005 + 0.005 = 0.01$.

In problems with significant amounts of evidence (e.g. large $N$), the evidence probability and the likelihoods can both get very small, sufficient to cause floating-point underflow on many computers if expressions such as the likelihood products above are computed directly. However, the ratio of likelihood to evidence probability still remains a reasonable size and is an important quantity which must be calculated correctly.

One solution to this problem is to compute only the ratio of likelihoods, as in the iterative expression for $R_N$ above. A more generally useful solution is to compute log(likelihoods) instead. The product operations in the expressions for the likelihoods then become sums of logarithms. Even the calculation of likelihood ratios such as $R_N$ and comparison with appropriate thresholds can be done in the log domain. After this, it is OK to return to the linear domain if necessary, since $R_N$ should be a reasonable value as it is the ratio of very small quantities.
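A log-domain version of the $R_N$ update might look like this sketch (assuming the same pmfs as above; the 500-throw all-sixes sequence is an artificial stress test chosen so that the direct likelihood products would underflow):

```python
import math

p_L = {1: 0.05, 2: 0.15, 3: 0.15, 4: 0.15, 5: 0.15, 6: 0.35}
p_F = {i: 1 / 6 for i in range(1, 7)}

def log_r(throws):
    """log R_N = log R_0 + sum_i [log p_F(theta_i) - log p_L(theta_i)]."""
    acc = math.log(99.0)               # log R_0 = log(P(S_F) / P(S_L))
    for theta in throws:
        acc += math.log(p_F[theta]) - math.log(p_L[theta])
    return acc

throws = [6] * 500                     # (1/6)**500 underflows; its log does not
lr = log_r(throws)
# Return to the linear domain only at the end, guarding exp() against overflow.
posterior = 0.0 if lr > 700 else 1.0 / (1.0 + math.exp(lr))
print(posterior)                       # ~1.0: overwhelmingly likely to be loaded
```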

Figure: Probabilities of the current die being the loaded one as the throws progress (the 20th die is the loaded one). A new die is selected whenever the probability falls below $p_1$.
Figure: Histograms of the dice throws as the throws progress. Histograms are reset when each new die is selected.

Source: OpenStax, Random processes. OpenStax CNX. Jan 22, 2004. Download for free at http://cnx.org/content/col10204/1.3