Marginal and conditional probabilities can be found from joint probabilities (and vice versa)

We have used three types of probability to describe a model for two categorical variables: the joint probabilities, the marginal probabilities for the two variables, and the conditional probabilities for each variable given the value of the other variable. These sets of probabilities are closely related. Indeed, the model can be equivalently described by any of the following:

- the joint probabilities, p_{xy}
- the marginal probabilities for X, p_x, together with the conditional probabilities for Y given X, p_{y|x}
- the marginal probabilities for Y, p_y, together with the conditional probabilities for X given Y, p_{x|y}

Each can be found from the others:

    p_{xy} = p_x p_{y|x} = p_y p_{x|y}
    p_x = \sum_y p_{xy},        p_y = \sum_x p_{xy}
    p_{x|y} = p_{xy} / p_y,     p_{y|x} = p_{xy} / p_x
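As a concrete illustration, here is a minimal Python sketch of these conversions. The variable names and the 2×2 table of joint probabilities are made up for the example, not taken from the text.

```python
# Illustrative joint probabilities for two categorical variables X and Y.
# These values are assumptions for the example; any table summing to 1 works.
joint = {
    ("x1", "y1"): 0.20, ("x1", "y2"): 0.30,
    ("x2", "y1"): 0.10, ("x2", "y2"): 0.40,
}

x_values = ["x1", "x2"]
y_values = ["y1", "y2"]

# Marginal probabilities: sum the joint probabilities over the other variable.
p_x = {x: sum(joint[(x, y)] for y in y_values) for x in x_values}
p_y = {y: sum(joint[(x, y)] for x in x_values) for y in y_values}

# Conditional probabilities: joint probability divided by the marginal
# probability of the conditioning variable.
p_y_given_x = {(y, x): joint[(x, y)] / p_x[x] for x in x_values for y in y_values}
p_x_given_y = {(x, y): joint[(x, y)] / p_y[y] for x in x_values for y in y_values}

# Going the other way: recover the joint probabilities from a marginal
# and the matching conditionals (p_xy = p_x * p_{y|x}).
joint_again = {(x, y): p_x[x] * p_y_given_x[(y, x)]
               for x in x_values for y in y_values}
assert all(abs(joint[k] - joint_again[k]) < 1e-12 for k in joint)

print(p_x)          # {'x1': 0.5, 'x2': 0.5}
print(p_x_given_y)  # conditional probabilities of X for each value of Y
```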

Bayes' theorem

In particular, note that it is possible to obtain the conditional probabilities for X given Y, p_{x|y}, from the marginal probabilities of X, p_x, and the conditional probabilities for Y given X, p_{y|x}. This can be expressed in a single formula, called Bayes' theorem, but in practice it is easier to do the calculation in two steps, obtaining the joint probabilities, p_{xy}, in the first step. Bayes' theorem has several important applications.
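Written out in the notation used above, the single formula combines the two steps: the numerator is the joint probability from the first step, and the denominator sums those joint probabilities over all values of x to give the marginal probability of y.

```latex
% Bayes' theorem: conditional probabilities of X given Y, built from
% the marginals of X and the conditionals of Y given X.
\[
  p_{x \mid y} \;=\; \frac{p_{xy}}{p_y}
               \;=\; \frac{p_x \, p_{y \mid x}}{\sum_{x'} p_{x'} \, p_{y \mid x'}}
\]
% Two-step calculation described in the text:
%   Step 1:  p_{xy}   = p_x \, p_{y|x}              (joint probabilities)
%   Step 2:  p_{x|y}  = p_{xy} / \sum_{x'} p_{x'y}  (condition on y)
```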

Accuracy of medical diagnostic tests

There are two types of error in a test for a medical condition:

- a false negative, in which the test result is negative for someone who actually has the disease
- a false positive, in which the test result is positive for someone who does not have the disease

Consider a diagnostic test with

    p_{negative | disease} = 0.05          p_{positive | no disease} = 0.10

Since the conditional probabilities of the two test results must add to one, we can also write

    p_{positive | disease} = 0.95          p_{negative | no disease} = 0.90

We will also assume that 10% of people who are given the test have the disease,

    p_{disease} = 0.10

From this information, we can find the probabilities of having the disease, given the result of the diagnostic test, using the two-step calculation described above.
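The arithmetic, following the two-step method, is shown below. Step 1 multiplies the marginal probability of disease status by the conditional probabilities of the test results to get the joint probabilities; step 2 divides each joint probability by the marginal probability of the observed test result.

```latex
% Step 1: joint probabilities, p_{xy} = p_x p_{y|x}.
\begin{align*}
  p_{\text{disease, positive}}    &= 0.10 \times 0.95 = 0.095 \\
  p_{\text{disease, negative}}    &= 0.10 \times 0.05 = 0.005 \\
  p_{\text{no disease, positive}} &= 0.90 \times 0.10 = 0.090 \\
  p_{\text{no disease, negative}} &= 0.90 \times 0.90 = 0.810 \\[4pt]
% Marginal probabilities of the two test results:
  p_{\text{positive}} &= 0.095 + 0.090 = 0.185 \\
  p_{\text{negative}} &= 0.005 + 0.810 = 0.815
\end{align*}
% Step 2: condition on the observed test result.
\begin{align*}
  p_{\text{disease} \mid \text{positive}} &= \frac{0.095}{0.185} \approx 0.514 \\
  p_{\text{disease} \mid \text{negative}} &= \frac{0.005}{0.815} \approx 0.006
\end{align*}
```

Note how modest the probability of disease is even after a positive test: because only 10% of those tested have the disease, just over half of the people who test positive actually have it.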