Earlier in this e-book, we defined two random variables, \(X\) and \(Y\), to be independent when every event about \(X\) is independent of every event about \(Y\). For discrete random variables, this is equivalent to the joint probability function factorising into the marginal probability function of \(X\) times the marginal probability function of \(Y\).
Independence
Two discrete random variables, \(X\) and \(Y\), are independent if and only if
\[ p(x, y) \;\;=\;\; p_X(x) \times p_Y(y) \qquad \text{ for all } x \text{ and } y \]

If \(X\) and \(Y\) are independent, then the conditional distribution of \(X\), given that \(Y = y\), does not depend on \(y\). Indeed, the conditional distributions of \(X\) are all equal to its marginal distribution:
\[ p_{X\mid Y=y}(x) \;\;=\;\; \frac {p(x,y)}{p_Y(y)} \;\;=\;\; \frac {p_X(x)p_Y(y)}{p_Y(y)} \;\;=\;\; p_X(x) \]

for all \(x\) and \(y\). Knowing the value of \(Y\) therefore provides no information about the distribution of \(X\).
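As a concrete illustration, here is a minimal sketch in Python with NumPy (the joint table is an invented example, not one from the text). It builds a joint probability table that factorises and verifies that every conditional distribution of \(X\) given \(Y = y\) equals the marginal distribution of \(X\):

```python
import numpy as np

# Hypothetical marginal distributions, for illustration only.
p_X = np.array([0.2, 0.5, 0.3])   # marginal probability function of X
p_Y = np.array([0.4, 0.6])        # marginal probability function of Y

# Under independence, the joint table is the outer product:
# p(x, y) = p_X(x) * p_Y(y); rows index x, columns index y.
joint = np.outer(p_X, p_Y)

# The marginals are recovered by summing the joint table.
assert np.allclose(joint.sum(axis=1), p_X)
assert np.allclose(joint.sum(axis=0), p_Y)

# Each conditional distribution of X given Y = y,
# p_{X|Y=y}(x) = p(x, y) / p_Y(y), equals the marginal of X.
for j, py in enumerate(p_Y):
    cond = joint[:, j] / py
    assert np.allclose(cond, p_X)  # holds for every value y
```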
In a similar way, if \(X\) and \(Y\) are independent, then the conditional distributions of \(Y\) given that \(X = x\) are all equal to the marginal distribution of \(Y\):
\[ p_{Y\mid X=x}(y) \;\;=\;\; p_Y(y) \]
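The symmetric check for the conditionals of \(Y\), continuing the sketch above:

```python
# Each conditional distribution of Y given X = x,
# p_{Y|X=x}(y) = p(x, y) / p_X(x), equals the marginal of Y.
for i, px in enumerate(p_X):
    cond = joint[i, :] / px
    assert np.allclose(cond, p_Y)  # holds for every value x
```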