Random variables defined from others

Any function of two random variables, \(X\) and \(Y\), can be used to define another random variable.

\[Z = g(X, Y)\]

The shape of its distribution depends on those of \(X\) and \(Y\), but we will only consider its mean and variance here.
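For example, if \(X\) and \(Y\) are the numbers showing when two dice are rolled, the total \(Z = X + Y\) is itself a random variable.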

Independent variables

We now give two results that hold provided \(X\) and \(Y\) are independent. The first result is stated without proof here, but will be used later.

Product of independent random variables

If two discrete random variables, \(X\) and \(Y\), are independent,

\[ E[XY] = E[X] \times E[Y] \]

(Proved in full version)
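As a simple illustration (an example chosen here, not part of the result above), suppose \(X\) takes the values 0 and 1 with probability \(\tfrac12\) each and, independently, \(Y\) takes the values 1, 2 and 3 with probability \(\tfrac13\) each, so \(E[X] = \tfrac12\) and \(E[Y] = 2\). Then

\[ E[XY] = \sum_x \sum_y xy \, P(X = x) \, P(Y = y) = \tfrac12 \times \tfrac13 \times (1 + 2 + 3) = 1 = E[X] \times E[Y] \]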

More important in practice is a linear combination of \(X\) and \(Y\),

\(Z = aX + bY\)   where \(a\) and \(b\) are constants

Linear combination of independent variables

If the means of two independent discrete random variables, \(X\) and \(Y\), are \(\mu_X\) and \(\mu_Y\) and their variances are \(\sigma_X^2\) and \(\sigma_Y^2\), then the linear combination \((aX + bY)\) has mean and variance

\[ \begin{aligned} E[aX + bY] & = a\mu_X + b\mu_Y \\[0.4em] \Var(aX + bY) & = a^2\sigma_X^2 + b^2\sigma_Y^2 \end{aligned} \]

(Proved in full version)
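To illustrate these formulae, take the same \(X\) and \(Y\) as in the earlier example, so \(\mu_X = \tfrac12\), \(\sigma_X^2 = \tfrac14\), \(\mu_Y = 2\) and \(\sigma_Y^2 = \tfrac23\), and let \(Z = 2X + 3Y\). Then

\[ \begin{aligned} E[2X + 3Y] & = 2 \times \tfrac12 + 3 \times 2 = 7 \\[0.4em] \Var(2X + 3Y) & = 2^2 \times \tfrac14 + 3^2 \times \tfrac23 = 7 \end{aligned} \]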

Although the formula for the mean still holds if \(X\) and \(Y\) are not independent, the formula for the variance requires modification to cope with dependent random variables.
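Specifically, the standard result for dependent variables (stated here without proof, like the product result above) adds a term involving the covariance of \(X\) and \(Y\):

\[ \Var(aX + bY) = a^2\sigma_X^2 + b^2\sigma_Y^2 + 2ab \, \Cov(X, Y) \]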