Means and variances of \(X\) and \(Y\)

We now use the general definition of expected value to find the expected value of \(X\), its mean, starting from the joint probability function.

\[ \begin{aligned} E[X] \;=\; \mu_X \;=\; \sum_{\text{all }x} \sum_{\text{all }y} x \times p(x,y) \;&=\; \sum_{\text{all }x} x \times \sum_{\text{all }y} p(x,y) \\ &=\; \sum_{\text{all }x} x \times p_X(x) \end{aligned} \]

where \(p_X(x)\) is the marginal probability function of \(X\). The mean of \(X\) is therefore the same as would be obtained directly from its marginal distribution. (Integration replaces summation for continuous random variables.)
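As a quick numerical check, the sketch below uses a small made-up joint probability table (illustrative only, not an example from the text) and computes \(E[X]\) both from the joint probabilities and from the marginal probabilities of \(X\); the two calculations agree.

```python
# Hypothetical joint pmf p(x, y) for X in {0, 1, 2} and Y in {0, 1}.
# Any non-negative table whose entries sum to 1 would do.
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.25, (1, 1): 0.15,
    (2, 0): 0.05, (2, 1): 0.25,
}

# E[X] directly from the joint distribution: sum of x * p(x, y) over all (x, y).
mean_from_joint = sum(x * p for (x, _y), p in joint.items())

# Marginal pmf of X: p_X(x) = sum over all y of p(x, y).
p_X = {}
for (x, _y), p in joint.items():
    p_X[x] = p_X.get(x, 0.0) + p

# E[X] from the marginal distribution of X.
mean_from_marginal = sum(x * p for x, p in p_X.items())

print(mean_from_joint, mean_from_marginal)  # both print 1.0 (up to rounding)
```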

In a similar way,

\[ \Var(X) \;=\; E\left[(X-\mu_X)^2\right] \;=\; \sum_{\text{all }x} {(x-\mu_X)^2 \times p_X(x)} \]

is the same as would be obtained from the marginal distribution. The same results hold for the mean and variance of \(Y\).
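The same kind of check works for the variance. The sketch below reuses the illustrative joint table from above, forms the marginal \(p_X(x)\) and its mean \(\mu_X\), and computes \(\Var(X)\) both from the joint probabilities and from the marginal; again the two agree.

```python
# Same hypothetical joint pmf as in the sketch above.
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.25, (1, 1): 0.15,
    (2, 0): 0.05, (2, 1): 0.25,
}

# Marginal pmf of X and its mean, exactly as before.
p_X = {}
for (x, _y), p in joint.items():
    p_X[x] = p_X.get(x, 0.0) + p
mu_X = sum(x * p for x, p in p_X.items())

# Var(X) from the joint distribution: sum of (x - mu_X)^2 * p(x, y) over all (x, y).
var_from_joint = sum((x - mu_X) ** 2 * p for (x, _y), p in joint.items())

# Var(X) from the marginal distribution of X.
var_from_marginal = sum((x - mu_X) ** 2 * p for x, p in p_X.items())

print(var_from_joint, var_from_marginal)  # both print 0.6 (up to rounding)
```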