If \(X\) is a random variable with mean \(\mu_X\) and variance \(\sigma_X^2\), and \(Y\) is an independent random variable with mean \(\mu_Y\) and variance \(\sigma_Y^2\), then the mean and variance of their sum are

\[ E[X+Y] = \mu_X + \mu_Y \spaced{and} \Var(X+Y) = \sigma_X^2 + \sigma_Y^2 \]

If \(X\) and \(Y\) are not independent, the formula for the mean of \(X + Y\) still holds, but the variance of the sum is different.
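A quick Monte Carlo check of the independent case is easy to run; this is a minimal sketch using NumPy, and the particular distributions are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Independent draws: X ~ Normal(mu=2, sigma=3), Y ~ Exponential(mean 4).
# For the exponential, the mean is 4 and the variance is 4^2 = 16.
x = rng.normal(loc=2.0, scale=3.0, size=n)
y = rng.exponential(scale=4.0, size=n)
s = x + y

print(np.mean(s))  # ~ mu_X + mu_Y = 2 + 4 = 6
print(np.var(s))   # ~ sigma_X^2 + sigma_Y^2 = 9 + 16 = 25
```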

Variance of the sum of two variables

For any random variables \(X\) and \(Y\),

\[ \Var(X+Y) \;=\; \Var(X) + \Var(Y) + 2\Covar(X,Y) \]

To see why, write \(\Var(X+Y) = E\big[\big((X - \mu_X) + (Y - \mu_Y)\big)^2\big]\), since \(E[X+Y] = \mu_X + \mu_Y\). Expanding the square and taking expectations term by term gives \(\Var(X) + \Var(Y) + 2E[(X - \mu_X)(Y - \mu_Y)]\), and the last expectation is \(\Covar(X,Y)\) by definition. In particular, when \(X\) and \(Y\) are independent, \(\Covar(X,Y) = 0\) and the formula reduces to the one above.
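The role of the covariance term is easy to see numerically. The sketch below (again assuming NumPy, with an arbitrarily constructed dependent pair) compares the two sides of the formula:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# A dependent pair: X and Y share the common component Z,
# so Cov(X, Y) > 0 and variances alone undercount Var(X + Y).
z = rng.normal(size=n)
x = z + rng.normal(size=n)        # Var(X) = 1 + 1 = 2
y = 2.0 * z + rng.normal(size=n)  # Var(Y) = 4 + 1 = 5, Cov(X, Y) = 2

lhs = np.var(x + y)
rhs = np.var(x) + np.var(y) + 2.0 * np.cov(x, y)[0, 1]
print(lhs, rhs)  # both ~ 2 + 5 + 2*2 = 11; Var(X) + Var(Y) alone gives 7
```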

This result extends to the sum of any number of random variables; we state the general formula without proof.

General formula for variance of a sum

For any \(n\) random variables \(X_1, \dots, X_n\),

\[ \Var\!\left(\sum_{i=1}^n X_i\right) \;=\; \sum_{i=1}^n \Var(X_i) \;+\; \sum_{i \ne j} \Covar(X_i, X_j) \]

Note that each covariance in the right-hand summation appears twice, since \(\Covar(X_i, X_j) = \Covar(X_j, X_i)\).
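A useful way to check the general formula: the right-hand side is exactly the sum of every entry of the covariance matrix of \((X_1, \dots, X_n)\), since the diagonal contributes the variances and each off-diagonal pair appears twice. A minimal NumPy sketch, with arbitrarily correlated columns:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

# Three correlated variables, built by mixing independent normals.
mix = np.array([[1.0, 0.5, 0.0],
                [0.0, 1.0, 0.5],
                [0.5, 0.0, 1.0]])
data = rng.normal(size=(n, 3)) @ mix  # columns are X_1, X_2, X_3

lhs = np.var(data.sum(axis=1))
# Summing every entry of the covariance matrix picks up each Var(X_i)
# once and each Covar(X_i, X_j) with i != j once, i.e. each pair twice.
rhs = np.cov(data, rowvar=False).sum()
print(lhs, rhs)  # the two agree up to Monte Carlo error
```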