Two independent random variables

From the result on the previous page about linear functions of two independent random variables,

\[\begin{align} E[X_1 + X_2] \;\;& =\;\; E[X_1] + E[X_2] \\[0.5em] \Var(X_1 + X_2) \;\;& =\;\; \Var(X_1) + \Var(X_2) \end{align} \]
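For example (a numerical illustration of our own, not from the original page), if \(X_1\) has mean \(0.5\) and variance \(0.25\), and \(X_2\) is independent of it with mean \(3.5\) and variance \(\frac{35}{12}\), then

\[\begin{align} E[X_1 + X_2] \;\;& =\;\; 0.5 + 3.5 \;\;=\;\; 4 \\[0.5em] \Var(X_1 + X_2) \;\;& =\;\; 0.25 + \frac{35}{12} \;\;=\;\; \frac{19}{6} \end{align} \]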

If \(X_1\) and \(X_2\) also have the same distribution, with mean \(\mu\) and variance \(\sigma^2\), then:

\[\begin{align} E[X_1 + X_2] \;\;& =\;\; 2\mu \\ \Var(X_1 + X_2) \;\;& =\;\; 2\sigma^2 \end{align} \]
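These formulae are easy to check by simulation. The sketch below is our own illustration (in Python, assuming numpy is available); it takes \(X_1\) and \(X_2\) to be independent rolls of a fair six-sided die, for which \(\mu = 3.5\) and \(\sigma^2 = \frac{35}{12}\), and estimates the mean and variance of \(X_1 + X_2\):

```python
import numpy as np

rng = np.random.default_rng(1)
n_sim = 1_000_000

# Two independent rolls of a fair six-sided die:
# each has mean mu = 3.5 and variance sigma^2 = 35/12 (about 2.917).
x1 = rng.integers(1, 7, size=n_sim)
x2 = rng.integers(1, 7, size=n_sim)
total = x1 + x2

print(total.mean())  # close to 2*mu = 7
print(total.var())   # close to 2*sigma^2 = 35/6 (about 5.833)
```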

Random sample

This extends to the sum of \(n\) independent, identically distributed random variables, that is, the sum of the values in a random sample.

Sum of values in a random sample

If \(\{X_1, X_2, ..., X_n\}\) is a random sample of \(n\) values from a discrete distribution with mean \(\mu\) and variance \(\sigma^2\), then the sum of the values, \(\sum_{i=1}^n {X_i}\), has mean and variance

\[\begin{align} E\left[\sum_{i=1}^n {X_i}\right] & = n\mu \\ \Var\left(\sum_{i=1}^n {X_i}\right) & = n\sigma^2 \end{align} \]

(Proved in full version)
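The full proof is not reproduced here, but a brief sketch: because the \(X_i\) are independent, the two-variable result above can be applied repeatedly (formally, by induction on \(n\)), so the means and the variances both simply add:

\[\begin{align} E\left[\sum_{i=1}^n {X_i}\right] & = E[X_1] + E[X_2] + \cdots + E[X_n] \;\;=\;\; n\mu \\ \Var\left(\sum_{i=1}^n {X_i}\right) & = \Var(X_1) + \Var(X_2) + \cdots + \Var(X_n) \;\;=\;\; n\sigma^2 \end{align} \]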

Since the sample mean is simply the sum of the values divided by the constant \(n\), this result also provides us with formulae for the mean and variance of the sample mean.

Mean of a random sample

If \(\{X_1, X_2, ..., X_n\}\) is a random sample of \(n\) values from a discrete distribution with mean \(\mu\) and variance \(\sigma^2\), then the sample mean, \(\overline{X} = \frac 1 n \sum_{i=1}^n {X_i}\), has a distribution with mean and variance

\[\begin{align} E[\overline{X}] \;\;& =\;\; \mu \\ \Var(\overline{X}) \;\;& =\;\; \frac {\sigma^2} n \end{align} \]

(Proved in full version)
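Again the full proof is omitted, but a brief sketch follows directly from the sum result: writing \(\overline{X} = \frac 1 n \sum_{i=1}^n {X_i}\) and using the linear-function rules \(E[aX] = aE[X]\) and \(\Var(aX) = a^2\Var(X)\),

\[\begin{align} E[\overline{X}] \;\;& =\;\; \frac 1 n E\left[\sum_{i=1}^n {X_i}\right] \;\;=\;\; \frac {n\mu} n \;\;=\;\; \mu \\ \Var(\overline{X}) \;\;& =\;\; \frac 1 {n^2} \Var\left(\sum_{i=1}^n {X_i}\right) \;\;=\;\; \frac {n\sigma^2} {n^2} \;\;=\;\; \frac {\sigma^2} n \end{align} \]

Note that the variance decreases as \(n\) increases, which is why the mean of a larger sample gives a more precise estimate of \(\mu\).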