The results that we showed earlier for sums and means of discrete random variables also hold for variables with continuous distributions. We simply restate them here.
Linear combination of independent variables
If the means of two independent random variables, \(X\) and \(Y\), are \(\mu_X\) and \(\mu_Y\) and their variances are \(\sigma_X^2\) and \(\sigma_Y^2\), then the linear combination \((aX + bY)\) has mean and variance
\[\begin{aligned} E[aX + bY] & \;=\; a\mu_X + b\mu_Y \\ \Var(aX + bY) & \;=\; a^2\sigma_X^2 + b^2\sigma_Y^2 \end{aligned} \]
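As a quick numerical check of these two formulas, the simulation sketch below compares the simulated mean and variance of \(aX + bY\) with \(a\mu_X + b\mu_Y\) and \(a^2\sigma_X^2 + b^2\sigma_Y^2\). The coefficients and the two distributions are arbitrary choices for illustration, not part of the result itself.

```python
import numpy as np

# Illustrative check of E[aX + bY] and Var(aX + bY) for independent X and Y.
# The coefficients a = 2, b = -1 and the two distributions below are arbitrary choices.
rng = np.random.default_rng(1)
n_sim = 1_000_000

a, b = 2.0, -1.0
X = rng.normal(loc=5.0, scale=2.0, size=n_sim)   # mu_X = 5,  sigma_X^2 = 4
Y = rng.exponential(scale=3.0, size=n_sim)       # mu_Y = 3,  sigma_Y^2 = 9
W = a * X + b * Y

print("simulated mean:", W.mean(), " formula:", a * 5.0 + b * 3.0)
print("simulated var: ", W.var(),  " formula:", a**2 * 4.0 + b**2 * 9.0)
```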
Sum of a random sample

If \(\{X_1, X_2, ..., X_n\}\) is a random sample of \(n\) values from any distribution with mean \(\mu\) and variance \(\sigma^2\), then the sum of the values has mean and variance
\[\begin{aligned} E\left[\sum_{i=1}^n {X_i}\right] & \;=\; n\mu \\ \Var\left(\sum_{i=1}^n {X_i}\right) & \;=\; n\sigma^2 \end{aligned} \]
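The same kind of simulation check can be applied to the sum of a random sample. In the sketch below, the sample size \(n = 10\) and the uniform(0, 1) distribution (for which \(\mu = 0.5\) and \(\sigma^2 = 1/12\)) are arbitrary choices used only to illustrate the formulas.

```python
import numpy as np

# Illustrative check of E[sum X_i] = n*mu and Var(sum X_i) = n*sigma^2.
# The uniform(0, 1) distribution and n = 10 are arbitrary choices.
rng = np.random.default_rng(2)
n, n_sim = 10, 200_000

samples = rng.uniform(0.0, 1.0, size=(n_sim, n))   # n_sim random samples of size n
sums = samples.sum(axis=1)                          # one sum per simulated sample

mu, sigma2 = 0.5, 1.0 / 12.0
print("simulated mean of sum:", sums.mean(), " formula:", n * mu)
print("simulated var of sum: ", sums.var(),  " formula:", n * sigma2)
```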
Sample mean

If \(\{X_1, X_2, ..., X_n\}\) is a random sample of \(n\) values from any distribution with mean \(\mu\) and variance \(\sigma^2\), then the sample mean has a distribution with mean and variance
\[\begin{aligned} E\big[\overline{X}\big] & \;=\; \mu \\ \Var\big(\overline{X}\big) & \;=\; \frac {\sigma^2} n \end{aligned} \]
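A similar sketch illustrates that the sample mean is centred on \(\mu\) but has smaller variance, \(\sigma^2 / n\). The Poisson(4) distribution (for which \(\mu = \sigma^2 = 4\)) and the sample size \(n = 25\) are again arbitrary choices.

```python
import numpy as np

# Illustrative check of E[Xbar] = mu and Var(Xbar) = sigma^2 / n.
# The Poisson(4) distribution and n = 25 are arbitrary choices.
rng = np.random.default_rng(3)
n, n_sim = 25, 200_000

samples = rng.poisson(lam=4.0, size=(n_sim, n))    # n_sim random samples of size n
means = samples.mean(axis=1)                        # one sample mean per simulated sample

mu, sigma2 = 4.0, 4.0
print("simulated mean of Xbar:", means.mean(), " formula:", mu)
print("simulated var of Xbar: ", means.var(),  " formula:", sigma2 / n)
```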
Central Limit Theorem (informal)

If \(\{X_1, X_2, ..., X_n\}\) is a random sample of \(n\) values from any distribution with mean \(\mu\) and variance \(\sigma^2\),
\[\begin{aligned} \sum_{i=1}^n {X_i} & \;\; \xrightarrow[n \rightarrow \infty]{} \;\; \NormalDistn(n\mu, \;\;\sigma_{\Sigma X}^2=n\sigma^2) \\ \overline{X} & \;\; \xrightarrow[n \rightarrow \infty]{} \; \; \NormalDistn(\mu, \;\;\sigma_{\overline X}^2 = \frac {\sigma^2} n) \end{aligned} \]
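The sketch below gives one informal illustration of the Central Limit Theorem. Sample means are drawn from a strongly skewed exponential distribution with \(\mu = \sigma^2 = 1\) (an arbitrary choice), standardised using \(\mu\) and \(\sigma^2 / n\), and the proportion falling within \(\pm 1.96\) is compared with the value 0.95 that the normal approximation predicts; the agreement improves as \(n\) increases.

```python
import numpy as np

# Informal illustration of the Central Limit Theorem: standardised sample means
# from a skewed exponential distribution (mu = 1, sigma^2 = 1 -- an arbitrary choice)
# behave more and more like a standard normal variable as n increases.
rng = np.random.default_rng(4)
mu, sigma2 = 1.0, 1.0

for n in (2, 10, 50, 200):
    means = rng.exponential(scale=1.0, size=(100_000, n)).mean(axis=1)
    z = (means - mu) / np.sqrt(sigma2 / n)     # standardised sample means
    coverage = np.mean(np.abs(z) < 1.96)       # close to 0.95 if the normal approximation holds
    print(f"n = {n:3d}: proportion with |Z| < 1.96 = {coverage:.3f}")
```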