Since a uniform distribution's probability function is given by a formula (rather than a list of values), its mean and variance can be found by summing series.

Uniform distribution's mean and variance

If \(X \sim \UniformDistn(a, b) \), its mean and variance are

\[ \begin {align} E[X] & = \frac {a+b} 2 \\ \Var(X) & = \frac { (b - a + 1)^2 - 1} {12} \end {align} \]
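For example, if \(X\) is the number showing when a fair six-sided die is rolled, then \(X \sim \UniformDistn(1, 6) \), so

\[ E[X] \;=\; \frac {1+6} 2 \;=\; 3.5 \quad \quad \text{and} \quad \quad \Var(X) \;=\; \frac {(6-1+1)^2 - 1} {12} \;=\; \frac {35} {12} \;\approx\; 2.92 \]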

The following proof is provided only for completeness. The details are not important, since the proof involves summing series that will not be used elsewhere in the e-book.

We first find the mean and variance when the lower limit \(a\) is zero. Let \(Y \sim \UniformDistn(0, k) \) with probability function

\[ p_Y(y) = \frac 1 {k+1} \quad \quad \text{for } y = 0, 1, \dots, k \]

Its mean is

\[ \begin {align} E[Y] & = \sum_{y=0}^k {\frac y {k+1}} \\ & = \frac 1 {k+1} \sum_{y=0}^k y \\ & = \frac 1 {k+1} \times \frac {k(k+1)} 2 \quad \quad \text{ from summing an arithmetic series} \\[0.4em] & = \frac k 2 \end {align} \]
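As a quick check of this formula, when \(k = 5\) the six equally likely values 0, 1, ..., 5 have mean

\[ E[Y] \;=\; \frac {0+1+2+3+4+5} 6 \;=\; \frac {15} 6 \;=\; 2.5 \;=\; \frac k 2 \]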

In a similar way,

\[ \begin {align} E[Y^2] & = \sum_{y=0}^k {\frac {y^2} {k+1}} \\ & = \frac 1 {k+1} \sum_{y=0}^k {y^2} \\ & = \frac 1 {k+1} \times \frac {k(k+1)(2k+1)} 6 \quad \quad \text{ from summing another standard series} \\[0.4em] & = \frac {k(2k+1)} 6 \end {align} \]
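With \(k = 5\) again,

\[ E[Y^2] \;=\; \frac {0+1+4+9+16+25} 6 \;=\; \frac {55} 6 \;=\; \frac {5 \times 11} 6 \;=\; \frac {k(2k+1)} 6 \]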

Therefore

\[ \begin {align} \Var(Y) \;&=\; E[Y^2] - \left(E[Y]\right)^2 \\[0.4em] & =\; \frac {k(2k+1)} 6 - \left(\frac k 2 \right)^2 \\ & = \frac {2k(2k+1) - 3k^2} {12} \\ & =\; \frac {k^2 + 2k} {12} \\[0.4em] & =\; \frac {(k+1)^2 - 1} {12} \end {align} \]
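Continuing the check with \(k = 5\),

\[ \Var(Y) \;=\; \frac {55} 6 - 2.5^2 \;=\; \frac {110 - 75} {12} \;=\; \frac {35} {12} \;=\; \frac {(5+1)^2 - 1} {12} \]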

If \(Y \sim \UniformDistn(0, k) \) with \(k = (b - a)\), then \(X\) has the same distribution as \((Y + a)\). We showed earlier that \((Y + a)\) has a mean that is \(a\) greater than that of \(Y\), but their variances are the same, so

\[ \begin {align} E[X] \;&=\; E[Y] + a \;=\; \frac {b-a} 2 + a \;=\; \frac {a+b} 2 \\ \Var(X) \;&=\; \Var(Y) \;=\; \frac {(b-a+1)^2 - 1} {12} \end {align} \]
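This agrees with the die-rolling example above: when \(a = 1\) and \(b = 6\), \(k = b - a = 5\), so

\[ E[X] \;=\; \frac 5 2 + 1 \;=\; 3.5 \quad \quad \text{and} \quad \quad \Var(X) \;=\; \frac {(5+1)^2 - 1} {12} \;=\; \frac {35} {12} \]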