Expected values

The definition of a random variable's mean can be generalised to give the expected value of an arbitrary function of the random variable.

Definition

The expected value of a function \(g(X)\) of a discrete random variable, \(X\), is defined to be

\[ E\big[g(X)\big] = \sum_{\text{all } x} {g(x) \times p(x)} \]

As with the definition of the variable's mean, this definition 'weights' the possible values, \(g(x)\), with their probabilities of arising.
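As a minimal numerical sketch of this definition, the sum below uses a small, hypothetical probability function \(p(x)\) on the values 0, 1 and 2 (the probabilities 0.2, 0.5 and 0.3 are assumed purely for illustration):

```python
# Hypothetical pmf of a discrete random variable X (values assumed for illustration)
p = {0: 0.2, 1: 0.5, 2: 0.3}

def expected_value(g, p):
    """E[g(X)] = sum over all x of g(x) * p(x)."""
    return sum(g(x) * px for x, px in p.items())

# g(x) = x recovers the mean of X: 0*0.2 + 1*0.5 + 2*0.3 = 1.1
mean = expected_value(lambda x: x, p)

# g(x) = x**2 gives E[X^2]: 0*0.2 + 1*0.5 + 4*0.3 = 1.7
mean_sq = expected_value(lambda x: x ** 2, p)
```

Choosing \(g(x) = x\) shows that the mean of \(X\) is just the special case \(E[X]\) of this more general definition.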

The following two results make it easier to evaluate expected values.

Linear function of a random variable

If \(X\) is a discrete random variable and \(a\) and \(b\) are constants,

\[ E\big[a + b \times X\big] \;\;=\;\; a + b \times E[X] \]
This follows directly from the definition of expected values:
\[ \begin{align} E\big[a + b \times X\big] & = \sum_{\text{all } x} {(a + b \times x) \times p(x)} \\[0.3em] & = \sum_{\text{all } x} {\left( a \times p(x) \; + \; b \times x \times p(x) \right)} \\[0.3em] & = a \times \sum_{\text{all } x} {p(x)} \; + \; b \times \sum_{\text{all } x} { x \times p(x)} \\ & = a \; + \; b \times E[X] \end{align} \]

since \(\sum {p(x)} = 1 \) and \(\sum {x \times p(x)} = E[X]. \)
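The identity can be checked numerically. The sketch below evaluates both sides of \(E[a + b \times X] = a + b \times E[X]\) for an assumed probability function and assumed constants (all values hypothetical, for illustration only):

```python
# Hypothetical pmf and constants (assumed for illustration)
p = {0: 0.2, 1: 0.5, 2: 0.3}
a, b = 4.0, 3.0

E_X = sum(x * px for x, px in p.items())            # E[X]
lhs = sum((a + b * x) * px for x, px in p.items())  # E[a + bX], from the definition
rhs = a + b * E_X                                   # a + b * E[X]
```

For these values both sides equal 7.3, as the result guarantees.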

We can also easily find the expected value of the sum of two functions of \(X\).

Sum of two functions of X

If \(X\) is a discrete random variable and \(g(X)\) and \(h(X)\) are functions of it,

\[ E\big[g(X) + h(X)\big] \;\;=\;\; E\big[g(X)\big] + E\big[h(X)\big] \]
Again this follows from the definition of expected values:

\[ \begin{align} E\big[g(X) + h(X)\big] & = \sum_{\text{all } x} {\big(g(x) + h(x)\big) \times p(x)} \\ & = \sum_{\text{all } x} {g(x) \times p(x)} \; + \; \sum_{\text{all } x} {h(x) \times p(x)} \\ & = E\big[g(X)\big] + E\big[h(X)\big] \end{align} \]
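This additivity can also be verified numerically. The sketch below uses an assumed probability function together with the hypothetical choices \(g(x) = x^2\) and \(h(x) = 2x\), picked purely for illustration:

```python
# Hypothetical pmf and functions of X (assumed for illustration)
p = {0: 0.2, 1: 0.5, 2: 0.3}
g = lambda x: x ** 2
h = lambda x: 2 * x

E_sum = sum((g(x) + h(x)) * px for x, px in p.items())  # E[g(X) + h(X)]
E_g = sum(g(x) * px for x, px in p.items())             # E[g(X)]
E_h = sum(h(x) * px for x, px in p.items())             # E[h(X)]
```

Here \(E[g(X)] = 1.7\) and \(E[h(X)] = 2.2\), and their sum matches \(E[g(X) + h(X)] = 3.9\), as the result states.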