Two random variables, \(X\) and \(Y\), are independent when all events about \(X\) are independent of all events about \(Y\). The following result for continuous random variables is similar to that for discrete variables.

Independence

Two continuous random variables \(X\) and \(Y\) are independent if and only if

\[ f(x, y) = f_X(x) \times f_Y(y) \qquad \text{ for all } x \text{ and } y \]

If \(X\) and \(Y\) are independent, then the conditional pdf of \(X\), given that \(Y = y\), is equal to its marginal pdf:

\[ f_{X\mid Y=y}(x) \;\;=\;\; \frac {f(x,y)}{f_Y(y)} \;\;=\;\; f_X(x) \]

If the variables are independent, knowing the value of \(Y\) gives no information about the distribution of \(X\). Similarly,

\[ f_{Y\mid X=x}(y) \;\;=\;\; f_Y(y) \]

Determining independence

We can sometimes deduce mathematically that two variables are independent by factorising their joint pdf, but independence is more often justified by the context from which the two variables were defined.
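As a simple illustration (an artificial joint pdf, not tied to any particular application), suppose \(X\) and \(Y\) have joint pdf \(f(x, y) = 4xy\) for \(0 \le x \le 1\) and \(0 \le y \le 1\). This factorises as

\[ f(x, y) \;=\; (2x)(2y) \;=\; f_X(x) \times f_Y(y) \]

where \(f_X(x) = 2x\) and \(f_Y(y) = 2y\) are the marginal pdfs, so \(X\) and \(Y\) are independent. In contrast, a joint pdf such as \(f(x, y) = x + y\) on the same square cannot be written as a product of a function of \(x\) alone and a function of \(y\) alone, so those two variables would be dependent.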

Failure of light bulbs

If two light bulbs are tested at 80°C until failure, their failure times \(X\) and \(Y\) can be assumed to be independent, since the failure of one bulb would not influence when the other failed. If the failure time of a single light bulb has an \(\ExponDistn(\lambda)\) distribution, their joint pdf would therefore be

\[ f(x, y) = f_X(x) \times f_Y(y) = \left(\lambda e^{-\lambda x}\right)\left(\lambda e^{-\lambda y}\right) = \lambda^2 e^{-\lambda(x+y)}\qquad \text{ if } x \ge 0\text{ and } y \ge 0 \]
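This joint pdf can be used to find probabilities about the two failure times. For instance, the probability that both bulbs are still working at some time \(t\) is

\[ P(X > t \text{ and } Y > t) \;=\; \int_t^\infty \!\!\int_t^\infty \lambda^2 e^{-\lambda(x+y)} \, dx \, dy \;=\; e^{-\lambda t} \times e^{-\lambda t} \;=\; e^{-2\lambda t} \]

which is simply the product of the two marginal probabilities, as expected for independent variables.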

Extensions to 3 or more variables

The idea of a joint probability density function for three or more continuous random variables \(\{X_1,X_2,\dots, X_n\}\) is a simple extension of that for two variables,

\[ f(x_1, \dots, x_n) \]

Probabilities can be obtained from the joint pdf as multiple integrals over the corresponding values of the variables, \((x_1, \dots, x_n)\), but we will not go into the details here.
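To illustrate the form of such a calculation, the probability that each variable lies within an interval is

\[ P(a_1 \le X_1 \le b_1, \;\dots,\; a_n \le X_n \le b_n) \;=\; \int_{a_1}^{b_1} \cdots \int_{a_n}^{b_n} f(x_1, \dots, x_n) \, dx_n \cdots dx_1 \]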

Random samples

A collection of \(n\) independent random variables with the same distribution is a random sample from the distribution. The joint pdf of the variables is then

\[ f(x_1, \dots, x_n) \;=\; \prod_{i=1}^n f(x_i) \]

where \(f(\cdot)\) is the pdf of the distribution from which the random sample is taken. When treated as a function of unknown parameters, this is the likelihood function for the sample data.
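For example, if \(\{X_1, \dots, X_n\}\) is a random sample from an \(\ExponDistn(\lambda)\) distribution, as with the light bulbs above, the joint pdf of the sample is

\[ f(x_1, \dots, x_n) \;=\; \prod_{i=1}^n \lambda e^{-\lambda x_i} \;=\; \lambda^n e^{-\lambda \sum_{i=1}^n x_i} \]

Regarded as a function of \(\lambda\) for the observed data, this is the likelihood function \(L(\lambda)\).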