Probabilities about a single variable

If the event of interest involves only one of the two variables, the double integral simplifies considerably. For example,

\[ P(a \lt X \lt b) \;=\; \int_a^b \int_{-\infty}^{\infty} f(x,y) \;dy \; dx \;=\; \int_a^b f_X(x) \; dx \]

where

\[ f_X(x) \;=\; \int_{-\infty}^{\infty} f(x,y) \;dy \]

The function \(f_X(x)\) is called the marginal probability density function of \(X\). The marginal probability density function of \(Y\) is similarly defined as

\[ f_Y(y) \;=\; \int_{-\infty}^{\infty} f(x,y) \;dx \]
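The following sketch shows how these integrals might be evaluated symbolically with sympy. The joint pdf used here, \(f(x,y) = 2e^{-x-2y}\) for \(x \gt 0\) and \(y \gt 0\), is an assumed example (two independent exponential variables), not one taken from this section.

```python
# A minimal sketch (assumed example pdf): finding marginal pdfs by
# symbolic integration, then using a marginal to compute a probability.
import sympy as sp

x, y = sp.symbols("x y", positive=True)
f = 2 * sp.exp(-x - 2 * y)            # assumed joint pdf on x > 0, y > 0

# Marginal pdf of X: integrate the joint pdf over all values of y
f_X = sp.integrate(f, (y, 0, sp.oo))  # -> exp(-x)

# Marginal pdf of Y: integrate the joint pdf over all values of x
f_Y = sp.integrate(f, (x, 0, sp.oo))  # -> 2*exp(-2*y)

# P(a < X < b) needs only the marginal pdf of X, e.g. with a = 1, b = 2
prob = sp.integrate(f_X, (x, 1, 2))   # -> exp(-1) - exp(-2)
print(f_X, f_Y, prob)
```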

Marginal distributions

Each marginal pdf can be treated as a univariate pdf that characterises the distribution of one variable when nothing is known about the other. Probabilities about either variable can therefore be found directly from its marginal distribution.

Although marginal pdfs are usually best found by integration, geometry can occasionally be used. The marginal pdf \(f_X(x)\) is the area of a cross-section through the joint pdf at \(x\), as illustrated by the following example.

Example

The random variables \(X\) and \(Y\) have joint probability density function

\[ f(x,y) \;=\; \begin{cases} x+y & \quad\text{if }0 \lt x \lt 1 \text{ and }0 \lt y \lt 1 \\ 0 & \quad\text{otherwise} \end{cases} \]

What is the marginal pdf of \(X\)?

The diagram below shows the joint pdf.

The marginal pdf of \(X\) at the value \(x\) is the area of a cross-section of the joint pdf at this value, shown as the red area in the diagram. This cross-section is a trapezium of width 1 whose parallel sides have heights \(x\) and \(x + 1\) (the values of \(x + y\) at \(y = 0\) and \(y = 1\)), so its area is

\[ f_X(x) \;=\; \frac{x + (x + 1)}{2} \;=\; x + \frac 1 2 \qquad \text{for } 0 \lt x \lt 1 \]

You might like to verify that this is indeed a valid univariate probability density function: it should integrate to one over \(0 \lt x \lt 1\).
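As a quick check of this example (a sketch, not part of the text), sympy can carry out both integrations: the marginal pdf of \(X\) should come out as \(x + \frac12\), and that marginal should integrate to one.

```python
# Verify the example: f(x,y) = x + y on the unit square.
import sympy as sp

x, y = sp.symbols("x y")
f = x + y                               # joint pdf for 0 < x < 1, 0 < y < 1

f_X = sp.integrate(f, (y, 0, 1))        # -> x + 1/2, the marginal pdf of X
total = sp.integrate(f_X, (x, 0, 1))    # -> 1, so f_X is a valid pdf
print(f_X, total)
```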