If one random variable is a monotonic function of another — steadily increasing or decreasing — the pdfs of the variables are closely related. (The proof of this result is based on finding the cumulative distribution function of the transformed variable.)
Monotonic function of \(X\)
If a continuous random variable \(X\) has probability density function \(f_X(x)\) and another variable is defined as \(Y = g(X)\), where \(g(\cdot)\) is monotonic with inverse \(X = h(Y)\), then the pdf of \(Y\) is
\[ f_Y(y) \;\;=\;\; f_X(h(y)) \times \left| h'(y) \right| \]
(Proved in full version)
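A minimal sketch of the cumulative-distribution-function argument mentioned above, for the case where \(g(\cdot)\) is increasing:
\[ F_Y(y) \;\;=\;\; P(Y \le y) \;\;=\;\; P(g(X) \le y) \;\;=\;\; P(X \le h(y)) \;\;=\;\; F_X(h(y)) \]
Differentiating with respect to \(y\) gives \(f_Y(y) = f_X(h(y)) \times h'(y)\). When \(g(\cdot)\) is decreasing, the inequality reverses and a minus sign appears; the absolute value \(\left| h'(y) \right|\) covers both cases.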
The method will be clearer in an example.
Question: Log-normal distribution
If \(X \sim \NormalDistn(\mu,\; \sigma^2)\), show that the probability density function of \(Y = \exp(X)\) is
\[ f_Y(y) \;\;=\;\; \begin{cases} \displaystyle \frac 1{y\sqrt{2\pi}\;\sigma} e^{- \frac{\large (\log(y)-\mu)^2}{\large 2 \sigma^2}} & \text{if } y > 0 \\[0.4em] 0 & \text{otherwise} \end{cases} \]
(Solved in full version)
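A sketch of how the change-of-variable formula above gives this result: since \(\exp(\cdot)\) is increasing, the inverse transformation is \(X = h(Y) = \log(Y)\), so \(h'(y) = \frac 1 y\) for \(y > 0\), and
\[ f_Y(y) \;\;=\;\; f_X(\log(y)) \times \frac 1 y \;\;=\;\; \frac 1{y\sqrt{2\pi}\;\sigma} e^{- \frac{(\log(y)-\mu)^2}{2 \sigma^2}} \qquad \text{for } y > 0 \]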
\(Y\) is said to have a log-normal distribution. Conversely, if a random variable \(Y\) has a log-normal distribution, it can be shown in a similar way that \(X = \log(Y)\) has a normal distribution.
The next example shows the relationship between the exponential and Weibull distributions.
Question
If \(X \sim \ExponDistn(\lambda)\), show that \(Y = X^a\) has a \(\WeibullDistn(\diagfrac 1 a, \lambda^a)\) distribution.
(Solved in full version)
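A sketch of the same method, under the assumption that \(a > 0\) (so the transformation is increasing for \(x > 0\)) and that the \(\WeibullDistn(\alpha,\; \lambda)\) probability density function used here is \(f(y) = \alpha \lambda^\alpha y^{\alpha - 1} e^{-(\lambda y)^\alpha}\) for \(y > 0\): the inverse transformation is \(X = h(Y) = Y^{1/a}\), so \(h'(y) = \frac 1 a y^{\frac 1 a - 1}\), and
\[ f_Y(y) \;\;=\;\; \lambda e^{-\lambda y^{1/a}} \times \frac 1 a y^{\frac 1 a - 1} \;\;=\;\; \frac 1 a \left(\lambda^a\right)^{\frac 1 a} y^{\frac 1 a - 1} e^{-\left(\lambda^a y\right)^{\frac 1 a}} \qquad \text{for } y > 0 \]
which is the \(\WeibullDistn(\diagfrac 1 a, \lambda^a)\) density under that parameterisation.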