If a formula can be found for the cumulative distribution function, F(x), of a continuous random variable, the probability density function can be obtained by differentiating F(x).
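As an illustrative sketch (the exponential distribution is our choice here, not an example from the text), a numerical derivative of the CDF should match the pdf obtained by differentiation:

```python
import math

def exp_cdf(x, lam=2.0):
    """CDF of an Exponential(lam) distribution: F(x) = 1 - exp(-lam*x)."""
    return 1.0 - math.exp(-lam * x)

def exp_pdf(x, lam=2.0):
    """pdf obtained by differentiating F(x): f(x) = lam * exp(-lam*x)."""
    return lam * math.exp(-lam * x)

def numeric_derivative(f, x, h=1e-6):
    """Central finite-difference approximation to f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

# The numerical derivative of the CDF agrees with the pdf.
for x in (0.5, 1.0, 2.0):
    assert abs(numeric_derivative(exp_cdf, x) - exp_pdf(x)) < 1e-6
```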
The maximum likelihood estimator of the upper limit of a rectangular distribution, based on a random sample, is the maximum of the sample values. The estimator's cumulative distribution function can be found and differentiated to obtain its pdf, from which its bias and standard error can be derived.
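A simulation can check the theory. The sketch below (our own, with illustrative values theta = 10 and n = 20) compares the simulated mean and variance of the sample maximum M against the values implied by its pdf, f(m) = n m^(n-1) / theta^n:

```python
import random

def simulate_max(theta=10.0, n=20, reps=50_000, seed=1):
    """Simulate the MLE (sample maximum) of the upper limit theta
    of a Rectangular(0, theta) distribution; return its mean and variance."""
    rng = random.Random(seed)
    maxima = [max(rng.uniform(0, theta) for _ in range(n)) for _ in range(reps)]
    mean = sum(maxima) / reps
    var = sum((m - mean) ** 2 for m in maxima) / reps
    return mean, var

theta, n = 10.0, 20
mean, var = simulate_max(theta, n)

# Theory from the pdf f(m) = n*m^(n-1)/theta^n:
#   E(M)   = n*theta/(n+1)            so the bias is -theta/(n+1)
#   var(M) = n*theta^2/((n+1)^2*(n+2))
print(mean, n * theta / (n + 1))
print(var, n * theta**2 / ((n + 1) ** 2 * (n + 2)))
```

The negative bias reflects the fact that the sample maximum can never exceed, and is almost always below, the true upper limit.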
The pdf of a monotonic function of X can be found directly from the pdf of X. The log-normal and Weibull distributions are given as examples.
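For the log-normal case, the monotonic-transformation rule gives the pdf of Y = exp(X), where X is normal, as f_X(ln y) / y. A minimal sketch (our own construction, with a crude Riemann-sum check that the density integrates to one):

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """pdf of a Normal(mu, sigma) distribution."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def lognormal_pdf(y, mu=0.0, sigma=1.0):
    """pdf of Y = exp(X) via the monotonic-transformation rule:
    f_Y(y) = f_X(ln y) * |d(ln y)/dy| = f_X(ln y) / y."""
    return normal_pdf(math.log(y), mu, sigma) / y

# Sanity check: the density integrates to (approximately) one.
step = 0.01
total = sum(lognormal_pdf(0.001 + i * step) * step for i in range(5000))
print(total)
```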
Any continuous distribution can be expressed in terms of a Rectangular(0, 1) distribution (and vice versa) using the CDF and its inverse as transformations.
Random values from any distribution can be obtained from randomly generated Rectangular(0, 1) values using the inverse of the CDF as a transformation.
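This is the inversion method of random-number generation. A sketch (again using the exponential distribution as our illustrative choice): inverting F(x) = 1 - exp(-lam*x) gives F^{-1}(u) = -ln(1-u)/lam, so transformed Rectangular(0, 1) values have an exponential distribution.

```python
import math
import random

def exponential_inverse_cdf(u, lam=1.0):
    """Inverse of F(x) = 1 - exp(-lam*x):  F^{-1}(u) = -ln(1-u)/lam."""
    return -math.log(1.0 - u) / lam

def exponential_sample(n, lam=1.0, seed=42):
    """Transform Rectangular(0, 1) random values into Exponential(lam) values."""
    rng = random.Random(seed)
    return [exponential_inverse_cdf(rng.random(), lam) for _ in range(n)]

values = exponential_sample(50_000, lam=2.0)
print(sum(values) / len(values))   # close to the theoretical mean 1/lam = 0.5
```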
The mean and variance of Y = a + bX can be simply expressed in terms of those of X.
If X has a normal distribution, Y = a + bX is also normally distributed. In particular, Z = (X - μ) / σ has a standard normal distribution and this can be used to help find probabilities about X.
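The standardisation step can be sketched in code. This uses the standard identity Phi(z) = (1 + erf(z / sqrt 2)) / 2; the numbers (mean 100, standard deviation 15) are illustrative, not from the text:

```python
import math

def standard_normal_cdf(z):
    """Phi(z), via the error function: Phi(z) = (1 + erf(z/sqrt(2))) / 2."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def normal_prob_below(x, mu, sigma):
    """P(X < x) for X ~ Normal(mu, sigma), found by standardising:
    Z = (X - mu)/sigma has a standard normal distribution."""
    z = (x - mu) / sigma
    return standard_normal_cdf(z)

# e.g. if X ~ Normal(100, 15), then P(X < 130) = Phi(2) = 0.9772...
print(normal_prob_below(130, 100, 15))
```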
In some families of distributions, linear transformations result in other distributions from the same family.
Examples are given in which scale and location parameters are identified.
It is sometimes difficult to find the exact pdf of a transformed variable. This page describes the delta method, which gives approximations to the transformed variable's mean and variance. When applied to parameter estimates, the accuracy of the approximation improves as the sample size increases.
The delta method is used to find the approximate variance of the maximum likelihood estimator of the Geometric distribution's parameter. It is also used to get an approximate standard error for the odds of success in a binomial experiment.
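For the odds example, a sketch of the calculation (our own, with illustrative values p = 0.4 and n = 100): writing g(p) = p/(1-p), the delta method gives var(g(p̂)) ≈ g'(p)² var(p̂) = p / (n(1-p)³), and the approximation can be checked by simulation.

```python
import math
import random

def odds_se_delta(x, n):
    """Approximate standard error of the estimated odds p_hat/(1 - p_hat)
    by the delta method: with g(p) = p/(1-p), g'(p) = 1/(1-p)^2, so
    var(g(p_hat)) ~ g'(p)^2 * p(1-p)/n = p / (n*(1-p)^3)."""
    p_hat = x / n
    return math.sqrt(p_hat / (n * (1.0 - p_hat) ** 3))

# Compare the delta-method standard error with a simulation.
p, n, reps = 0.4, 100, 20_000
rng = random.Random(0)
odds = []
for _ in range(reps):
    x = sum(rng.random() < p for _ in range(n))
    odds.append(x / (n - x))
mean = sum(odds) / reps
sim_se = math.sqrt(sum((w - mean) ** 2 for w in odds) / reps)
delta_se = math.sqrt(p / (n * (1 - p) ** 3))
print(sim_se, delta_se)   # the two values agree closely
```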