Long page descriptions

Chapter 5   Continuous Distributions

5.1   Finding probabilities

5.1.1   Probabilities by integration

The probability that a value lies within any range is the area under the pdf above that range. This can be found by integrating the pdf over the range.
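As a minimal sketch of integrating a pdf numerically, the following uses an exponential pdf f(x) = 2e^(-2x) (a choice made here for illustration, not taken from the text) and compares a midpoint-rule integral with the exact area from the antiderivative.

```python
import math

def exp_pdf(x, lam=2.0):
    """Exponential pdf f(x) = lam * exp(-lam * x) for x >= 0 (illustrative choice)."""
    return lam * math.exp(-lam * x)

def prob_between(pdf, a, b, steps=10_000):
    """P(a < X < b): area under the pdf over (a, b), midpoint rule."""
    width = (b - a) / steps
    return sum(pdf(a + (i + 0.5) * width) for i in range(steps)) * width

approx = prob_between(exp_pdf, 1.0, 2.0)
# Exact answer from the antiderivative -exp(-2x): exp(-2) - exp(-4)
exact = math.exp(-2.0) - math.exp(-4.0)
```

With 10,000 steps the numerical area agrees with the exact integral to several decimal places.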

5.1.2   Rectangular distribution

The simplest continuous distribution is a rectangular one in which each value between two constants is equally likely. Probabilities can be found by geometry or integration.
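For the rectangular distribution the geometric route is just the width of the interval of interest (clipped to the support) divided by the total width; a small sketch:

```python
def rect_prob(a, b, lo, hi):
    """P(lo < X < hi) for X rectangular on (a, b):
    width of the overlap of (lo, hi) with (a, b), divided by b - a."""
    overlap = max(0.0, min(hi, b) - max(lo, a))
    return overlap / (b - a)

p = rect_prob(0.0, 10.0, 2.0, 5.0)  # (5 - 2) / 10 = 0.3
```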

5.1.3   Other examples

Finding probabilities by integration is illustrated with another two examples.

5.1.4   Cumulative distribution function

The cumulative distribution function, F(x), is the probability of a value less than or equal to x.
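A sketch of using a cdf, again with the illustrative exponential distribution (my choice, not the text's): probabilities over intervals fall out as differences of cdf values.

```python
import math

def exp_cdf(x, lam=2.0):
    """F(x) = P(X <= x) = 1 - exp(-lam * x) for the exponential distribution."""
    return 1.0 - math.exp(-lam * x) if x > 0 else 0.0

# P(1 < X <= 2) as a difference of two cdf values
p = exp_cdf(2.0) - exp_cdf(1.0)
```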

5.1.5   Quantiles

The pth quantile of a distribution is the value x such that the probability of a value less than x is p.
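A quantile can be found by inverting the cdf; when no closed form is available, bisection is one simple route. A sketch, checked against the exponential median ln(2)/λ (the cdf here is my illustrative choice):

```python
import math

def quantile(cdf, p, lo=0.0, hi=50.0, tol=1e-10):
    """Invert an increasing cdf by bisection: find x with cdf(x) = p,
    assuming the answer lies in (lo, hi)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

exp_cdf = lambda x: 1.0 - math.exp(-2.0 * x)
median = quantile(exp_cdf, 0.5)  # exact value is ln(2) / 2
```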

5.2   Mean and variance

5.2.1   Expected values

The expected value of any function of a continuous random variable is defined by an integral.

5.2.2   Mean and variance

The expected value of X is called its mean. The variable's variance is the expected value of the squared difference between X and its mean.

5.2.3   Example

The mean and variance of a rectangular distribution are derived.
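The derivation can be checked numerically: integrating x·f(x) and (x − μ)²·f(x) over the support should reproduce the closed forms (a + b)/2 and (b − a)²/12. A sketch with a = 2, b = 8 (values chosen here for illustration):

```python
def expect(g, a, b, steps=100_000):
    """E[g(X)] for X rectangular on (a, b): midpoint-rule integral of
    g(x) * f(x), where the pdf is the constant 1 / (b - a)."""
    width = (b - a) / steps
    total = sum(g(a + (i + 0.5) * width) for i in range(steps)) * width
    return total / (b - a)

a, b = 2.0, 8.0
mean = expect(lambda x: x, a, b)               # closed form: (a + b) / 2 = 5
var = expect(lambda x: (x - mean) ** 2, a, b)  # closed form: (b - a)^2 / 12 = 3
```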

5.3   Random samples

5.3.1   Independence and random samples

Two continuous random variables are independent if knowing the value of one provides no information about the value of the other. A collection of n independent random variables with the same distribution is a random sample.

5.3.2   Distribution of sample sum and mean

Formulae are given for the mean and variance of the sum and mean of a random sample. The sum and mean are both approximately normal in large samples.
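Those formulae can be illustrated by simulation. The sketch below (a seeded toy experiment, not from the text) draws many sample means from a rectangular (0, 1) distribution and checks them against E[mean] = (a + b)/2 and Var[mean] = (b − a)²/(12n):

```python
import random

random.seed(1)  # fixed seed so the toy experiment is reproducible

def sample_mean(n, a=0.0, b=1.0):
    """Mean of a random sample of size n from the rectangular (a, b) distribution."""
    return sum(random.uniform(a, b) for _ in range(n)) / n

n, reps = 30, 20_000
means = [sample_mean(n) for _ in range(reps)]
avg = sum(means) / reps                         # theory: 0.5
var = sum((m - avg) ** 2 for m in means) / reps # theory: 1 / (12 * 30)
```

A histogram of `means` would also look close to a normal curve, as the large-sample result predicts.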

5.4   Estimating parameters

5.4.1   Bias and standard error

Bias, standard error, mean squared error and consistency are defined in the same way for estimators of parameters in discrete and continuous distributions.

5.4.2   Method of moments

If a distribution has a single unknown parameter, its method of moments estimator is the value that makes the distribution's mean equal to the mean of a random sample. The method of moments estimator for the maximum of a rectangular distribution is found.
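For a rectangular (0, θ) distribution the mean is θ/2, so equating θ/2 to the sample mean gives the estimator 2x̄. A sketch (the sample values are made up for illustration):

```python
def mom_rect_max(sample):
    """Method of moments estimator of theta for a rectangular (0, theta)
    distribution: solve theta / 2 = sample mean, giving 2 * xbar."""
    xbar = sum(sample) / len(sample)
    return 2.0 * xbar

est = mom_rect_max([1.2, 3.4, 0.6, 2.8])  # sample mean 2.0, so the estimate is 4.0
```

Note the estimate can be smaller than the largest observation, one known weakness of this estimator.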

5.4.3   Maximum likelihood

The likelihood function of a continuous distribution's parameter is the product of the probability density functions of a random sample. The maximum likelihood estimate is usually found by setting the derivative of the log-likelihood to zero.
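As a sketch of the derivative-of-log-likelihood route, take an exponential sample (my illustrative choice): the log-likelihood is n log λ − λ Σx, and setting its derivative n/λ − Σx to zero gives λ̂ = n/Σx.

```python
import math

def exp_loglik(lam, sample):
    """Log-likelihood of rate lam for an exponential sample:
    sum of log(lam * exp(-lam * x)) = n * log(lam) - lam * sum(x)."""
    n = len(sample)
    return n * math.log(lam) - lam * sum(sample)

sample = [0.3, 1.1, 0.7, 0.4, 2.0]   # made-up data for illustration
mle = len(sample) / sum(sample)      # solves n / lam - sum(x) = 0
```

Moving the rate slightly away from `mle` in either direction lowers the log-likelihood, confirming it is a maximum.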

5.4.4   Properties of maximum likelihood estimators

Maximum likelihood estimators are asymptotically unbiased with normal distributions. A formula is given for the approximate standard error.

5.4.5   Confidence intervals

An approximate 95% confidence interval is the maximum likelihood estimate ± 1.96 standard errors. Replacing "1.96" with other constants gives other confidence levels.
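A one-line sketch of that interval; the numbers passed in are hypothetical, and 1.645 and 2.576 are the standard-normal constants for 90% and 99% levels.

```python
def wald_ci(estimate, se, z=1.96):
    """Approximate confidence interval: estimate +/- z standard errors.
    z = 1.96 gives roughly 95% coverage; 1.645 gives 90%, 2.576 gives 99%."""
    return (estimate - z * se, estimate + z * se)

lo, hi = wald_ci(5.0, 0.4)  # hypothetical estimate 5.0 with standard error 0.4
```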

5.4.6   Example: normal distribution mean

The maximum likelihood estimate of a normal distribution's mean is found. The asymptotic formula for the standard error of the estimator gives the exact standard error.

5.4.7   Example: Rectangular maximum

The maximum likelihood estimate of the maximum of a rectangular distribution is at a discontinuity of the likelihood function and cannot be found by differentiating the log-likelihood.
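A sketch of why: the likelihood of θ is θ^(−n) when every observation is at most θ and zero otherwise, so it jumps from zero to its largest value exactly at the sample maximum (the data below are made up for illustration).

```python
def rect_likelihood(theta, sample):
    """Likelihood of theta for a rectangular (0, theta) sample:
    each pdf value is 1 / theta if the observation lies in (0, theta),
    so the product is theta**(-n) when theta >= max(sample), else 0."""
    n = len(sample)
    return theta ** (-n) if theta >= max(sample) else 0.0

sample = [1.2, 3.4, 0.6, 2.8]
mle = max(sample)  # the likelihood is discontinuous here and largest here
```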