Long page descriptions

Chapter 12   Multivariate distributions

12.1   Discrete bivariate distributions

12.1.1   Joint probability function

The joint probability function of X and Y gives the probabilities for all possible combinations of the variables' values. The probability of any event can be found by summing the joint probability function over the value pairs in the event.
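
In the usual notation (the symbols are ours, since the page's own are not reproduced here), writing p(x, y) = P(X = x and Y = y), the probability of an event A is

    P(A) = \sum_{(x, y) \in A} p(x, y)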

12.1.2   Three-dimensional bar chart

The joint probability function of X and Y can be displayed graphically in a 3-dimensional bar chart.

12.1.3   Marginal distributions

The marginal probability function for one variable gives probabilities for its possible values. It can be found by summing the joint probability function over the values of the other variable.
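
As a sketch in the same notation:

    p_X(x) = \sum_y p(x, y)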

12.1.4   Conditional distributions

The conditional probabilities for one variable, given the value of the other, form a univariate distribution. They equal the ratio of the joint probability function to the marginal probability function of the conditioning variable.
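
In symbols, wherever p_Y(y) > 0:

    p_{X \mid Y}(x \mid y) = \frac{p(x, y)}{p_Y(y)}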

12.1.5   Independence

Two random variables are independent if their joint probability function equals the product of their two marginal probability functions.
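
That is, for all x and y:

    p(x, y) = p_X(x)\, p_Y(y)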

12.1.6   Random samples

Independence generalises to n random variables. A collection of n independent variables that all have the same distribution is called a random sample.
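
As a sketch, if X_1, ..., X_n are a random sample with common probability function p_X, their joint probability function factorises as

    p(x_1, \dots, x_n) = \prod_{i=1}^{n} p_X(x_i)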

12.2   Continuous bivariate distributions

12.2.1   Joint probability density function

A joint probability density function describes the distribution of a pair of continuous random variables. Probabilities are defined as volumes under its surface.

12.2.2   Probabilities as integrals

Probabilities can be found as double integrals of the joint probability density function.
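
In symbols, for a region A of the (x, y) plane:

    P\big((X, Y) \in A\big) = \iint_A f(x, y)\, dx\, dy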

12.2.3   Marginal distributions

The marginal distribution of one of the two variables is a univariate distribution whose pdf can be found by integrating the joint pdf over the other variable.
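
As a sketch:

    f_X(x) = \int_{-\infty}^{\infty} f(x, y)\, dy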

12.2.4   Conditional distributions

The conditional probability density function of one variable given the value of the other is the joint pdf divided by the marginal pdf of the other variable.
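
In symbols, wherever f_X(x) > 0:

    f_{Y \mid X}(y \mid x) = \frac{f(x, y)}{f_X(x)}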

12.2.5   Independence and random samples

As with discrete variables, continuous variables are independent if their joint pdf is the product of the two marginal pdfs. A random sample is a collection of n independent variables with the same distribution.
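
That is, in the same notation:

    f(x, y) = f_X(x)\, f_Y(y) \quad \text{for all } x, y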

12.3   Expected values

12.3.1   Discrete expected values

The expected value of a function, g(X,Y), of two discrete random variables, X and Y, is the sum of the possible values of the function, each weighted by its probability: the sum of g(x,y)p(x,y) over all pairs (x,y).
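
In symbols (with p(x, y) the joint probability function):

    E\big[g(X, Y)\big] = \sum_x \sum_y g(x, y)\, p(x, y)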

12.3.2   Continuous expected values

When X and Y are continuous, the expected value of g(X,Y) is similarly defined but with a double integral of g(x,y)f(x,y) replacing the double summation.
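
In symbols (with f(x, y) the joint pdf):

    E\big[g(X, Y)\big] = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} g(x, y)\, f(x, y)\, dx\, dy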

12.3.3   Properties of expected values

Some properties of expected values are described. Conditional expected values are also defined here, and the page explains how to obtain unconditional expected values from conditional ones.
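
Assuming the page states the standard result (the law of total expectation, in our wording), the unconditional expected value follows from the conditional ones via

    E[X] = E\big[E[X \mid Y]\big]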

12.3.4   Means and variances

The means and variances of individual variables can be found as expected values from their marginal distributions.
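
For example, in the discrete case:

    \mu_X = E[X] = \sum_x x\, p_X(x), \qquad \sigma_X^2 = \text{Var}(X) = E\big[(X - \mu_X)^2\big]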

12.4   Covariance and correlation

12.4.1   Covariance

This page defines the covariance of two variables and gives some properties.
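
The standard definition, with the usual shortcut form:

    \text{Cov}(X, Y) = E\big[(X - \mu_X)(Y - \mu_Y)\big] = E[XY] - \mu_X\, \mu_Y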

12.4.2   Variance of a sum

The variance of the sum of two independent variables is the sum of their variances. This formula must be modified if the variables are correlated.
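
As a sketch of the general form (the covariance term vanishes when the variables are independent):

    \text{Var}(X + Y) = \text{Var}(X) + \text{Var}(Y) + 2\,\text{Cov}(X, Y)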

12.4.3   Correlation coefficient

The correlation coefficient of X and Y is defined. It is unaffected by linear transformations of the two variables (apart from a sign change when a scale factor is negative).
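
The standard definition:

    \rho = \text{Corr}(X, Y) = \frac{\text{Cov}(X, Y)}{\sigma_X\, \sigma_Y}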

12.4.4   Linear relationships

The correlation coefficient must lie between –1 and +1. The values ±1 can arise if and only if the two variables are exactly linearly related.

12.4.5   Independence

If two variables are independent, their covariance and correlation are both zero.
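
The one-step reason: independence gives E[XY] = E[X]E[Y], so

    \text{Cov}(X, Y) = E[XY] - E[X]\,E[Y] = 0

(The converse does not hold in general.)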

12.5   Multinomial distribution

12.5.1   Joint probability function

The multinomial distribution is a generalisation of the binomial distribution. It describes situations where each independent trial may have more than two possible outcomes.
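
In standard notation (our symbols: n trials, k outcomes with probabilities π_1, ..., π_k summing to 1, and counts x_1, ..., x_k summing to n), the joint probability function is

    p(x_1, \dots, x_k) = \frac{n!}{x_1!\, x_2! \cdots x_k!}\; \pi_1^{x_1} \pi_2^{x_2} \cdots \pi_k^{x_k}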

12.5.2   Marginal distributions

The marginal distributions of the individual variables are all binomial if their joint distribution is multinomial.
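
In the same notation:

    X_i \sim \text{Binomial}(n, \pi_i)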

12.5.3   Conditional distributions

Given the values of some variables in a multinomial distribution, the conditional distribution of the remaining variables is also multinomial.
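
As a sketch for a single conditioning variable (our notation): given X_k = x_k, the remaining counts are multinomial with n − x_k trials and rescaled probabilities,

    (X_1, \dots, X_{k-1}) \mid X_k = x_k \;\sim\; \text{Multinomial}\!\left(n - x_k;\; \frac{\pi_1}{1 - \pi_k}, \dots, \frac{\pi_{k-1}}{1 - \pi_k}\right)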

12.5.4   Means, variances and correlations

The means and variances of individual variables that have a joint multinomial distribution can be found from their marginal binomial distributions. Formulae for the covariance and correlation of two of the variables are derived.
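
The standard results in the same notation, for i ≠ j:

    E[X_i] = n\pi_i, \qquad \text{Var}(X_i) = n\pi_i(1 - \pi_i)
    \text{Cov}(X_i, X_j) = -n\pi_i\pi_j, \qquad \text{Corr}(X_i, X_j) = -\sqrt{\frac{\pi_i\, \pi_j}{(1 - \pi_i)(1 - \pi_j)}}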

12.5.5   Parameter estimation

The maximum likelihood estimates of the multinomial probabilities are the corresponding sample proportions.
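
In symbols:

    \hat{\pi}_i = \frac{x_i}{n}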

12.6   Bivariate normal distribution

12.6.1   Standard bivariate normal distribution

The standard bivariate normal distribution has a single parameter that determines the strength of the relationship between the variables. The joint pdf is given and the distribution's shape is described.
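
The standard form of this pdf, writing ρ for the single parameter:

    f(x, y) = \frac{1}{2\pi\sqrt{1 - \rho^2}}\, \exp\!\left(-\,\frac{x^2 - 2\rho x y + y^2}{2(1 - \rho^2)}\right)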

12.6.2   General bivariate normal distribution

The general bivariate normal distribution is a generalisation with five parameters. Linear transformations of standard bivariate normal variables have this distribution.
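
As a sketch (our parameterisation): if (Z_1, Z_2) is standard bivariate normal with parameter ρ, then

    X = \mu_X + \sigma_X Z_1, \qquad Y = \mu_Y + \sigma_Y Z_2

has the general bivariate normal distribution with parameters (μ_X, μ_Y, σ_X, σ_Y, ρ).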

12.6.3   Marginal distributions

The marginal distributions are univariate normal distributions and their means and variances are four of the distribution's five parameters.
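
In symbols:

    X \sim \text{N}(\mu_X, \sigma_X^2), \qquad Y \sim \text{N}(\mu_Y, \sigma_Y^2)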

12.6.4   Conditional distributions

The conditional distribution of Y, given that X=x, is also a univariate normal distribution. Its shape is that of a slice through the joint pdf at x.
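
The standard result:

    Y \mid X = x \;\sim\; \text{N}\!\left(\mu_Y + \rho\,\frac{\sigma_Y}{\sigma_X}(x - \mu_X),\; \sigma_Y^2(1 - \rho^2)\right)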

12.6.5   Covariance and correlation

A formula is derived for the covariance of the two variables and their correlation is shown to be the fifth parameter of the bivariate normal distribution.
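
In symbols:

    \text{Cov}(X, Y) = \rho\, \sigma_X\, \sigma_Y, \qquad \text{Corr}(X, Y) = \rho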