The joint probability function of X and Y gives the probabilities of all possible combinations of the variables' values. The probability of an event can be found by summing the joint probability function over the (x, y) pairs making up the event.
The joint probability function of X and Y can be displayed graphically in a 3-dimensional bar chart.
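In symbols (the notation here is assumed rather than taken from the source pages), writing $p(x,y) = P(X = x,\; Y = y)$ for the joint probability function, the probability of an event $A$ is
$$P\big((X,Y) \in A\big) \;=\; \sum_{(x,y) \in A} p(x,y).$$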
The marginal probability function for one variable gives probabilities for its possible values. It can be found by summing the joint probability function over the values of the other variable.
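As a sketch, in the same assumed notation:
$$p_X(x) \;=\; \sum_{y} p(x,y), \qquad p_Y(y) \;=\; \sum_{x} p(x,y).$$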
The conditional probabilities for one variable, given the value of the other, form a univariate distribution. They are found by dividing the joint probability function by the marginal probability function of the conditioning variable.
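In symbols:
$$p_{Y \mid X}(y \mid x) \;=\; \frac{p(x,y)}{p_X(x)}, \qquad \text{defined whenever } p_X(x) > 0.$$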
Two random variables are independent if their joint probability function equals the product of the two marginal ones.
Independence can be generalised to n variables. If n independent variables all have the same distribution, they form a random sample.
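In the assumed notation, $X$ and $Y$ are independent when
$$p(x,y) \;=\; p_X(x)\,p_Y(y) \quad \text{for all } (x,y),$$
and a random sample $X_1, \dots, X_n$ has joint probability function $\prod_{i=1}^{n} p(x_i)$, where $p(\cdot)$ is the common marginal probability function.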
Joint probability density functions describe the distributions of pairs of continuous random variables. Probabilities are defined as volumes under the surface of the joint pdf.
Probabilities can be found as double integrals of the joint probability density function.
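Writing $f(x,y)$ for the joint pdf (again an assumed notation),
$$P\big((X,Y) \in A\big) \;=\; \iint_{A} f(x,y)\,dx\,dy,$$
the volume under the pdf surface above the region $A$.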
The marginal distribution of one of the two variables is a univariate distribution whose pdf can be found by integrating the joint pdf over the other variable.
The conditional probability density function of one variable given the value of the other is the joint pdf divided by the marginal pdf of the other variable.
As with discrete variables, continuous variables are independent if their joint pdf is the product of the two marginal pdfs. A random sample is a collection of n independent variables with the same distribution.
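The continuous analogues of the last three results, in the assumed notation:
$$f_X(x) = \int_{-\infty}^{\infty} f(x,y)\,dy, \qquad f_{Y \mid X}(y \mid x) = \frac{f(x,y)}{f_X(x)} \quad (f_X(x) > 0),$$
with independence meaning $f(x,y) = f_X(x)\,f_Y(y)$ for all $(x,y)$.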
The expected value of a function, g(X,Y), of two discrete random variables, X and Y, is the sum of the possible values of the function with each weighted by its probability — the sum of g(x,y)p(x,y) over all pairs (x,y).
When X and Y are continuous, the expected value of g(X,Y) is similarly defined but with a double integral of g(x,y)f(x,y) replacing the double summation.
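In symbols:
$$E\big[g(X,Y)\big] = \sum_{x} \sum_{y} g(x,y)\,p(x,y) \quad \text{(discrete)}, \qquad E\big[g(X,Y)\big] = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} g(x,y)\,f(x,y)\,dx\,dy \quad \text{(continuous)}.$$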
Some properties of expected values are described. Conditional expected values are also defined here, and the page explains how to obtain unconditional expected values from conditional ones.
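The key identity is the law of total expectation (stated here in assumed notation):
$$E\big[g(X,Y)\big] \;=\; E\Big[E\big[g(X,Y) \mid X\big]\Big],$$
i.e. averaging the conditional expected value over the distribution of $X$ recovers the unconditional one.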
The means and variances of individual variables can be found as expected values from their marginal distributions.
This page defines the covariance of two variables and gives some properties.
The variance of the sum of two independent variables is the sum of their variances. This formula must be modified if the variables are correlated.
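A sketch of the standard definitions and results, in assumed notation:
$$\operatorname{Cov}(X,Y) \;=\; E\big[(X-\mu_X)(Y-\mu_Y)\big] \;=\; E[XY] - \mu_X \mu_Y,$$
$$\operatorname{Var}(X+Y) \;=\; \operatorname{Var}(X) + \operatorname{Var}(Y) + 2\operatorname{Cov}(X,Y),$$
the covariance term vanishing when the variables are independent.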
The correlation coefficient of X and Y is defined. It is unaffected by increasing linear transformations of the two variables (a transformation with a negative scale factor reverses its sign).
The correlation coefficient must lie between –1 and +1. The values ±1 can arise if and only if the two variables are exactly linearly related.
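In the usual notation (assumed here):
$$\rho \;=\; \operatorname{Corr}(X,Y) \;=\; \frac{\operatorname{Cov}(X,Y)}{\sigma_X\,\sigma_Y}, \qquad -1 \le \rho \le +1,$$
with $\rho = \pm 1$ exactly when $Y = a + bX$ for some constants $a$ and $b \ne 0$.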
If two variables are independent, their covariance and correlation are both zero. The converse does not hold: zero correlation does not imply independence.
The multinomial distribution is a generalisation of the binomial distribution. It describes situations where each independent trial may have more than two possible outcomes.
The marginal distributions of the individual variables are all binomial if their joint distribution is multinomial.
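A sketch of the joint probability function, writing $\pi_1, \dots, \pi_k$ for the outcome probabilities in each of the $n$ trials and $x_1 + \dots + x_k = n$ (symbols assumed):
$$P(X_1 = x_1, \dots, X_k = x_k) \;=\; \frac{n!}{x_1!\,x_2! \cdots x_k!}\;\pi_1^{x_1}\pi_2^{x_2} \cdots \pi_k^{x_k},$$
with each marginal distribution $X_i \sim \operatorname{Binomial}(n, \pi_i)$.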
If a set of variables has a multinomial distribution, the conditional distribution of any subset of them, given the values of the rest, is also multinomial.
The means and variances of individual variables that have a joint multinomial distribution can be found from their marginal binomial distributions. Formulae for the covariance and correlation of two of the variables are derived.
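The standard results, in the same assumed notation:
$$E[X_i] = n\pi_i, \qquad \operatorname{Var}(X_i) = n\pi_i(1-\pi_i), \qquad \operatorname{Cov}(X_i, X_j) = -n\pi_i\pi_j \quad (i \ne j),$$
so the correlation of $X_i$ and $X_j$ is $-\sqrt{\pi_i \pi_j / \big[(1-\pi_i)(1-\pi_j)\big]}$, which is always negative.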
The maximum likelihood estimates of the multinomial probabilities are the corresponding sample proportions.
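That is, if outcome $i$ is observed $x_i$ times in $n$ trials,
$$\hat{\pi}_i \;=\; \frac{x_i}{n}, \qquad i = 1, \dots, k.$$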
The standard bivariate normal distribution has a single parameter that affects the strength of the relationship between the variables. The joint pdf is given and the distribution's shape is described.
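Its joint pdf, writing $\rho$ for the parameter (assumed notation):
$$f(x,y) \;=\; \frac{1}{2\pi\sqrt{1-\rho^2}}\,\exp\!\left(-\,\frac{x^2 - 2\rho x y + y^2}{2(1-\rho^2)}\right), \qquad -1 < \rho < 1.$$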
The general bivariate normal distribution is a generalisation with five parameters. Linear transformations of standard bivariate normal variables have this distribution.
The marginal distributions are univariate normal distributions and their means and variances are four of the distribution's five parameters.
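In the usual parameterisation (assumed here), if $(X_0, Y_0)$ is standard bivariate normal with parameter $\rho$, then
$$X = \mu_X + \sigma_X X_0, \qquad Y = \mu_Y + \sigma_Y Y_0$$
has the general bivariate normal distribution with parameters $(\mu_X, \mu_Y, \sigma_X, \sigma_Y, \rho)$, and marginally $X \sim N(\mu_X, \sigma_X^2)$ and $Y \sim N(\mu_Y, \sigma_Y^2)$.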
The conditional distribution of Y, given that X = x, is also a univariate normal distribution. Its shape is that of a slice through the joint pdf at x, rescaled to integrate to one.
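In symbols (assumed notation):
$$Y \mid X = x \;\sim\; N\!\left(\mu_Y + \rho\,\frac{\sigma_Y}{\sigma_X}(x - \mu_X),\;\; \sigma_Y^2(1-\rho^2)\right),$$
so the conditional mean is linear in $x$ and the conditional variance does not depend on $x$.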
A formula is derived for the covariance of the two variables and their correlation is shown to be the fifth parameter of the bivariate normal distribution.
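In the same notation:
$$\operatorname{Cov}(X,Y) \;=\; \rho\,\sigma_X\sigma_Y, \qquad \operatorname{Corr}(X,Y) \;=\; \rho.$$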