Given $n$ sets of variates denoted $\{x_1\}$, ..., $\{x_n\}$, the covariance $\sigma_{ij} \equiv \mathrm{cov}(x_i, x_j)$ of $x_i$ and $x_j$ is defined by

$$\mathrm{cov}(x_i, x_j) = \left\langle (x_i - \mu_i)(x_j - \mu_j) \right\rangle \tag{1}$$
$$= \left\langle x_i x_j \right\rangle - \mu_i \mu_j, \tag{2}$$

where $\mu_i = \langle x_i \rangle$ and $\mu_j = \langle x_j \rangle$ are the means of $x_i$ and $x_j$, respectively. The matrix $(V_{ij})$ of the quantities $V_{ij} = \mathrm{cov}(x_i, x_j)$ is called the covariance matrix.
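To make the definition concrete, the following is a minimal NumPy sketch of equation (2) and of the covariance matrix $(V_{ij})$; the data values are arbitrary, and population ($1/n$) averages are assumed for the expectations $\langle\cdot\rangle$:

```python
import numpy as np

# Arbitrary sample data for two variates x_1 and x_2 (illustrative only).
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0])

# Equation (2): cov(x_i, x_j) = <x_i x_j> - mu_i mu_j,
# with <.> taken as the population (1/n) average.
cov_12 = np.mean(x1 * x2) - np.mean(x1) * np.mean(x2)
print(cov_12)  # 2.0 for this data

# The covariance matrix (V_ij) collects cov(x_i, x_j) for every pair;
# bias=True selects the matching 1/n normalization.
V = np.cov(np.vstack([x1, x2]), bias=True)
print(V[0, 1])  # equals cov_12
```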
In the special case $i = j$,

$$\mathrm{cov}(x_i, x_i) = \left\langle x_i^2 \right\rangle - \mu_i^2 = \sigma_i^2, \tag{3}$$

giving the usual variance $\mathrm{var}(x_i) = \sigma_i^2$.
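Continuing the sketch above, equation (3) can be checked directly: the covariance of a variate with itself is its variance.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])  # arbitrary data

# Equation (3): cov(x, x) = <x^2> - mu^2 = sigma^2.
cov_xx = np.mean(x * x) - np.mean(x) ** 2
print(cov_xx, np.var(x))  # both 2.0: cov(x, x) is the usual variance
```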
Note that statistically independent variables are always uncorrelated, but the converse is not necessarily true.
The covariance of two variates $x_i$ and $x_j$ provides a measure of how strongly correlated these variables are, and the derived quantity

$$\mathrm{cor}(x_i, x_j) = \frac{\mathrm{cov}(x_i, x_j)}{\sigma_i \sigma_j}, \tag{4}$$

where $\sigma_i$, $\sigma_j$ are the standard deviations, is called the statistical correlation of $x_i$ and $x_j$. The covariance is symmetric since

$$\mathrm{cov}(x, y) = \mathrm{cov}(y, x). \tag{5}$$
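A short sketch of equation (4) with the same arbitrary data; NumPy's np.corrcoef returns the same quantity directly:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 1.0, 4.0, 3.0, 6.0])

# Equation (4): cor(x, y) = cov(x, y) / (sigma_x * sigma_y).
cov_xy = np.mean(x * y) - np.mean(x) * np.mean(y)
r = cov_xy / (np.std(x) * np.std(y))
print(r)  # ~0.822 for this data

print(np.corrcoef(x, y)[0, 1])  # same value, computed by NumPy
```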
For two variables, the covariance is related to the variance of the sum by

$$\mathrm{var}(x + y) = \mathrm{var}(x) + \mathrm{var}(y) + 2\,\mathrm{cov}(x, y). \tag{6}$$
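Equation (6) is easy to verify numerically; a minimal sketch:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])  # arbitrary data
y = np.array([2.0, 1.0, 4.0, 3.0, 6.0])

cov_xy = np.mean(x * y) - np.mean(x) * np.mean(y)

# Equation (6): var(x + y) = var(x) + var(y) + 2 cov(x, y).
lhs = np.var(x + y)
rhs = np.var(x) + np.var(y) + 2 * cov_xy
print(lhs, rhs)  # 8.96 and 8.96 for this data
```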
For two independent variates $x$ and $y$,

$$\mathrm{cov}(x, y) = \left\langle x y \right\rangle - \mu_x \mu_y = \left\langle x \right\rangle \left\langle y \right\rangle - \mu_x \mu_y = 0, \tag{7}$$
so the covariance is zero. However, if the variables are correlated in some way, then their covariance will be nonzero.
In fact, if $\mathrm{cov}(x, y) > 0$, then $y$ tends to increase as $x$ increases, and if $\mathrm{cov}(x, y) < 0$, then $y$ tends to decrease as $x$ increases.
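As a sanity check of equation (7) and of the sign interpretation, the sketch below draws independent samples (whose sample covariance is only approximately zero, since a finite sample merely estimates the expectations) alongside a positively correlated pair:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)  # independent standard normal draws
y = rng.normal(size=100_000)

def cov(a, b):
    """Population covariance <ab> - <a><b>, as in equation (2)."""
    return np.mean(a * b) - np.mean(a) * np.mean(b)

print(cov(x, y))  # close to 0: independent variates are uncorrelated

# If y tends to increase with x, the covariance is positive.
y_up = x + 0.5 * rng.normal(size=100_000)
print(cov(x, y_up))  # close to var(x) = 1, hence > 0
```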
The covariance obeys the identity

$$\mathrm{cov}(x + z, y) = \left\langle (x + z) y \right\rangle - \left\langle x + z \right\rangle \left\langle y \right\rangle \tag{8}$$
$$= \left\langle x y \right\rangle + \left\langle z y \right\rangle - \left( \left\langle x \right\rangle + \left\langle z \right\rangle \right) \left\langle y \right\rangle \tag{9}$$
$$= \left\langle x y \right\rangle - \left\langle x \right\rangle \left\langle y \right\rangle + \left\langle z y \right\rangle - \left\langle z \right\rangle \left\langle y \right\rangle \tag{10}$$
$$= \mathrm{cov}(x, y) + \mathrm{cov}(z, y). \tag{11}$$

By induction, it therefore follows that

$$\mathrm{cov}\left( \sum_{i=1}^n x_i,\ \sum_{j=1}^m y_j \right) = \sum_{i=1}^n \sum_{j=1}^m \mathrm{cov}(x_i, y_j). \tag{12}$$
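The additivity identity (8)-(11) and its induction to double sums in equation (12) can be checked numerically as well; a minimal sketch, with the numbers of summands chosen arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(1)

def cov(a, b):
    """Population covariance <ab> - <a><b>, as in equation (2)."""
    return np.mean(a * b) - np.mean(a) * np.mean(b)

n, m, size = 3, 4, 10_000
xs = [rng.normal(size=size) for _ in range(n)]
ys = [rng.normal(size=size) for _ in range(m)]

# Equation (12): cov(sum_i x_i, sum_j y_j) = sum_i sum_j cov(x_i, y_j).
lhs = cov(sum(xs), sum(ys))
rhs = sum(cov(xi, yj) for xi in xs for yj in ys)
print(np.isclose(lhs, rhs))  # True: covariance is bilinear
```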