Properties of Covariance

Let’s examine how covariance behaves. In the next two sections we will use our observations to calculate variances of sample sums.

Establishing properties of covariance involves simple observations and routine algebra. We have done some of it below, and we expect that you can fill in the rest.

Recall that the covariance of $X$ and $Y$ is

$$
Cov(X, Y) ~ = ~ E[(X - \mu_X)(Y - \mu_Y)]
$$
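
For a concrete sense of this definition, here is a minimal NumPy sketch that computes a covariance directly from a joint distribution table. The table below is made up purely for illustration.

```python
import numpy as np

# Hypothetical joint distribution of X and Y, made up for illustration.
# Rows index the possible values of X, columns the possible values of Y.
x_vals = np.array([0, 1, 2])
y_vals = np.array([1, 3])
joint = np.array([[0.1, 0.2],
                  [0.3, 0.1],
                  [0.2, 0.1]])   # entries sum to 1

# Means mu_X and mu_Y from the marginal distributions
mu_X = x_vals @ joint.sum(axis=1)
mu_Y = y_vals @ joint.sum(axis=0)

# Cov(X, Y) = E[(X - mu_X)(Y - mu_Y)]: weight each product of
# deviations by the corresponding joint probability
cov_XY = np.sum(np.outer(x_vals - mu_X, y_vals - mu_Y) * joint)
print(cov_XY)
```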

Constants Don’t Vary

That title has a "duh" quality. But it's still worth noting that for any constant $c$,

$$
Var(c) ~ = ~ 0 ~~~~~~ \text{and} ~~~~~~ Cov(X, c) ~ = ~ 0
$$

for any random variable $X$. A constant doesn't deviate from its mean, so every product of deviations involving $c$ is 0.

Variance is a Covariance

Covariance is an extension of the concept of variance, because

$$
Cov(X, X) ~ = ~ E[(X - \mu_X)^2] ~ = ~ Var(X)
$$

The variance of $X$ is the covariance of $X$ and itself.

Covariance is Symmetric

Clearly $Cov(Y, X) = Cov(X, Y)$. It follows that the variance of a sum is the covariance of the sum with itself:

$$
Var(X + Y) ~ = ~ Cov(X + Y, X + Y)
$$

This way of thinking about the variance of a sum will be useful later.
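
If you'd like a quick numerical sanity check of this identity, the sketch below compares the two sides on simulated data. The distributions chosen are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = rng.uniform(size=100_000)

# Variance of the sum, and the covariance of the sum with itself.
# np.var divides by n (ddof=0), and bias=True makes np.cov divide by n
# as well, so the two quantities agree up to floating point error.
print(np.var(x + y))
print(np.cov(x + y, x + y, bias=True)[0, 1])
```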

Covariance and Expected Products

Covariance is an expected product: it is the expected product of deviations. It can also be written in terms of the expected product of $X$ and $Y$, as follows.

$$
\begin{align*}
Cov(X, Y) ~ &= ~ E[(X - \mu_X)(Y - \mu_Y)] \\
&= ~ E(XY) - \mu_Y E(X) - \mu_X E(Y) + \mu_X\mu_Y \\
&= ~ E(XY) - \mu_X\mu_Y \\
&= ~ E(XY) - E(X)E(Y)
\end{align*}
$$

So covariance is the mean of the product minus the product of the means. Take $X = Y$ to get the familiar fact that variance is the mean of the square minus the square of the mean.

This result simplifies proofs of facts about covariance, as you will see below. But as a computational tool, it is only useful when the distributions of $X$ and $Y$ are very simple – for example, when each has just a few possible values. In other calculations of covariance it is rarely a good idea to try to use this result. Rather, we will use the properties below.
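
Here is a quick simulation-based check of the expected-product formula. The data below are simulated and the particular distributions are arbitrary; the two computed quantities are algebraically identical for any data set.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=100_000)
y = x + rng.normal(size=100_000)   # dependent on x by construction

# Definition: expected product of deviations
cov_dev = np.mean((x - x.mean()) * (y - y.mean()))

# "Mean of the product minus the product of the means"
cov_prod = np.mean(x * y) - np.mean(x) * np.mean(y)

print(cov_dev, cov_prod)   # identical up to floating point error
```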

Addition Rule

A routine application of the property above shows that for any random variables $X$, $Y$, and $Z$,

$$
Cov(X + Y, Z) ~ = ~ Cov(X, Z) + Cov(Y, Z)
$$

Just write $Cov(X+Y, Z) = E[(X+Y)Z] - E(X+Y)E(Z)$, expand both products, and collect terms.
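
Covariances of empirical distributions obey the same algebra, so the addition rule can be checked exactly on any data set. A short NumPy sketch, with arbitrary simulated inputs:

```python
import numpy as np

rng = np.random.default_rng(2)
x, y, z = rng.normal(size=(3, 100_000))

def cov(a, b):
    # Covariance of the empirical distribution: E(AB) - E(A)E(B)
    return np.mean(a * b) - np.mean(a) * np.mean(b)

# Addition rule: Cov(X + Y, Z) = Cov(X, Z) + Cov(Y, Z)
print(cov(x + y, z))
print(cov(x, z) + cov(y, z))
```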

The Main Property: Bilinearity

This is the key to using covariance. First, easy algebra shows that for constants $a$ and $b$,

$$
Cov(aX, bY) ~ = ~ abCov(X, Y)
$$

Put this together with the addition rule to get

$$
Cov(aX + bY, Z) ~ = ~ aCov(X, Z) + bCov(Y, Z)
$$

You can see that covariance behaves like products. By induction, for constants $a_1, a_2, \ldots, a_m$ and $b_1, b_2, \ldots, b_n$,

$$
Cov\Big( \sum_{i=1}^m a_iX_i, ~ \sum_{j=1}^n b_jY_j \Big) ~ = ~ \sum_{i=1}^m \sum_{j=1}^n a_ib_j Cov(X_i, Y_j)
$$

That might look intimidating, but in fact this property greatly simplifies calculation. It says that you can expand covariance like the product of two sums. For example,

$$
Cov(X + Y, ~ 3X - 5Y) ~ = ~ 3Cov(X, X) - 5Cov(X, Y) + 3Cov(Y, X) - 5Cov(Y, Y)
$$

You can replace $Cov(X, X)$ by $Var(X)$ and $Cov(Y, Y)$ by $Var(Y)$; by symmetry, the two cross terms combine to $-2Cov(X, Y)$.
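
The expansion above can also be verified numerically, since the covariance of an empirical distribution satisfies the same bilinearity. In this sketch the data are arbitrary simulated values.

```python
import numpy as np

rng = np.random.default_rng(3)
x, y = rng.normal(size=(2, 100_000))

def cov(a, b):
    # Covariance of the empirical distribution: E(AB) - E(A)E(B)
    return np.mean(a * b) - np.mean(a) * np.mean(b)

# Expand Cov(X + Y, 3X - 5Y) term by term, as in the example above
lhs = cov(x + y, 3*x - 5*y)
rhs = 3*cov(x, x) - 5*cov(x, y) + 3*cov(y, x) - 5*cov(y, y)
print(lhs, rhs)   # equal up to floating point error
```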

These properties simplify calculations. But they don't give a sense of what covariance means. Indeed, it's not easy to interpret covariance physically, as it has nasty units: for example, if $X$ is a weight in kilograms and $Y$ is a height in centimeters, then the units of covariance are kilogram-centimeters. Later in the course we will see how to normalize covariance to get the correlation coefficient that you used so often in Data 8. For now, here is a property that begins to show you that covariance can be helpful in quantifying dependence and independence.

Independent Implies Uncorrelated

Let $X$ and $Y$ be independent. Then $E(XY) = E(X)E(Y)$, so

$$
Cov(X, Y) ~ = ~ E(XY) - E(X)E(Y) ~ = ~ 0
$$

Therefore if $X$ and $Y$ are independent, then $Cov(X, Y) = 0$. We say that $X$ and $Y$ are uncorrelated.

We have shown that independent random variables are uncorrelated. But it is not true that uncorrelated random variables have to be independent. You will show this in an exercise.
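
To see the first statement in action, here is a small simulation. $X$ and $Y$ are generated independently of each other; the particular distributions are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(size=1_000_000)
y = rng.exponential(size=1_000_000)   # drawn independently of x

# For independent samples the empirical covariance is not exactly 0,
# but it is close to 0, with chance variation shrinking as n grows.
print(np.mean(x * y) - np.mean(x) * np.mean(y))
```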