Independence

If the elements of $\mathbf{X}$ are mutually independent then $Cov(X_i, X_j) = 0$ for all $i \ne j$, and hence the covariance matrix $\Sigma$ is a diagonal matrix whose $i$th diagonal element is $Var(X_i)$.
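
As a quick numerical illustration, here is a small NumPy sketch (the variances 1, 4, and 9 are arbitrary choices) showing that the empirical covariance matrix of mutually independent normal variables is close to diagonal:

```python
import numpy as np

rng = np.random.default_rng(0)

# Three mutually independent normal variables with variances 1, 4, and 9
n = 100_000
X = np.column_stack([
    rng.normal(0, 1, n),
    rng.normal(0, 2, n),
    rng.normal(0, 3, n),
])

# Columns are the variables: the off-diagonal entries of the sample
# covariance matrix are near 0, and the diagonal entries are near 1, 4, 9
print(np.round(np.cov(X, rowvar=False), 2))
```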

In the other direction, zero covariance doesn’t imply independence, and pairwise independence doesn’t imply mutual independence. But the multivariate normal is a wonderful distribution:

If $\mathbf{X}$ is multivariate normal and its elements are pairwise uncorrelated – that is, $Cov(X_i, X_j) = 0$ for all $i \ne j$ – then the elements of $\mathbf{X}$ are mutually independent.

That is, multivariate normal random variables are independent if and only if they are uncorrelated.

This is easy to see from the form of the density of $\mathbf{X}$. If $\Sigma$ is a diagonal matrix then so is $\Sigma^{-1}$. The $i$th diagonal element of $\Sigma^{-1}$ is $1/\sigma_i^2$ where $\sigma_i^2 = Var(X_i)$. So

$$
(\mathbf{x} - \boldsymbol{\mu})^T \Sigma^{-1} (\mathbf{x} - \boldsymbol{\mu}) ~ = ~ \sum_{i=1}^n \frac{(x_i - \boldsymbol{\mu}(i))^2}{\sigma_i^2}
$$

and therefore

$$
\exp\Big(-\frac{1}{2}(\mathbf{x} - \boldsymbol{\mu})^T \Sigma^{-1} (\mathbf{x} - \boldsymbol{\mu})\Big) ~ = ~ \prod_{i=1}^n \exp\Big(-\frac{1}{2}\Big(\frac{x_i - \boldsymbol{\mu}(i)}{\sigma_i}\Big)^2\Big)
$$

In the constant of integration, $\det(\Sigma) = \sigma_1^2 \sigma_2^2 \cdots \sigma_n^2$.

Therefore the density of $\mathbf{X}$ is the product of the marginal normal densities.
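
To see this factorization numerically, here is a small sketch (assuming SciPy is available; the mean vector, variances, and test point are arbitrary choices) comparing the joint density with diagonal $\Sigma$ to the product of the marginal normal densities:

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

mu = np.array([1.0, -2.0, 0.5])        # mean vector
sigma2 = np.array([1.0, 4.0, 9.0])     # variances on the diagonal of Sigma
Sigma = np.diag(sigma2)

x = np.array([0.3, -1.0, 2.0])         # arbitrary test point

# Joint density of the multivariate normal at x
joint = multivariate_normal(mean=mu, cov=Sigma).pdf(x)

# Product of the marginal normal densities at the coordinates of x
product = np.prod(norm.pdf(x, loc=mu, scale=np.sqrt(sigma2)))

print(joint, product)                  # the two values agree
```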

Sum and Difference, Revisited

Let $\mathbf{X} = [X_1, X_2]^T$ have a bivariate normal distribution. Let $S = X_1 + X_2$ and $D = X_1 - X_2$. We know that $S$ and $D$ have a bivariate normal distribution and that

$$
Cov(S, D) ~ = ~ Var(X_1) - Var(X_2)
$$
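
This follows from the bilinearity of covariance, since the two cross terms cancel:

$$
Cov(S, D) ~ = ~ Cov(X_1, X_1) - Cov(X_1, X_2) + Cov(X_2, X_1) - Cov(X_2, X_2) ~ = ~ Var(X_1) - Var(X_2)
$$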

If $X_1$ and $X_2$ have the same variance then $S$ and $D$ are uncorrelated, and hence also independent by what we have just proved.

Thus, for example, the sum and difference of two i.i.d. normal random variables are independent.

You have shown in exercises that the sum and difference of any two i.i.d. random variables are uncorrelated. If in addition the two variables are normal, then their sum and difference are independent, not just uncorrelated.
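
Here is a quick simulation sketch of this last fact (the seed, sample size, mean, and SD are arbitrary choices): for i.i.d. normal variables, the sample correlation of the sum and the difference is close to zero, consistent with their independence.

```python
import numpy as np

rng = np.random.default_rng(0)

# i.i.d. normal variables with mean 3 and SD 2
n = 100_000
X1 = rng.normal(3, 2, n)
X2 = rng.normal(3, 2, n)

S = X1 + X2
D = X1 - X2

# Sample correlation of S and D: close to 0
print(np.corrcoef(S, D)[0, 1])
```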