Independence


If the elements of $\mathbf{X}$ are mutually independent, then $Cov(X_i, X_j) = 0$ for all $i \ne j$, and hence the covariance matrix $\boldsymbol{\Sigma}$ is diagonal with $i$th diagonal element equal to $Var(X_i)$.

In the other direction, zero covariance doesn’t imply independence, and pairwise independence doesn’t imply mutual independence. But the multivariate normal is a wonderful distribution:

If $\mathbf{X}$ is multivariate normal and its elements are pairwise uncorrelated – that is, $Cov(X_i, X_j) = 0$ for all $i \ne j$ – then the elements of $\mathbf{X}$ are mutually independent.

That is, multivariate normal random variables are mutually independent if and only if they are uncorrelated.
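As a quick numerical illustration (not a proof), here is a simulation sketch in Python; the means, variances, and thresholds are arbitrary choices. We draw from a bivariate normal with zero covariance and check that a joint probability factors into the product of the marginal probabilities, as independence requires.

```python
import numpy as np

# Sample from a bivariate normal whose covariance matrix is diagonal,
# i.e. X1 and X2 are uncorrelated. (Means, variances, and the
# thresholds a, b below are arbitrary choices for this illustration.)
rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0])
Sigma = np.diag([4.0, 9.0])
X = rng.multivariate_normal(mu, Sigma, size=1_000_000)

# Independence requires P(X1 > a, X2 > b) = P(X1 > a) * P(X2 > b).
a, b = 2.0, 0.0
p_joint = np.mean((X[:, 0] > a) & (X[:, 1] > b))
p_product = np.mean(X[:, 0] > a) * np.mean(X[:, 1] > b)
print(p_joint, p_product)   # the two values should be close
```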

This is easy to see from the form of the density of $\mathbf{X}$. If $\boldsymbol{\Sigma}$ is a diagonal matrix then so is $\boldsymbol{\Sigma}^{-1}$. The $i$th diagonal element of $\boldsymbol{\Sigma}^{-1}$ is $1/\sigma_i^2$ where $\sigma_i^2 = Var(X_i)$. So

$$
(\mathbf{x} - \boldsymbol{\mu})^T \boldsymbol{\Sigma}^{-1} (\mathbf{x} - \boldsymbol{\mu}) = \sum_{i=1}^n \frac{(x_i - \mu_i)^2}{\sigma_i^2}
$$

and therefore

$$
\exp\left(-\frac{1}{2}(\mathbf{x} - \boldsymbol{\mu})^T \boldsymbol{\Sigma}^{-1} (\mathbf{x} - \boldsymbol{\mu})\right) = \prod_{i=1}^n \exp\left(-\frac{(x_i - \mu_i)^2}{2\sigma_i^2}\right)
$$

In the constant of integration, $\det(\boldsymbol{\Sigma}) = \sigma_1^2 \sigma_2^2 \cdots \sigma_n^2$.

Therefore the density of $\mathbf{X}$ is the product of the marginal normal densities.
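A short numerical check of this factorization, with assumed means and variances: the joint density of a multivariate normal with a diagonal covariance matrix should agree with the product of the marginal normal densities at any set of points.

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

# Assumed means and standard deviations for the check.
mu = np.array([0.0, 1.0, -1.0])
sigmas = np.array([1.0, 2.0, 0.5])
Sigma = np.diag(sigmas ** 2)        # diagonal covariance matrix

rng = np.random.default_rng(1)
pts = rng.normal(size=(5, 3))       # arbitrary evaluation points

# Joint density versus the product of the marginal densities.
joint = multivariate_normal(mean=mu, cov=Sigma).pdf(pts)
product = np.prod(norm.pdf(pts, loc=mu, scale=sigmas), axis=1)
print(np.allclose(joint, product))  # True
```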

Sum and Difference, Revisited

Let $\mathbf{X} = [X_1, X_2]^T$ have a bivariate normal distribution. Let $S = X_1 + X_2$ and $D = X_1 - X_2$. We know that $S$ and $D$ have a bivariate normal distribution and that

$$
Cov(S, D) = Cov(X_1 + X_2, X_1 - X_2) = Var(X_1) - Var(X_2)
$$

since the two cross-covariance terms cancel by bilinearity.

If $X_1$ and $X_2$ have the same variance, then $S$ and $D$ are uncorrelated, and hence also independent by what we have just proved. Thus, for example, the sum and difference of two i.i.d. normal random variables are independent.
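Here is a minimal simulation sketch of that last claim, with arbitrary parameters: for two i.i.d. normal variables, the empirical covariance of $S$ and $D$ should be near zero, and joint probabilities involving $S$ and $D$ should approximately factor.

```python
import numpy as np

# Two i.i.d. normal samples; the mean, SD, and sample size are
# arbitrary choices for this illustration.
rng = np.random.default_rng(2)
X1 = rng.normal(loc=3.0, scale=2.0, size=1_000_000)
X2 = rng.normal(loc=3.0, scale=2.0, size=1_000_000)
S, D = X1 + X2, X1 - X2

print(np.cov(S, D)[0, 1])           # near 0, as the theory predicts

# Independence check on a nonlinear event: P(S > s, |D| > d)
# should be close to P(S > s) * P(|D| > d).
s, d = 7.0, 3.0
print(np.mean((S > s) & (np.abs(D) > d)),
      np.mean(S > s) * np.mean(np.abs(D) > d))
```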