Suppose we have two random variables $R$ and $S$; they could be column vectors or scalars.
By definition, Covariance is:
$$\operatorname{Cov}(R, S) = E\!\left[(R - E[R])\,(S - E[S])^\top\right] = E[R S^\top] - E[R]\,E[S]^\top$$
By definition, Variance is:
$$\operatorname{Var}(R) = \operatorname{Cov}(R, R) = E\!\left[(R - E[R])\,(R - E[R])^\top\right]$$
Suppose $R$ is a $d$-dimensional vector; then the Covariance Matrix of $R$ is:
$$\operatorname{Var}(R) = \begin{bmatrix} \operatorname{Var}(R_1) & \operatorname{Cov}(R_1, R_2) & \cdots & \operatorname{Cov}(R_1, R_d) \\ \operatorname{Cov}(R_2, R_1) & \operatorname{Var}(R_2) & & \vdots \\ \vdots & & \ddots & \\ \operatorname{Cov}(R_d, R_1) & \cdots & & \operatorname{Var}(R_d) \end{bmatrix}$$
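As a quick sanity check of the two equivalent covariance formulas above, here is a minimal NumPy sketch (my own example, not from the course) that estimates a covariance matrix from samples both ways:

```python
import numpy as np

rng = np.random.default_rng(0)
# Draw n samples of a 3-dimensional random vector R (rows = samples).
n, d = 10_000, 3
X = rng.multivariate_normal(mean=[1.0, -2.0, 0.5],
                            cov=[[2.0, 0.6, 0.0],
                                 [0.6, 1.0, 0.3],
                                 [0.0, 0.3, 0.5]],
                            size=n)

mu = X.mean(axis=0)                       # estimate of E[R]
centered = X - mu
cov1 = centered.T @ centered / n          # E[(R - E[R])(R - E[R])^T]
cov2 = (X.T @ X) / n - np.outer(mu, mu)   # E[R R^T] - E[R] E[R]^T

print(np.allclose(cov1, cov2))            # True: the two formulas agree
```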
Warning
The Covariance matrix is symmetric and positive semidefinite (PSD); see the numerical check after the list below.
- If $R_i$ and $R_j$ are independent, then $\operatorname{Cov}(R_i, R_j) = 0$.
- The converse is not true: zero covariance does not imply independence.
- Thus, if the features are pairwise independent, then $\operatorname{Var}(R)$ is diagonal.
- Again, the converse is not true.
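A small numerical illustration of these facts (my sketch, assuming NumPy): a sample covariance matrix is symmetric with nonnegative eigenvalues, and independent features give an (approximately) diagonal covariance.

```python
import numpy as np

rng = np.random.default_rng(1)

# Independent features: each column is drawn separately.
X = np.column_stack([rng.normal(0, 1.0, 5000),
                     rng.normal(0, 2.0, 5000),
                     rng.normal(0, 0.5, 5000)])
Sigma = np.cov(X, rowvar=False)

print(np.allclose(Sigma, Sigma.T))   # symmetric
print(np.linalg.eigvalsh(Sigma))     # all eigenvalues >= 0 (up to rounding): PSD
print(np.round(Sigma, 2))            # off-diagonal entries near zero
```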
If all features are pairwise independent, then we can write the joint normal PDF as a product of univariate Gaussian PDFs:
$$f(x) = \prod_{i=1}^{d} \frac{1}{\sqrt{2\pi}\,\sigma_i} \exp\!\left(-\frac{(x_i - \mu_i)^2}{2\sigma_i^2}\right)$$
If $\operatorname{Var}(R)$ is diagonal, the isocontours of the PDF are axis-aligned ellipsoids: for the ellipsoid $(x - \mu)^\top \Sigma^{-1} (x - \mu) = 1$, the squared radii are the entries on the diagonal of $\Sigma = \operatorname{Var}(R)$.
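To see the factorization concretely, a short sketch (assuming SciPy; the numbers are mine) comparing the multivariate normal PDF with a diagonal covariance to the product of the corresponding univariate PDFs:

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

mu = np.array([1.0, -2.0])
sigma = np.array([0.8, 1.5])              # per-feature standard deviations
Sigma = np.diag(sigma**2)                 # diagonal covariance matrix

x = np.array([0.3, -1.1])
joint = multivariate_normal(mean=mu, cov=Sigma).pdf(x)
product = np.prod(norm.pdf(x, loc=mu, scale=sigma))

print(np.isclose(joint, product))         # True: the joint PDF factorizes
```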
Cite
So when the features are independent, you can write the multivariate Gaussian PDF as a product of univariate Gaussian PDFs. When they aren't, you can do a change of coordinates to the eigenvector coordinate system, and write it as a product of univariate Gaussian PDFs in eigenvector coordinates. You did something very similar in Q6.2 of Homework 2.
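For the non-diagonal case, here is a sketch of that change of coordinates (my construction, assuming NumPy/SciPy): diagonalize $\Sigma = V \Lambda V^\top$, project onto the eigenvectors, and the PDF factorizes in those coordinates.

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

mu = np.array([0.0, 0.0])
Sigma = np.array([[2.0, 1.2],
                  [1.2, 1.0]])            # non-diagonal covariance

lam, V = np.linalg.eigh(Sigma)            # Sigma = V @ diag(lam) @ V.T

x = np.array([0.7, -0.4])
z = V.T @ (x - mu)                        # coordinates along the eigenvectors

joint = multivariate_normal(mean=mu, cov=Sigma).pdf(x)
product = np.prod(norm.pdf(z, loc=0.0, scale=np.sqrt(lam)))

print(np.isclose(joint, product))         # True in eigenvector coordinates
```

This works because an orthogonal change of coordinates has Jacobian 1, the rotated vector $z = V^\top(x - \mu)$ has diagonal covariance $\Lambda$, and $|\Sigma| = \prod_i \lambda_i$.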