Useful prerequisite: Miscellaneous Facts about Matrices and Decomposition

Given a square matrix $A$, if $Av = \lambda v$ for some vector $v \neq 0$, then $v$ is an eigenvector of $A$ and $\lambda$ is the eigenvalue of $A$ associated with that eigenvector.
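
A minimal numerical sketch of this definition (assuming NumPy; the matrix below is an arbitrary example):

```python
import numpy as np

# An arbitrary example matrix, chosen only for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Columns of `eigenvectors` are the eigenvectors v; `eigenvalues` holds the lambdas.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Check the definition A v = lambda v for each pair.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)
```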

Theorem: If $v$ is an eigenvector of $A$ with eigenvalue $\lambda$, then $v$ is an eigenvector of $A^k$ with eigenvalue $\lambda^k$.

Proof: Apply $A$ repeatedly: $A^k v = A^{k-1}(Av) = \lambda\, A^{k-1} v = \cdots = \lambda^k v$.

Theorem: If $A$ is invertible, then $v$ is an eigenvector of $A^{-1}$ with eigenvalue $\frac{1}{\lambda}$.

Proof: $A^{-1} v = A^{-1}\left(\tfrac{1}{\lambda} A v\right) = \tfrac{1}{\lambda} A^{-1} A v = \tfrac{1}{\lambda} v$. (Note $\lambda \neq 0$, since $A$ is invertible.)
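
A quick numerical spot-check of both theorems (a sketch assuming NumPy; the matrix and the power $k$ are arbitrary choices):

```python
import numpy as np

# Arbitrary invertible example matrix.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
eigenvalues, eigenvectors = np.linalg.eig(A)

k = 3
A_k = np.linalg.matrix_power(A, k)
A_inv = np.linalg.inv(A)

for lam, v in zip(eigenvalues, eigenvectors.T):
    # v is an eigenvector of A^k with eigenvalue lambda^k ...
    assert np.allclose(A_k @ v, lam**k * v)
    # ... and an eigenvector of A^{-1} with eigenvalue 1/lambda.
    assert np.allclose(A_inv @ v, v / lam)
```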


Spectral Theorem

Every real, symmetric $n \times n$ matrix has real eigenvalues and $n$ mutually orthogonal eigenvectors.

  • If no eigenvalue is repeated (no multiplicity), the directions of the eigenvectors are unique. If an eigenvalue is repeated, we can choose eigenvectors within its eigenspace that are orthogonal to one another.
  • Orthogonality is important for the Spectral Theorem
  • We can use them as a basis for $\mathbb{R}^n$.
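
A sketch of both claims on a randomly generated symmetric matrix (assuming NumPy; `eigh` is the eigensolver for symmetric/Hermitian matrices):

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2.0                      # an arbitrary real symmetric matrix

eigenvalues, V = np.linalg.eigh(A)       # columns of V are eigenvectors

# Real eigenvalues, and mutually orthogonal (here orthonormal) eigenvectors.
assert np.isrealobj(eigenvalues)
assert np.allclose(V.T @ V, np.eye(5))
```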

Building a matrix with specified eigenvectors

Choose $n$ mutually orthogonal unit $n$-vectors $v_1, \dots, v_n$ (these will be the eigenvectors of our symmetric matrix). Let $V = [\,v_1 \; v_2 \; \cdots \; v_n\,]$; then $V^\top V = I$, so $V^\top = V^{-1}$ and $V V^\top = I$.

This $V$ is called an orthogonal matrix (the usual term in mathematics), also called an orthonormal matrix. An orthonormal matrix acts like a rotation or a reflection. Choose some eigenvalues $\lambda_1, \dots, \lambda_n$ and let $\Lambda = \mathrm{diag}(\lambda_1, \dots, \lambda_n)$:

Spectral Theorem: $A = V \Lambda V^\top = \sum_{i=1}^{n} \lambda_i v_i v_i^\top$.

Note that each term $\lambda_i v_i v_i^\top$ is an $n \times n$ matrix with rank at most 1.

This is a matrix factorization called the eigendecomposition. Every real symmetric matrix has such a decomposition; that is exactly what the spectral theorem says.
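
A small sketch of this construction (assuming NumPy; the vectors and eigenvalues are arbitrary choices):

```python
import numpy as np

# Two mutually orthogonal unit vectors (arbitrary choice).
v1 = np.array([1.0, 1.0]) / np.sqrt(2.0)
v2 = np.array([1.0, -1.0]) / np.sqrt(2.0)
V = np.column_stack([v1, v2])
Lam = np.diag([4.0, 1.0])                  # chosen eigenvalues

A = V @ Lam @ V.T                          # A = V Lambda V^T

# The same matrix as a sum of rank-1 terms lambda_i v_i v_i^T.
A_rank1 = 4.0 * np.outer(v1, v1) + 1.0 * np.outer(v2, v2)
assert np.allclose(A, A_rank1)

# V really is orthonormal, and the v_i are eigenvectors of A with the chosen eigenvalues.
assert np.allclose(V.T @ V, np.eye(2))
assert np.allclose(A @ v1, 4.0 * v1) and np.allclose(A @ v2, 1.0 * v2)
```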


Practice Exam Problem

Also note that

$A^2 = V \Lambda V^\top V \Lambda V^\top = V \Lambda^2 V^\top$: same eigenvectors, squared eigenvalues.

$A^{-1} = V \Lambda^{-1} V^\top$ (assuming $A$ is invertible): same eigenvectors, reciprocal eigenvalues.
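
Continuing with the same notation, a numerical check (assuming NumPy; the matrix is an arbitrary invertible symmetric example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])                 # symmetric and invertible
eigenvalues, V = np.linalg.eigh(A)
Lam = np.diag(eigenvalues)

# A^2 = V Lambda^2 V^T: same eigenvectors, squared eigenvalues.
assert np.allclose(A @ A, V @ Lam**2 @ V.T)

# A^{-1} = V Lambda^{-1} V^T: same eigenvectors, reciprocal eigenvalues.
assert np.allclose(np.linalg.inv(A), V @ np.diag(1.0 / eigenvalues) @ V.T)
```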


Theorem: Symmetric Square Root

Given a symmetric PSD matrix $\Sigma$, we can find a symmetric square root $A = \Sigma^{1/2}$ such that $A^2 = \Sigma$. Equivalently, if $\Sigma = V \Lambda V^\top$ is its eigendecomposition, then $A = V \Lambda^{1/2} V^\top$ works, since $A^2 = V \Lambda^{1/2} (V^\top V) \Lambda^{1/2} V^\top = V \Lambda V^\top = \Sigma$.

Cholesky Decomposition

Don’t confuse the symmetric square root with the Cholesky decomposition. The Cholesky decomposition says: given that $\Sigma$ is PD (positive definite, i.e., all of its eigenvalues are strictly greater than 0), we can write $\Sigma = L L^\top$, where $L$ is a lower triangular matrix.
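
A sketch contrasting the two factorizations (assuming NumPy; $\Sigma$ below is an arbitrary positive definite example):

```python
import numpy as np

Sigma = np.array([[4.0, 2.0],
                  [2.0, 3.0]])             # symmetric, positive definite

L = np.linalg.cholesky(Sigma)              # lower triangular factor
assert np.allclose(L @ L.T, Sigma)         # Sigma = L L^T
assert np.allclose(L, np.tril(L))          # L is lower triangular

# Unlike the symmetric square root, L is generally not symmetric.
assert not np.allclose(L, L.T)
```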

Given a symmetric PSD matrix $\Sigma$, we can find its symmetric square root $A = \Sigma^{1/2}$ as follows:

  • Compute the eigenvectors / eigenvalues of $\Sigma$, giving $\Sigma = V \Lambda V^\top$.
  • Take the square roots of the eigenvalues: $\Lambda^{1/2} = \mathrm{diag}(\sqrt{\lambda_1}, \dots, \sqrt{\lambda_n})$.
  • Reassemble the matrix: $A = V \Lambda^{1/2} V^\top$.
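
The three steps above, sketched with NumPy ($\Sigma$ is an arbitrary PSD example, here actually PD):

```python
import numpy as np

Sigma = np.array([[4.0, 2.0],
                  [2.0, 3.0]])             # symmetric PSD example

eigenvalues, V = np.linalg.eigh(Sigma)     # step 1: eigenvectors / eigenvalues of Sigma
sqrt_Lam = np.diag(np.sqrt(eigenvalues))   # step 2: square roots of the eigenvalues
A = V @ sqrt_Lam @ V.T                     # step 3: reassemble A = V Lambda^{1/2} V^T

assert np.allclose(A, A.T)                 # A is symmetric
assert np.allclose(A @ A, Sigma)           # and A^2 = Sigma
```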

Visualizing Quadratic Form