# The Unapologetic Mathematician

## The Real Spectral Theorem

Let’s take the last couple lemmas we’ve proven and throw them together to prove the real analogue of the complex spectral theorem. We start with a self-adjoint transformation $S:V\rightarrow V$ on a finite-dimensional real inner-product space $V$.

First off, since $S$ is self-adjoint, we know that it has an eigenvector $e_1$, which we can pick to have unit length (how?). The subspace $\mathbb{R}e_1$ is then invariant under the action of $S$. But then the orthogonal complement $V_1=\left(\mathbb{R}e_1\right)^\perp$ is also invariant under $S$. So we can restrict $S$ to a transformation $S_1:V_1\rightarrow V_1$.

It’s not too hard to see that $S_1$ is also self-adjoint, and so it must have an eigenvector $e_2$, which will also be an eigenvector of $S$. And we’ll get an orthogonal complement $V_2$, and so on. Since every step we take reduces the dimension of the vector space we’re looking at by one, we must eventually bottom out. At that point, we have an orthonormal basis of eigenvectors for our original space $V$. Each eigenvector was picked to have unit length, and each one is in the orthogonal complement of those that came before, so they’re all orthogonal to each other.
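The inductive construction above can be sketched numerically. This is only an illustration of the "peel off an eigenvector, restrict to the complement, repeat" argument, not how one would diagonalize in practice; I lean on NumPy's `eigh` just to produce a single unit eigenvector of each restricted transformation, and the function name is my own invention.

```python
import numpy as np

def eigenbasis_by_deflation(S):
    """Build an orthonormal eigenbasis of a symmetric matrix S by the
    construction in the text: pick a unit eigenvector, restrict S to the
    orthogonal complement, and repeat until the dimension bottoms out."""
    n = S.shape[0]
    basis = []
    Q = np.eye(n)                 # columns: orthonormal basis of the current complement V_k
    for _ in range(n):
        S_k = Q.T @ S @ Q         # matrix of S restricted to V_k
        _, U = np.linalg.eigh(S_k)
        v = Q @ U[:, 0]           # lift one unit eigenvector back to V
        basis.append(v)
        Q = Q @ U[:, 1:]          # orthonormal basis of the next complement V_{k+1}
    return np.column_stack(basis)
```

Each column of the result is a unit eigenvector of $S$, and the columns are mutually orthogonal because each was chosen inside the orthogonal complement of its predecessors.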

Just like in the complex case, if we have a basis and a matrix already floating around for $S$, we can use this new basis to perform a change of basis, which will be orthogonal (not unitary in this case). That is, we can write the matrix of any self-adjoint transformation $S$ as $O\Lambda O^{-1}$, where $O$ is an orthogonal matrix and $\Lambda$ is diagonal. Alternatively, since $O^{-1}=O^*$, we can think of this as $O\Lambda O^*$, in case we’re considering our transformation as representing a bilinear form (which self-adjoint transformations often are).
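We can check this factorization numerically. NumPy's `np.linalg.eigh` returns exactly such an orthogonal diagonalization for a real symmetric matrix (the matrix of a self-adjoint transformation in an orthonormal basis):

```python
import numpy as np

rng = np.random.default_rng(42)
A = rng.standard_normal((4, 4))
S = (A + A.T) / 2            # symmetrize: S is the matrix of a self-adjoint transformation

w, O = np.linalg.eigh(S)     # w: eigenvalues, O: orthonormal eigenvectors as columns
Lam = np.diag(w)

assert np.allclose(O.T @ O, np.eye(4))   # O is orthogonal, so O^{-1} = O^*
assert np.allclose(O @ Lam @ O.T, S)     # S = O Λ O^*
```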

What if we’ve got this sort of representation? A transformation with a matrix of the form $O\Lambda O^*$ must be self-adjoint. Indeed, we can take its adjoint to find

$\displaystyle\left(O\Lambda O^*\right)^*=\left(O^*\right)^*\Lambda^*O^*=O\Lambda^*O^*$

but since $\Lambda$ is real and diagonal, it’s automatically symmetric: $\Lambda^*=\Lambda$. So the adjoint $O\Lambda^*O^*$ is just $O\Lambda O^*$ all over again, and the transformation equals its own adjoint. Thus if a real transformation has an orthonormal basis of eigenvectors, it must be self-adjoint.
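The converse direction is just as easy to check numerically: take any orthogonal $O$ (here, a sketch using the $Q$ factor of a QR decomposition of a random matrix) and any real diagonal $\Lambda$, and the product $O\Lambda O^*$ comes out symmetric.

```python
import numpy as np

rng = np.random.default_rng(1)
# an arbitrary orthogonal O: the Q factor of a QR decomposition of a random matrix
O, _ = np.linalg.qr(rng.standard_normal((4, 4)))
Lam = np.diag(rng.standard_normal(4))    # real diagonal, hence automatically symmetric

T = O @ Lam @ O.T
assert np.allclose(T, T.T)               # T is symmetric, i.e. self-adjoint
```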

Notice that this is a somewhat simpler characterization than in the complex case. This hinges on the fact that for real transformations taking the adjoint corresponds to simple matrix transposition, and every diagonal matrix is automatically symmetric. For complex transformations, taking the adjoint corresponds to conjugate transposition, and not all diagonal matrices are Hermitian. That’s why we had to expand to the broader class of normal transformations.

August 14, 2009 Posted by | Algebra, Linear Algebra | 10 Comments