The Unapologetic Mathematician

Mathematics for the interested outsider

The Complex Spectral Theorem

We’re now ready to characterize those transformations on complex vector spaces which have a diagonal matrix with respect to some orthonormal basis. First of all, such a transformation must be normal. If a transformation has a diagonal matrix, we can find the matrix of its adjoint by taking the conjugate transpose, and this will again be diagonal. Since any two diagonal matrices commute, the transformation must commute with its adjoint, and is therefore normal.
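As a quick numeric sanity check of this necessity direction (a sketch assuming numpy is available; none of this code is part of the original argument), we can verify that a diagonal complex matrix commutes with its conjugate transpose:

```python
import numpy as np

# A diagonal matrix with complex entries
D = np.diag([1 + 2j, 3 - 1j, -2j])

# Its adjoint is the conjugate transpose, which is again diagonal
D_star = D.conj().T

# Any two diagonal matrices commute, so D is normal
assert np.allclose(D @ D_star, D_star @ D)
```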

On the other hand, let’s start with a normal transformation N and see what happens as we try to diagonalize it. First, since we’re working over \mathbb{C} here, we can pick an orthonormal basis \left\{e_i\right\}_{i=1}^n that gives us an upper-triangular matrix. Now, I assert that when N is normal this matrix is already diagonal.

Let’s write out the matrices for N

\displaystyle\begin{pmatrix}a_{1,1}&\cdots&a_{1,n}\\&\ddots&\vdots\\{0}&&a_{n,n}\end{pmatrix}

and N^*

\displaystyle\begin{pmatrix}\overline{a_{1,1}}&&0\\\vdots&\ddots&\\\overline{a_{1,n}}&\cdots&\overline{a_{n,n}}\end{pmatrix}

Now we can see that N(e_1)=a_{1,1}e_1, while N^*(e_1)=\overline{a_{1,1}}e_1+\dots+\overline{a_{1,n}}e_n. Since the basis is orthonormal, it’s easy to calculate the squared-lengths of these two:

\displaystyle\begin{aligned}\lVert N(e_1)\rVert^2&=\lvert a_{1,1}\rvert^2\\\lVert N^*(e_1)\rVert^2&=\lvert a_{1,1}\rvert^2+\dots+\lvert a_{1,n}\rvert^2\end{aligned}

But since N is normal, \lVert N(v)\rVert=\lVert N^*(v)\rVert for every vector v, so these two must be the same. And so all the entries in the first row of our matrix other than a_{1,1} must be zero. We can then repeat this reasoning with the basis vector e_2 (now N(e_2)=a_{2,2}e_2, since the first step showed that a_{1,2}=0), and reach a similar conclusion about the second row, and so on until we see that all the entries above the diagonal must be zero.
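The whole argument can be tested numerically. Here’s a minimal sketch (again assuming numpy; the matrix and its eigenvalues are made up for illustration): we build a normal matrix with distinct eigenvalues, orthonormalize its eigenvectors, and confirm that conjugating by the resulting unitary matrix really does produce a diagonal matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a normal matrix N = Q D Q* from a random unitary Q
# (the QR decomposition of a random complex matrix yields a unitary Q)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
Q, _ = np.linalg.qr(A)
D = np.diag([1 + 1j, 2.0, -3j, 0.5 - 0.5j])  # distinct eigenvalues
N = Q @ D @ Q.conj().T

# N commutes with its adjoint, so it is normal
Ns = N.conj().T
assert np.allclose(N @ Ns, Ns @ N)

# Eigenvectors of a normal matrix for distinct eigenvalues are already
# orthogonal, so a QR step merely normalizes them into a unitary U
_, V = np.linalg.eig(N)
U, _ = np.linalg.qr(V)
assert np.allclose(U.conj().T @ U, np.eye(4))

# Conjugating by the unitary U diagonalizes N
T = U.conj().T @ N @ U
assert np.allclose(T, np.diag(T.diagonal()))
```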

That is, normality is not only necessary for a transformation to be diagonalized with respect to an orthonormal basis, it’s also sufficient. Any normal transformation on a finite-dimensional complex inner product space has an orthonormal basis of eigenvectors.

Now if we have an arbitrary orthonormal basis — say N is a transformation on \mathbb{C}^n with the standard basis already floating around — we may want to work with the matrix of N with respect to this basis. If this were our basis of eigenvectors, N would have the diagonal matrix \Lambda=\Bigl(\lambda_i\delta_{ij}\Bigr). But we may not be so lucky. Still, we can perform a change of basis using the basis of eigenvectors to fill in the columns of the change-of-basis matrix. And since we’re going from one orthonormal basis to another, this will be unitary!

Thus a normal transformation is not only equivalent to a diagonal transformation, it is unitarily equivalent. That is, the matrix of any normal transformation can be written as U\Lambda U^{-1} for a diagonal matrix \Lambda and a unitary matrix U. And any matrix which is unitarily equivalent to a diagonal matrix is normal. That is, if you take the subspace of diagonal matrices within the space of all matrices, then use the unitary group to act by conjugation on this subspace, the result is the subspace of all normal matrices, which represent normal transformations.
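This action can also be checked in the other direction with a short sketch (numpy assumed, matrices made up): conjugating an arbitrary diagonal matrix by an arbitrary unitary matrix always lands back in the set of normal matrices.

```python
import numpy as np

rng = np.random.default_rng(1)

# A unitary matrix (QR of a random complex matrix gives one)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
U, _ = np.linalg.qr(A)

# An arbitrary diagonal matrix
L = np.diag([2j, -1.0, 0.5 + 0.5j])

# Conjugating the diagonal matrix by the unitary yields a normal matrix
M = U @ L @ np.linalg.inv(U)
assert np.allclose(M @ M.conj().T, M.conj().T @ M)
```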

Often, you’ll see this written as U\Lambda U^*, which is really the same thing of course, but there’s an interesting semantic difference. Writing it using the inverse is a similarity, which is our notion of equivalence for transformations. So if we’re thinking of our matrix as acting on a vector space, this is the “right way” to think of the spectral theorem. On the other hand, using the conjugate transpose is a congruence, which is our notion of equivalence for bilinear forms. So if we’re thinking of our matrix as representing a bilinear form, this is the “right way” to think of the spectral theorem. But of course since we’re using unitary transformations here, it doesn’t matter! Unitary equivalence of endomorphisms and of bilinear forms is exactly the same thing.
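The last point is easy to see concretely: for a unitary matrix the inverse and the conjugate transpose coincide, so the similarity U\Lambda U^{-1} and the congruence U\Lambda U^* are literally the same matrix. A quick sketch (numpy assumed, matrices made up):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
U, _ = np.linalg.qr(A)        # unitary
L = np.diag([1.0, 1j, -2.0])  # diagonal

# For unitary U, the inverse equals the conjugate transpose,
# so similarity and congruence by U give the same result
assert np.allclose(np.linalg.inv(U), U.conj().T)
assert np.allclose(U @ L @ np.linalg.inv(U), U @ L @ U.conj().T)
```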

August 10, 2009 - Posted by | Algebra, Linear Algebra

9 Comments »

  1. Should your first sentence end “with respect to some orthonormal basis”?

    Comment by Å | August 11, 2009 | Reply

  2. Yeah, probably…

    Comment by John Armstrong | August 11, 2009 | Reply

  3. I’m really REALLY enjoying these.

    I’m going nuts over it being 2 weeks until I theoretically am teaching full-time High School Math someplace in Los Angeles County. But no interviews, as school districts are paralyzed by the legislature having rejected several Billion dollars (yes, $10^9) of Federal Funds because they don’t like the strings attached. Really. The legislators are lobbied by the teachers unions, who would rather their members be laid off than accept money tainted with the willingness to allow ANY use of student test scores in evaluating teacher merit.

    Tactically, it’s excruciating for me to do on-line applications with district-by-district tweaks of a State-based “EdJoin” online system, which requires endless attachments to everything, but won’t take .doc files or TIFF files, so that I physically had to scan 50 pages of hardcopy to blurry GIF files and attach them. It’s easy for newly graduated teachers, but I have 1/3 century of teaching and professorship and research and letters of recommendation to drag around, and they don’t imagine that this makes length-limit problems for online applications. Great. Chase away the over-qualified such as me and thee, and then complain that it’s hard to find good Math profs and secondary school teachers…

    Comment by Jonathan Vos Post | August 11, 2009 | Reply

  4. It’s intriguing how all the anglo societies are setting themselves up ASAP for their own Century of Humiliation. I guess the only bright spot is that, with the speed at which things move these days, it will only last a few decades.

    Comment by Avery Andrews | August 12, 2009 | Reply

  5. […] the last couple lemmas we’ve proven and throw them together to prove the real analogue of the complex spectral theorem. We start with a self-adjoint transformation on a finite-dimensional real inner-product space […]

    Pingback by The Real Spectral Theorem « The Unapologetic Mathematician | August 14, 2009 | Reply

  6. […] Singular Value Decomposition Now the real and complex spectral theorems give nice decompositions of self-adjoint and normal transformations, […]

    Pingback by The Singular Value Decomposition « The Unapologetic Mathematician | August 17, 2009 | Reply

  7. […] Here’s a neat thing we can do with the spectral theorems: we can take square roots of positive-semidefinite transformations. And this makes sense, […]

    Pingback by Square Roots « The Unapologetic Mathematician | August 20, 2009 | Reply

  8. […] we work with a normal transformation on a complex inner product space we can pick orthonormal basis of eigenvectors. That is, we can find a unitary transformation and a diagonal transformation so that […]

    Pingback by Decompositions Past and Future « The Unapologetic Mathematician | August 24, 2009 | Reply

