## Symmetric, Antisymmetric, and Hermitian Forms

The simplest structure we can look for in our bilinear forms is that they be symmetric, antisymmetric, or (if we’re working over the complex numbers) Hermitian. A symmetric form gives the same answer, an antisymmetric form negates the answer, and a Hermitian form conjugates the answer when we swap inputs. Thus if we have a symmetric form given by the linear operator $S$, an antisymmetric form given by the operator $A$, and a Hermitian form given by $H$, we can write

$$\langle v,Sw\rangle=\langle w,Sv\rangle$$
$$\langle v,Aw\rangle=-\langle w,Av\rangle$$
$$\langle v,Hw\rangle=\overline{\langle w,Hv\rangle}$$
Each of these conditions can immediately be translated into a condition on the corresponding linear operator. We’ll flip over each of the terms on the left, using the symmetry of the inner product and the adjoint property. In the third line, though, we’ll use the *conjugate*-symmetry of the complex inner product.

$$\langle w,S^*v\rangle=\langle w,Sv\rangle$$
$$\langle w,A^*v\rangle=-\langle w,Av\rangle$$
$$\overline{\langle w,H^*v\rangle}=\overline{\langle w,Hv\rangle}$$

We can conjugate both sides of the last line to simplify it. Similarly, we can use linearity in the second line to rewrite

$$\langle w,S^*v\rangle=\langle w,Sv\rangle$$
$$\langle w,A^*v\rangle=\langle w,(-A)v\rangle$$
$$\langle w,H^*v\rangle=\langle w,Hv\rangle$$
Now in each line we have one operator on the left and another operator on the right, and these operators give rise to the same forms. I say that this means the operators themselves must be the same. To show this, consider the general case

$$\langle w,Bv\rangle=\langle w,Cv\rangle$$

Pulling both forms to one side and using linearity we find

$$\langle w,(B-C)v\rangle=0$$

Now, if the difference $B-C$ is not the zero transformation, then there is some $v$ so that $(B-C)v\neq0$. Then we can consider the case $w=(B-C)v$:

$$\langle(B-C)v,(B-C)v\rangle=\lVert(B-C)v\rVert^2\neq0$$

This contradicts the previous equation, and so we must have $B=C$.
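This fact that a form determines its operator can be checked concretely in matrix terms (a quick numerical sketch, not from the post; the matrix here is invented): the value of the form on the pair of standard basis vectors $e_i$, $e_j$ is exactly the $(i,j)$ entry of the matrix, so two matrices giving the same form on all pairs of basis vectors are equal.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
B = rng.standard_normal((n, n))

# The form <e_i, B e_j> recovers the matrix entry B[i, j], so the form
# determines the operator completely.
E = np.eye(n)
recovered = np.array([[E[:, i] @ B @ E[:, j] for j in range(n)]
                      for i in range(n)])
print(np.allclose(recovered, B))  # True
```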

In particular, this shows that if we have a symmetric form, it’s described by a self-adjoint transformation $S^*=S$. Hermitian forms are also described by self-adjoint transformations $H^*=H$. And antisymmetric forms are described by “skew-adjoint” transformations $A^*=-A$.
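In matrix terms, these three conditions are easy to test numerically. Here is a small sketch (the matrices are random, built for illustration): symmetrizing, antisymmetrizing, and Hermitian-symmetrizing a matrix produces forms that are respectively unchanged, negated, and conjugated when the inputs are swapped.

```python
import numpy as np

rng = np.random.default_rng(0)

def form(B, v, w):
    """The form <v, Bw> given by the operator (matrix) B."""
    return np.vdot(v, B @ w)  # np.vdot conjugates its first argument

n = 4
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
S = (M.real + M.real.T) / 2   # symmetric: S^T = S
A = (M.real - M.real.T) / 2   # skew-symmetric (skew-adjoint): A^T = -A
H = (M + M.conj().T) / 2      # Hermitian (self-adjoint): H^* = H

v = rng.standard_normal(n) + 1j * rng.standard_normal(n)
w = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# Swapping the inputs: same value, negated value, conjugated value.
print(np.isclose(form(S, v.real, w.real), form(S, w.real, v.real)))   # True
print(np.isclose(form(A, v.real, w.real), -form(A, w.real, v.real)))  # True
print(np.isclose(form(H, v, w), np.conj(form(H, w, v))))              # True
```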

So what’s the difference between a symmetric and a Hermitian form? It’s all in the fact that a symmetric form is based on a vector space with a symmetric inner product, while a Hermitian form is based on a complex vector space with a conjugate-symmetric inner product. The different properties of the two inner products account for the different ways that adjoint transformations behave.

How much of this can be generalized to vector spaces over a field other than the complex numbers? For instance, if conjugation is replaced by some field automorphism, does the spectral theorem still hold for Hermitian forms?

Comment by Akhil Mathew | July 11, 2009 |

Well, I don’t know offhand. But it’s worth looking into, as I approach the spectral theorem.

One thing I’ll say is that it seems you not only need an automorphism, but one that relates to some sensible norm in a similar way. I’m not really an expert on these things, but it might be that you need to consider modules over a C* algebra to get the appropriate generalization.

Comment by John Armstrong | July 11, 2009 |

But does the spectral theorem for finite-dimensional vector spaces even need a norm? At least, doesn’t the version for symmetric bilinear forms work over arbitrary fields (say of characteristic $\neq 2$)?

Comment by Akhil Mathew | July 11, 2009 |

Interesting question: the relevant statement is, I think, “Sylvester’s law of inertia”. The wikipedia page http://en.wikipedia.org/wiki/Symmetric_bilinear_form#Signature_and_Sylvester.27s_law_of_inertia seems to suggest that this doesn’t hold over an arbitrary field (it says “Ordered Field”, but clearly it also works for the complex numbers).

The obvious problem is that working in $\mathbb{F}_p$, say, there is no notion of “positive” or “negative”. But it’s not hard to see that the number of zeroes down the diagonal must be basis-independent: this is just the dimension of the “radical”.
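Over the reals this count can be illustrated numerically (a sketch with an invented matrix, not from the comment): the eigenvalues of a real symmetric matrix are real, and the numbers of positive, negative, and zero eigenvalues give Sylvester’s signature, with the zero count being the dimension of the radical.

```python
import numpy as np

# A symmetric matrix with a nontrivial radical (its last row and column
# are zero, so e_3 pairs to zero with everything).
Q = np.array([[2.0,  1.0, 0.0],
              [1.0, -3.0, 0.0],
              [0.0,  0.0, 0.0]])

eigvals = np.linalg.eigvalsh(Q)  # real eigenvalues of a symmetric matrix
pos = int(np.sum(eigvals > 1e-12))
neg = int(np.sum(eigvals < -1e-12))
zero = len(eigvals) - pos - neg  # dimension of the radical

print(pos, neg, zero)  # → 1 1 1
```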

The beauty about what John is doing (over the reals, and especially the complexes) is that it all generalises beautifully to the infinite-dimensional setting: here you really do need a norm, as you need to work with complete spaces.

Comment by Matt Daws | July 13, 2009 |

Yes, once I get around to more functional analysis, I’ll be able to import a lot of this stuff wholesale. And, again, I’ll admit to not being an expert on finite characteristic.

Comment by John Armstrong | July 13, 2009 |

[…] must be symmetric (or conjugate-symmetric). This is satisfied by picking our transformation to be symmetric (or hermitian). But we also need our form to be “positive-definite”. That is, we […]

Pingback by Positive-Definite Transformations « The Unapologetic Mathematician | July 13, 2009 |

[…] In this way, unitary and orthogonal transformations are related in a way similar to that in which Hermitian and symmetric forms are […]

Pingback by Unitary Transformations « The Unapologetic Mathematician | July 28, 2009 |

[…] That is, the adjoint matrix is the conjugate transpose. This isn’t really anything new, since we essentially saw it when we considered Hermitian matrices. […]

Pingback by Unitary and Orthogonal Matrices « The Unapologetic Mathematician | July 29, 2009 |

[…] in our analogy — self-adjoint and unitary (or orthogonal), and even anti-self-adjoint (antisymmetric and “skew-Hermitian”) transformations satisfying — all satisfy one slightly […]

Pingback by Normal Transformations « The Unapologetic Mathematician | August 5, 2009 |

[…] I said above, this is a bilinear form. Further, Clairaut’s theorem tells us that it’s a symmetric form. Then the spectral theorem tells us that we can find an orthonormal basis with respect to […]

Pingback by Classifying Critical Points « The Unapologetic Mathematician | November 24, 2009 |

[…] already seen that the composition of a linear transformation and its adjoint is self-adjoint and positive-definite. In terms of complex matrices, this tells us that the product of a matrix and […]

Pingback by Hom Space Duals « The Unapologetic Mathematician | October 13, 2010 |