The Unapologetic Mathematician

Mathematics for the interested outsider

Symmetric, Antisymmetric, and Hermitian Forms

The simplest structure we can look for in our bilinear forms is that they be symmetric, antisymmetric, or (if we’re working over the complex numbers) Hermitian. A symmetric form gives the same answer, an antisymmetric form negates the answer, and a Hermitian form conjugates the answer when we swap inputs. Thus if we have a symmetric form given by the linear operator S, an antisymmetric form given by the operator A, and a Hermitian form given by H, we can write

\displaystyle\begin{aligned}\langle w\rvert S\lvert v\rangle&=\langle v\rvert S\lvert w\rangle\\\langle w\rvert A\lvert v\rangle&=-\langle v\rvert A\lvert w\rangle\\\langle w\rvert H\lvert v\rangle&=\overline{\langle v\rvert H\lvert w\rangle}\end{aligned}
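These three conditions are easy to check numerically. Here's a quick sketch in NumPy; the matrices S, A, and H and the test vectors are arbitrary examples of my own (not anything from the post), built so that S is symmetric, A is antisymmetric, and H is Hermitian.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary example operators (assumptions for illustration only):
M = rng.standard_normal((3, 3))
S = M + M.T                       # symmetric: S equals its transpose
A = M - M.T                       # antisymmetric: A equals minus its transpose
C = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
H = C + C.conj().T                # Hermitian: H equals its conjugate transpose

v = rng.standard_normal(3)
w = rng.standard_normal(3)
vc = v + 1j * rng.standard_normal(3)   # complex vectors for the Hermitian case
wc = w + 1j * rng.standard_normal(3)

def form(x, B, y):
    """<x|B|y>: the inner product of x with B(y), conjugate-linear in x."""
    return np.vdot(x, B @ y)

# Swapping the inputs: same value, negated value, conjugated value.
assert np.isclose(form(w, S, v), form(v, S, w))
assert np.isclose(form(w, A, v), -form(v, A, w))
assert np.isclose(form(wc, H, vc), np.conj(form(vc, H, wc)))
```

Note that over the reals the conjugation in `np.vdot` does nothing, so the same function serves as the real inner product in the first two checks.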

Each of these conditions can immediately be translated into a condition on the corresponding linear operator. We’ll flip over each of the terms on the left, using the symmetry of the inner product and the adjoint property. In the third line, though, we’ll use the conjugate-symmetry of the complex inner product.

\displaystyle\begin{aligned}\langle v\rvert S^*\lvert w\rangle&=\langle v\rvert S\lvert w\rangle\\\langle v\rvert A^*\lvert w\rangle&=-\langle v\rvert A\lvert w\rangle\\\overline{\langle v\rvert H^*\lvert w\rangle}&=\overline{\langle v\rvert H\lvert w\rangle}\end{aligned}

We can conjugate both sides of the last line to simplify it. Similarly, we can use linearity in the second line to rewrite

\displaystyle\begin{aligned}\langle v\rvert S^*\lvert w\rangle&=\langle v\rvert S\lvert w\rangle\\\langle v\rvert A^*\lvert w\rangle&=\langle v\rvert-A\lvert w\rangle\\\langle v\rvert H^*\lvert w\rangle&=\langle v\rvert H\lvert w\rangle\end{aligned}

Now in each line we have one operator on the left and another operator on the right, and these operators give rise to the same forms. I say that this means the operators themselves must be the same. To show this, consider the general case

\displaystyle\langle v\rvert B_1\lvert w\rangle=\langle v\rvert B_2\lvert w\rangle

Pulling both forms to one side and using linearity we find

\displaystyle\langle v\rvert B_1-B_2\lvert w\rangle=0

Now, if the difference B_1-B_2 is not the zero transformation, then there is some w_0 so that B_1(w_0)-B_2(w_0)=v_0\neq0. Then we can consider

\displaystyle\langle v_0\rvert B_1-B_2\lvert w_0\rangle=\langle v_0\vert v_0\rangle=\lVert v_0\rVert^2\neq0

And so we must have B_1=B_2.
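The argument above can be played out numerically. In this sketch (my own illustration, using arbitrary matrices) we take two distinct operators, pick a vector w_0 where their difference doesn't vanish, set v_0 to be that difference's value at w_0, and watch the two forms disagree exactly as the proof predicts.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two arbitrary distinct operators (hypothetical examples):
B1 = rng.standard_normal((3, 3))
B2 = rng.standard_normal((3, 3))
D = B1 - B2                         # the difference is not the zero transformation

# Pick a w0 with D(w0) nonzero, and set v0 = B1(w0) - B2(w0).
w0 = rng.standard_normal(3)
v0 = D @ w0
assert np.linalg.norm(v0) > 0

# <v0| B1 - B2 |w0> = <v0|v0> = ||v0||^2, which is nonzero,
# so the forms given by B1 and B2 disagree at the pair (v0, w0).
val = np.vdot(v0, D @ w0)
assert np.isclose(val, np.linalg.norm(v0) ** 2)
assert val > 0
```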

In particular, this shows that if we have a symmetric form, it’s described by a self-adjoint transformation S^*=S. Hermitian forms are also described by self-adjoint transformations H^*=H. And antisymmetric forms are described by “skew-adjoint” transformations A^*=-A.
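In matrix terms the adjoint is the transpose over the reals and the conjugate transpose over the complexes, so the three operator conditions become transpose conditions. A quick check, again with arbitrary example matrices of my own:

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((3, 3))
C = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

S = M + M.T          # symmetric form     -> self-adjoint: S^* = S
A = M - M.T          # antisymmetric form -> skew-adjoint: A^* = -A
H = C + C.conj().T   # Hermitian form     -> self-adjoint: H^* = H

# Real adjoint = transpose; complex adjoint = conjugate transpose.
assert np.allclose(S.T, S)
assert np.allclose(A.T, -A)
assert np.allclose(H.conj().T, H)
```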

So what’s the difference between a symmetric and a Hermitian form? It’s all in the fact that a symmetric form is based on a vector space with a symmetric inner product, while a Hermitian form is based on a complex vector space with a conjugate-symmetric inner product. The different properties of the two inner products account for the different ways that adjoint transformations behave.


July 10, 2009 - Posted by | Algebra, Linear Algebra

11 Comments

  1. How much of this can be generalized to vector spaces over a field other than the complex numbers? For instance, if conjugation is replaced by some field automorphism, does the spectral theorem still hold for Hermitian forms?

    Comment by Akhil Mathew | July 11, 2009 | Reply

  2. Well, I don’t know offhand. But it’s worth looking into, as I approach the spectral theorem.

    One thing I’ll say is that it seems you not only need an automorphism, but one that relates to some sensible norm in a similar way. I’m not really an expert on these things, but it might be that you need to consider modules over a C* algebra to get the appropriate generalization.

    Comment by John Armstrong | July 11, 2009 | Reply

  3. But does the spectral theorem for finite-dimensional vector spaces even need a norm? At least, doesn’t the version for symmetric bilinear forms work over arbitrary fields (say of characteristic \neq 2)?

    Comment by Akhil Mathew | July 11, 2009 | Reply

  4. Interesting question: the relevant statement is, I think, “Sylvester’s law of inertia”. The wikipedia page: http://en.wikipedia.org/wiki/Symmetric_bilinear_form#Signature_and_Sylvester.27s_law_of_inertia
    seems to suggest that this doesn’t hold over an arbitrary field (it says “Ordered Field”, but clearly it also works for the complex numbers).

    The obvious problem is that working in F_p, say, there is no notion of “positive” or “negative”. But it’s not hard to see that the number of zeroes down the diagonal must be basis independent: this is just the dimension of the “radical”.

    The beauty about what John is doing (over the reals, and especially the complexes) is that it all generalises beautifully to the infinite-dimensional setting: here you really do need a norm, as you need to work with complete spaces.

    Comment by Matt Daws | July 13, 2009 | Reply

  5. Yes, once I get around to more functional analysis, I’ll be able to import a lot of this stuff wholesale. And, again, I’ll admit to not being an expert on finite characteristic.

    Comment by John Armstrong | July 13, 2009 | Reply

  6. [...] must be symmetric (or conjugate-symmetric). This is satisfied by picking our transformation to be symmetric (or hermitian). But we also need our form to be “positive-definite”. That is, we [...]

    Pingback by Positive-Definite Transformations « The Unapologetic Mathematician | July 13, 2009 | Reply

  7. [...] In this way, unitary and orthogonal transformations are related in a way similar to that in which Hermitian and symmetric forms are [...]

    Pingback by Unitary Transformations « The Unapologetic Mathematician | July 28, 2009 | Reply

  8. [...] That is, the adjoint matrix is the conjugate transpose. This isn’t really anything new, since we essentially saw it when we considered Hermitian matrices. [...]

    Pingback by Unitary and Orthogonal Matrices « The Unapologetic Mathematician | July 29, 2009 | Reply

  9. [...] in our analogy — self-adjoint and unitary (or orthogonal), and even anti-self-adjoint (antisymmetric and “skew-Hermitian”) transformations satisfying — all satisfy one slightly [...]

    Pingback by Normal Transformations « The Unapologetic Mathematician | August 5, 2009 | Reply

  10. [...] I said above, this is a bilinear form. Further, Clairaut’s theorem tells us that it’s a symmetric form. Then the spectral theorem tells us that we can find an orthonormal basis with respect to [...]

    Pingback by Classifying Critical Points « The Unapologetic Mathematician | November 24, 2009 | Reply

  11. [...] already seen that the composition of a linear transformation and its adjoint is self-adjoint and positive-definite. In terms of complex matrices, this tells us that the product of a matrix and [...]

    Pingback by Hom Space Duals « The Unapologetic Mathematician | October 13, 2010 | Reply

