## Orthogonal Complements

An important fact about the category of vector spaces is that all exact sequences split. That is, if we have a short exact sequence

$$0 \to W \to V \to V/W \to 0$$

we can find a linear map from $V/W$ to $V$ which lets us view it as a subspace of $V$, and we can write $V \cong W \oplus V/W$. When we have an inner product around and $V$ is finite-dimensional, we can do this canonically.

What we’ll do is define the orthogonal complement of $W$ to be the vector space

$$W^\perp = \left\{ v \in V \mid \langle w, v \rangle = 0 \text{ for all } w \in W \right\}$$

That is, $W^\perp$ consists of all vectors in $V$ perpendicular to every vector in $W$.
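This definition can be made concrete numerically. The following is a minimal NumPy sketch (not from the original post, and the function name is my own): given vectors spanning $W$ as the rows of a matrix, the rows of $V^T$ from a full SVD beyond the rank give an orthonormal basis of the orthogonal complement.

```python
import numpy as np

def orthogonal_complement(W_basis: np.ndarray) -> np.ndarray:
    """Return an orthonormal basis (as rows) for the orthogonal
    complement, inside R^n, of the row space of W_basis."""
    # Full SVD: the rows of Vt form an orthonormal basis of R^n,
    # with the first rank(W) of them spanning the row space of W.
    _, s, Vt = np.linalg.svd(W_basis, full_matrices=True)
    rank = int(np.sum(s > 1e-10))
    return Vt[rank:]

# Example: W is the xy-plane in R^3; its complement is the z-axis.
W = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
C = orthogonal_complement(W)
# Every complement vector is perpendicular to every spanning vector of W.
assert np.allclose(W @ C.T, 0.0)
```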

First, we should check that this is indeed a subspace. If we have vectors $v_1, v_2 \in W^\perp$, scalars $c_1, c_2$, and a vector $w \in W$, then we can check

$$\langle w, c_1 v_1 + c_2 v_2 \rangle = c_1 \langle w, v_1 \rangle + c_2 \langle w, v_2 \rangle = 0$$

and thus the linear combination $c_1 v_1 + c_2 v_2$ is also in $W^\perp$.

Now to see that $V = W \oplus W^\perp$, take an orthonormal basis $\{e_1, \dots, e_m\}$ for $W$. Then we can expand it to an orthonormal basis $\{e_1, \dots, e_n\}$ of $V$. But now I say that $\{e_{m+1}, \dots, e_n\}$ is a basis for $W^\perp$. Clearly they’re linearly independent, so we just have to verify that their span is exactly $W^\perp$.
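The basis-extension step here is just Gram–Schmidt run against candidate vectors. As an illustration only (the function name and tolerance are my own, and I draw candidates from the standard basis), here is one way to complete an orthonormal set to a full orthonormal basis:

```python
import numpy as np

def extend_to_orthonormal_basis(E: np.ndarray) -> np.ndarray:
    """Given orthonormal rows E spanning a subspace W of R^n, return an
    n x n matrix of orthonormal rows whose first rows are those of E."""
    m, n = E.shape
    basis = [E[i] for i in range(m)]
    for k in range(n):
        v = np.eye(n)[k]                      # candidate: k-th standard vector
        for b in basis:                       # subtract projections onto the
            v = v - np.dot(b, v) * b          # vectors collected so far
        norm = np.linalg.norm(v)
        if norm > 1e-10:                      # nonzero residue: new direction
            basis.append(v / norm)
        if len(basis) == n:
            break
    return np.vstack(basis)

# Extend an orthonormal basis of a line in R^3 to a basis of R^3.
E = np.array([[0.6, 0.8, 0.0]])
B = extend_to_orthonormal_basis(E)
assert np.allclose(B @ B.T, np.eye(3))        # rows are orthonormal
```

The rows of `B` beyond the first $m$ are then an orthonormal basis for $W^\perp$, exactly as in the argument above.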

First, we can check that $e_i \in W^\perp$ for any $i$ between $m+1$ and $n$, and so their span is contained in $W^\perp$. Indeed, if $w = c^1 e_1 + \dots + c^m e_m$ is a vector in $W$, then we can calculate the inner product

$$\langle w, e_i \rangle = \sum_{j=1}^m \overline{c^j} \langle e_j, e_i \rangle = 0$$

since $j \leq m < i$ and the basis is orthonormal, so each $\langle e_j, e_i \rangle = 0$. Of course, we omit the conjugation when working over $\mathbb{R}$.

Now, let’s say we have a vector $v \in W^\perp$. We can write it in terms of the full basis as $v = c^1 e_1 + \dots + c^n e_n$. Then we can calculate its inner product with each of the basis vectors $e_j$ of $W$ as

$$\langle e_j, v \rangle = \sum_{i=1}^n c^i \langle e_j, e_i \rangle = c^j$$

Since this must be zero, we find that the coefficient $c^j$ must be zero for all $j$ between $1$ and $m$. That is, $v$ is contained within the span of $\{e_{m+1}, \dots, e_n\}$.

So between a basis for $W$ and a basis for $W^\perp$ we have a basis for $V$ with no overlap. We can therefore write any vector uniquely as the sum of one vector from $W$ and one from $W^\perp$, and so we have the direct sum decomposition $V = W \oplus W^\perp$, as desired.
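The unique decomposition can be computed directly: projecting onto an orthonormal basis of $W$ gives the $W$-component, and the remainder lands in $W^\perp$. A small NumPy sketch (my own illustration, not from the post):

```python
import numpy as np

def decompose(v: np.ndarray, E: np.ndarray):
    """Split v uniquely as w + w_perp, where the rows of E are an
    orthonormal basis of W. w is the orthogonal projection onto W."""
    w = E.T @ (E @ v)            # sum over i of <e_i, v> e_i
    return w, v - w

E = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])  # W = the xy-plane in R^3
v = np.array([3.0, 4.0, 5.0])
w, w_perp = decompose(v, E)
assert np.allclose(w, [3.0, 4.0, 0.0])
assert np.allclose(w_perp, [0.0, 0.0, 5.0])
assert abs(np.dot(w, w_perp)) < 1e-12   # the two pieces are orthogonal
```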

The fact that every exact sequence splits means that every module is projective. Isn’t this the same as saying the ring in question (here, a field) is semisimple?

Comment by Zygmund | May 5, 2009 |

That sounds right, but I’m not really digging into ring theory like that.

Comment by John Armstrong | May 5, 2009 |

Yeah, I was trying to remember something I read a while back in Cartan and Eilenberg. Anyway, the property is then fairly restrictive, since semisimple algebras are basically products of matrix algebras (over division rings, though).

Comment by Zygmund | May 5, 2009 |
