Let’s focus back in on a real, finite-dimensional vector space $V$ and give it an inner product. As a symmetric bilinear form, the inner product provides us with an isomorphism $V\cong V^*$. Now we can use functoriality to see what this does for our tensor algebras. Again, I’ll be mostly interested in the exterior algebra $\Lambda(V)$, so I’ll stick to talking about that one.
The isomorphism sends a vector $v$ to the linear functional $\langle v,\cdot\rangle$. Functoriality then defines an isomorphism $\Lambda(V)\to\Lambda(V^*)$ that sends the wedge $v_1\wedge\dots\wedge v_k$ of degree $k$ to the wedge $\langle v_1,\cdot\rangle\wedge\dots\wedge\langle v_k,\cdot\rangle$, also of degree $k$. This is the antisymmetrization of the tensor product of all these linear functionals. We’ve seen that we can consider this as a linear functional on the space of degree-$k$ tensors by applying the functionals to the tensorands in order and then multiplying together all the results. This defines an isomorphism $\Lambda^k(V^*)\cong\Lambda^k(V)^*$, and extending by linearity we find an isomorphism $\Lambda(V^*)\cong\Lambda(V)^*$.
Let’s get a little more explicit about how this works by picking an orthonormal basis $\{e_i\}$ for $V$, and the corresponding dual basis $\{\epsilon^i\}$ for $V^*$. That is, we have $\epsilon^i=\langle e_i,\cdot\rangle$ — which defines the isomorphism from $V$ to $V^*$ explicitly in terms of bases — and $\epsilon^i(e_j)=\delta^i_j$.
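To make this concrete, here’s a minimal sketch in Python of the isomorphism just described, modeling $V$ as $\mathbb{R}^3$ with the dot product (the function names `inner` and `epsilon` are mine, chosen for illustration):

```python
# Model V = R^3 with the standard inner product; the rows of e are the
# orthonormal basis vectors e_1, e_2, e_3.
e = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]

def inner(v, w):
    """The standard inner product on R^3."""
    return sum(a * b for a, b in zip(v, w))

def epsilon(i):
    """The basis functional epsilon^i = <e_i, ->."""
    return lambda v: inner(e[i], v)

# Check the defining property epsilon^i(e_j) = delta^i_j.
pairings = [[epsilon(i)(e[j]) for j in range(3)] for i in range(3)]
print(pairings)  # the identity matrix
```

The point is just that sending $e_i\mapsto\epsilon^i$ is exactly the map $v\mapsto\langle v,\cdot\rangle$ when the basis is orthonormal.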
Now we can use the $e_i$ to write down an explicit basis of $\Lambda(V)$. An element of degree $k$ is a sum of wedges of $k$ vectors $v_1\wedge\dots\wedge v_k$. We can write each of these vectors out in terms of components $v_j=v_j^ie_i$, getting the wedge (really a sum of wedges)

$\displaystyle\left(v_1^{i_1}e_{i_1}\right)\wedge\dots\wedge\left(v_k^{i_k}e_{i_k}\right)$

We factor out all the scalar components to get

$\displaystyle v_1^{i_1}\dots v_k^{i_k}\left(e_{i_1}\wedge\dots\wedge e_{i_k}\right)$
If in a given term we ever have two of the indices equal to each other, then the whole wedge will be zero by antisymmetry. On the other hand, if none of them are equal we can sort them into increasing order (at the possible cost of multiplying by the sign of the needed permutation). In the end, we can write down any wedge of degree $k$ uniquely as a sum of constants times the basic wedges $e_{i_1}\wedge\dots\wedge e_{i_k}$, where $i_1<\dots<i_k$. For example, if $V$ has basis $\{e_1,e_2,e_3\}$, then $\Lambda(V)$ will have basis

$\displaystyle\begin{aligned}&1\\&e_1,\quad e_2,\quad e_3\\&e_1\wedge e_2,\quad e_1\wedge e_3,\quad e_2\wedge e_3\\&e_1\wedge e_2\wedge e_3\end{aligned}$
where the lines correspond to the different degrees.
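Since each basic wedge corresponds to a strictly increasing tuple of indices, i.e. a subset of $\{1,\dots,n\}$, the whole basis is easy to enumerate mechanically. A small sketch (names are mine):

```python
from itertools import combinations

# Each basic wedge e_{i_1} ^ ... ^ e_{i_k} with i_1 < ... < i_k is a
# k-element subset of {1, ..., n}; the empty tuple stands for the scalar 1.
n = 3  # dim V, as in the example above
basis = [c for k in range(n + 1) for c in combinations(range(1, n + 1), k)]
by_degree = {k: [c for c in basis if len(c) == k] for k in range(n + 1)}

print(by_degree[2])  # [(1, 2), (1, 3), (2, 3)], i.e. e1^e2, e1^e3, e2^e3
print(len(basis))    # 8 = 2**3, the dimension of Lambda(V)
```

The count $2^n$ drops out for free: one basis wedge per subset of the index set.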
Now it’s obvious how the isomorphism $\Lambda(V)\cong\Lambda(V^*)$ acts on this basis. It just turns a wedge of basis vectors like $e_{i_1}\wedge\dots\wedge e_{i_k}$ into a wedge of basis linear functionals like $\epsilon^{i_1}\wedge\dots\wedge\epsilon^{i_k}$. The action on the rest of $\Lambda(V)$ just extends by linearity. When we compose this with the isomorphism between $\Lambda(V^*)$ and $\Lambda(V)^*$, we get an isomorphism $\Lambda(V)\cong\Lambda(V)^*$. That is, we have an inner product on the algebra $\Lambda(V)$!
Let’s consider how this inner product behaves on our basis. Clearly to line these up we need the degrees to be equal. We also find that we get zero unless the collections of indices are the same. For example, if we try to pair $e_1\wedge e_2$ with $e_1\wedge e_3$, we find

$\displaystyle\langle e_1\wedge e_2,e_1\wedge e_3\rangle=\frac{1}{4}\left(\langle e_1,e_1\rangle\langle e_2,e_3\rangle-\langle e_1,e_3\rangle\langle e_2,e_1\rangle-\langle e_2,e_1\rangle\langle e_1,e_3\rangle+\langle e_2,e_3\rangle\langle e_1,e_1\rangle\right)
In each arrangement, we’ll find two indices that don’t line up, and thus each term will be zero. On the other hand, if the collections of indices are the same, we find (for example)

$\displaystyle\langle e_1\wedge e_2,e_1\wedge e_2\rangle=\frac{1}{4}\left(\langle e_1,e_1\rangle\langle e_2,e_2\rangle-\langle e_1,e_2\rangle\langle e_2,e_1\rangle-\langle e_2,e_1\rangle\langle e_1,e_2\rangle+\langle e_2,e_2\rangle\langle e_1,e_1\rangle\right)=\frac{1}{2}
When we consider a basic wedge of degree $k$ (here, $k=2$) and pair it with itself, we’ll have a sum of $\left(k!\right)^2$ terms corresponding to summing over permutations of both tensors. Of these, terms that pick different permutations will have at least one pair of basis vectors that don’t line up, and make the whole term zero. The remaining $k!$ terms that pick the same permutation twice will give the product of $k$ copies of $\langle e_i,e_i\rangle=1$, and this will always occur with a positive sign. This will exactly cancel one of the two normalizing factors of $\frac{1}{k!}$ from the antisymmetrizers, and thus the inner product of a basic wedge of degree $k$ with itself will always be $\frac{1}{k!}$. It’s not an orthonormal basis, but it’s close.
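The double sum over permutations described above can be checked mechanically. Here is a sketch in Python (the helper names are mine), pairing wedges as antisymmetrized tensors with one $\frac{1}{k!}$ per antisymmetrizer:

```python
from itertools import permutations
from math import factorial

def inner(v, w):
    """The standard inner product on R^n."""
    return sum(a * b for a, b in zip(v, w))

def sign(p):
    """Sign of a permutation given as a tuple of 0-based indices."""
    s = 1
    for i in range(len(p)):
        for j in range(i + 1, len(p)):
            if p[i] > p[j]:
                s = -s
    return s

def wedge_inner(vs, ws):
    """Pair v_1^...^v_k with w_1^...^w_k, treating each wedge as an
    antisymmetrized tensor: sum over permutations of both tensors,
    with a normalizing 1/k! for each of the two antisymmetrizers."""
    k = len(vs)
    total = 0.0
    for s in permutations(range(k)):
        for t in permutations(range(k)):
            term = sign(s) * sign(t)
            for i in range(k):
                term *= inner(vs[s[i]], ws[t[i]])
            total += term
    return total / factorial(k) ** 2

e1, e2, e3 = [1, 0, 0], [0, 1, 0], [0, 0, 1]
print(wedge_inner([e1, e2], [e1, e3]))          # 0.0: the index sets differ
print(wedge_inner([e1, e2], [e1, e2]))          # 0.5 = 1/2!
print(wedge_inner([e1, e2, e3], [e1, e2, e3]))  # 1/3!
```

Only the $k!$ matching permutation pairs survive, giving $\frac{k!}{(k!)^2}=\frac{1}{k!}$, exactly as computed above.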
Notice, in particular, how in the second example we’ve avoided explicit use of the dual basis and just defined the inner product on tensors of rank $k$ in terms of $k$-fold products of inner products of vectors. We’ll stick to this notation in the future for tensors.
The factor of $\frac{1}{k!}$ isn’t really terrible, but it can get annoying. Often the inner product on $\Lambda(V)$ is modified to compensate for it. We consider the different degrees to be orthogonal, as before, and we define the inner product in degree $k$ to include an extra factor of $k!$. This has the effect of making the collection of wedges of basis vectors into an orthonormal basis for $\Lambda(V)$, but it means that the inner product on wedges cannot be calculated simply by considering them as antisymmetric tensors.
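One way to see what the extra factor of $k!$ buys: it collapses the double sum over permutations into a single one, which is just the determinant of the Gram matrix $\left(\langle v_i,w_j\rangle\right)$. That identification is my observation here, not a quote from the post; a sketch:

```python
from itertools import permutations

def inner(v, w):
    """The standard inner product on R^n."""
    return sum(a * b for a, b in zip(v, w))

def sign(p):
    """Sign of a permutation given as a tuple of 0-based indices."""
    s = 1
    for i in range(len(p)):
        for j in range(i + 1, len(p)):
            if p[i] > p[j]:
                s = -s
    return s

def renormalized_inner(vs, ws):
    """The degree-k inner product with the extra factor of k!:
    a single signed sum over permutations, i.e. the determinant
    of the Gram matrix (<v_i, w_j>)."""
    k = len(vs)
    total = 0.0
    for p in permutations(range(k)):
        term = sign(p)
        for i in range(k):
            term *= inner(vs[i], ws[p[i]])
        total += term
    return total

e1, e2, e3 = [1, 0, 0], [0, 1, 0], [0, 0, 1]
print(renormalized_inner([e1, e2], [e1, e2]))          # 1.0: now orthonormal
print(renormalized_inner([e1, e2, e3], [e1, e2, e3]))  # 1.0
print(renormalized_inner([e1, e2], [e1, e3]))          # 0.0
```

With this convention each basic wedge pairs with itself to give $1$ rather than $\frac{1}{k!}$, at the cost noted above: this is no longer the pairing of the underlying antisymmetric tensors.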
Now, I’ve never really looked closely at exactly what happens, so as an experiment I’m going to try to not use this extra factor of $k!$ and see what happens. I’ll refer, as I do, to the “renormalized” inner product on $\Lambda(V)$, where appropriate. And if the work starts becoming too complicated without this factor, I’ll give in and use it, explicitly saying when I’ve given up.