Tensor Products over Group Algebras
So far, we’ve been taking all our tensor products over the complex numbers, since everything in sight has been a vector space over that field. But remember that a representation of $G$ is a module over the group algebra $\mathbb{C}[G]$, and we can take tensor products over this algebra as well.
More specifically, if $A$ is a right $G$-module, and $B$ is a left $G$-module, then we have a plain vector space $A\otimes_G B$. We build it just like the regular tensor product $A\otimes B$, but we add new relations of the form

$$(ag)\otimes b=a\otimes(gb)$$
That is, in the tensor product over $G$, we can pull actions by $G$ from one side to the other.
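To see this in action, here is a minimal computational sketch (Python with NumPy; the function and the encoding of modules as lists of matrices are my own illustration, not anything standard). It builds $A\otimes_G B$ as the quotient of $A\otimes B$ by the span of all the vectors $(ag)\otimes b-a\otimes(gb)$:

```python
import numpy as np

def tensor_dim_over_G(A_mats, B_mats):
    # A_mats[k] and B_mats[k] are the matrices of the same group element
    # g_k, with the lists running over all of G.  B is a left module via
    # g.b = B_mats[k] @ b; A is a right module via a.g = A_mats[k].T @ a
    # (transposing a left action gives a right action).
    m, n = A_mats[0].shape[0], B_mats[0].shape[0]
    rels = []
    for Ag, Bg in zip(A_mats, B_mats):
        # on A (x) B = C^(m*n), the relations (ag)(x)b - a(x)(gb) are
        # the columns of the operator (Ag^T (x) I) - (I (x) Bg)
        rels.append(np.kron(Ag.T, np.eye(n)) - np.kron(np.eye(m), Bg))
    # dimension of the quotient: m*n minus the span of all the relations
    return m * n - np.linalg.matrix_rank(np.hstack(rels))

# example: G = Z/2 acting through its trivial and sign representations
triv = [np.array([[1.0]]), np.array([[1.0]])]
sign = [np.array([[1.0]]), np.array([[-1.0]])]
print(tensor_dim_over_G(triv, triv))  # 1 -- the relations all vanish
print(tensor_dim_over_G(triv, sign))  # 0 -- a(x)b = a(x)(-b) kills everything
```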
If $A$ or $B$ have extra group actions, they pass to the tensor product. For instance, if $A$ is a left $H$-module as well as a right $G$-module, then we can define

$$h(a\otimes b)=(ha)\otimes b$$
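For this to be well-defined we should check that it respects the new relations; assuming the two actions on $A$ commute, so that $h(ag)=(ha)g$, it does:

$$h\bigl((ag)\otimes b\bigr)=\bigl(h(ag)\bigr)\otimes b=\bigl((ha)g\bigr)\otimes b=(ha)\otimes(gb)=h\bigl(a\otimes(gb)\bigr)$$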
Similarly, if $B$ has an additional right action by some group $K$, then so does $A\otimes_G B$, and the same goes for any other extra actions on either factor. Similar to the way that hom spaces over $G$ “eat up” an action of $G$ on each argument, the tensor product $\otimes_G$ “eats up” a right action by $G$ on its left argument and a left action by $G$ on its right argument.
We can try to work out the dimension of this space. Let’s say that we have decompositions

$$A\cong\bigoplus\limits_i A_i\qquad B\cong\bigoplus\limits_j B_j$$

into irreducible representations (possibly with repetitions). As usual for tensor products, the operation is additive, just like we saw for $\hom$ spaces. That is

$$\left(\bigoplus\limits_i A_i\right)\otimes_G\left(\bigoplus\limits_j B_j\right)\cong\bigoplus\limits_{i,j}A_i\otimes_G B_j$$
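Continuing the numerical sketch from above (again just an illustration; dsum is my own little helper for the block-diagonal direct sum of two representations), we can watch this additivity happen:

```python
def dsum(X, Y):
    # block-diagonal direct sum of two representation matrices
    Z = np.zeros((X.shape[0] + Y.shape[0], X.shape[1] + Y.shape[1]))
    Z[:X.shape[0], :X.shape[1]] = X
    Z[X.shape[0]:, X.shape[1]:] = Y
    return Z

# (triv (+) sign) (x)_G sign should have dimension 0 + 1 = 1
direct_sum = [dsum(t, s) for t, s in zip(triv, sign)]
print(tensor_dim_over_G(direct_sum, sign))  # 1
```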
So we really just need to understand the dimension of one of these summands. Let’s say $A$ is irreducible with dimension $m$ and $B$ is irreducible with dimension $n$.
Now, we can pick any nonzero vector $a\in A$ and hit it with every group element $g\in G$. These vectors must span $A$: they span some subspace, which is invariant under the right action of $G$, and so (since $A$ is irreducible) is either trivial or all of $A$. But it can’t be trivial, since it contains $a$ itself, and so it must be all of $A$. That is, given any vector $a'\in A$ we can find some element $x$ of the group algebra so that $ax=a'$. But then for any $b\in B$ we have

$$a'\otimes b=(ax)\otimes b=a\otimes(xb)$$
That is, every simple tensor can be written with $a$ as the first tensorand, and sums of such tensors collapse as well, since $a\otimes b_1+a\otimes b_2=a\otimes(b_1+b_2)$. Does this mean that $A\otimes_G B\cong B$? Not quite, since this expression might not be unique. For every element of the group algebra that sends $a$ back to itself, we have a different expression.
So how many of these are there? Well, we have a linear map from $\mathbb{C}[G]$ to $A$ that sends $x$ to $ax$. We know that this is onto, so the dimension of the image is $m$. The dimension of the source is $|G|$, and so the rank-nullity theorem tells us that the dimension of the kernel is $|G|-m$. Strictly speaking, the kernel consists of the elements sending $a$ to zero; the elements sending $a$ back to itself form a coset of it (if $ax=a$ then $a(x-e)=0$), so they make up a family of the same dimension, $|G|-m$.
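Here is that count run for a concrete case (all illustration: the matrices below are one standard choice of the two-dimensional irreducible representation of $S_3$, so $m=2$ and $|G|=6$):

```python
import numpy as np

# the 2-dimensional irreducible representation of S_3: a rotation by
# 120 degrees and a reflection generate it; we list all six elements
c, s = np.cos(2 * np.pi / 3), np.sin(2 * np.pi / 3)
r = np.array([[c, -s], [s, c]])
f = np.array([[1.0, 0.0], [0.0, -1.0]])
G = [np.eye(2), r, r @ r, f, r @ f, r @ r @ f]

a = np.array([1.0, 0.0])  # any nonzero vector in A
# the map C[G] -> A sending x to ax, with a.g = g^T a as before:
# its matrix has one column per group element
M = np.column_stack([g.T @ a for g in G])
rank = np.linalg.matrix_rank(M)
print(rank)           # 2 = m: the map is onto, as the argument claimed
print(len(G) - rank)  # 4 = |G| - m: the dimension of the kernel
```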
So we should be able to subtract this off from the $n$ dimensions we counted for the tensor product, due to redundancies. Assuming that this works as expected, we get

$$\dim(A\otimes_G B)=n-(|G|-m)=m+n-|G|$$

which at least is symmetric between $A$ and $B$ as expected. But it still feels sort of like we’re getting away with something here. We’ll come back to find a more satisfying proof soon.