The Unapologetic Mathematician

Mathematics for the interested outsider

(Real) Frobenius Reciprocity

Now we come to the real version of Frobenius reciprocity. It takes the form of an adjunction between the functors of induction and restriction:

\displaystyle\hom_G(V\!\!\uparrow_H^G,W)\cong\hom_H(V,W\!\!\downarrow^G_H)
where V is an H-module and W is a G-module.

This is one of those items that everybody (for suitable values of “everybody”) knows to be true, but that nobody seems to have written down. I’ve been beating my head against it for days and finally figured out a way to make it work. Looking back, I’m not entirely certain I’ve ever actually proven it before.

So let’s start on the left with a linear map f:V\to W that intertwines the action of each subgroup element h\in H\subseteq G. We want to extend this to a linear map from V\!\!\uparrow_H^G to W that intertwines the actions of all the elements of G.

Okay, so we’ve defined V\!\!\uparrow_H^G=\mathbb{C}[G]\otimes_HV. But if we choose a transversal \{t_i\} for H — like we did when we set up the induced matrices — then we can break down \mathbb{C}[G] as the direct sum of a bunch of copies of \mathbb{C}[H]:

\displaystyle\mathbb{C}[G]=\bigoplus\limits_{i=1}^nt_i\mathbb{C}[H]
So then when we take the tensor product we find

\displaystyle V\!\!\uparrow_H^G=\mathbb{C}[G]\otimes_HV=\left(\bigoplus\limits_{i=1}^nt_i\mathbb{C}[H]\right)\otimes_HV=\bigoplus\limits_{i=1}^nt_iV
So we need to define a map from each of these summands t_iV to W. But a vector in t_iV looks like t_iv for some v\in V. And thus a G-intertwinor \hat{f} extending f must be defined by \hat{f}(t_iv)=t_i\hat{f}(v)=t_if(v).

So, is this really a G-intertwinor? After all, we’ve only used the fact that it commutes with the actions of the transversal elements t_i. Any element of the induced representation can be written uniquely as

\displaystyle v=\sum\limits_{i=1}^nt_iv_i

for some collection of v_i\in V. We need to check that \hat{f}(gv)=g\hat{f}(v).

Now, we know that left-multiplication by g permutes the cosets of H. That is, gt_i=t_{\sigma(i)}h_i for some h_i\in H. Thus we calculate

\displaystyle gv=\sum\limits_{i=1}^ngt_iv_i=\sum\limits_{i=1}^nt_{\sigma(i)}h_iv_i

and so, since f intertwines the action of each h_i\in H and \hat{f} commutes with each transversal element

\displaystyle\hat{f}(gv)=\sum\limits_{i=1}^n\hat{f}(t_{\sigma(i)}h_iv_i)=\sum\limits_{i=1}^nt_{\sigma(i)}h_if(v_i)=\sum\limits_{i=1}^ngt_if(v_i)=g\hat{f}(v)
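The coset bookkeeping above can be checked directly on a small example. Here is a sketch of my own (the helper names are hypothetical, not from the post), taking G=S_3 and H=A_3 with transversal \{e,(0\,1)\}; for each g and each t_i it finds the index \sigma(i) and the element h_i\in H with gt_i=t_{\sigma(i)}h_i, and confirms that \sigma really is a permutation of the transversal indices.

```python
# Sketch: factor g * t_i = t_{sigma(i)} * h_i with h_i in H,
# for G = S_3 and H = A_3 (an example of my own, not from the post).
from itertools import permutations

def compose(p, q):
    """(p∘q)(i) = p(q(i)); permutations stored as tuples of images."""
    return tuple(p[q[i]] for i in range(len(q)))

def inverse(p):
    inv = [0] * len(p)
    for i, j in enumerate(p):
        inv[j] = i
    return tuple(inv)

G = list(permutations(range(3)))
H = {(0, 1, 2), (1, 2, 0), (2, 0, 1)}      # the even permutations
transversal = [(0, 1, 2), (1, 0, 2)]       # e and the transposition (0 1)

def factor(g, i):
    """Return (sigma_i, h_i) with g * t_i = t_{sigma_i} * h_i, h_i in H."""
    gt = compose(g, transversal[i])
    for s, t in enumerate(transversal):
        h = compose(inverse(t), gt)
        if h in H:
            return s, h
    raise ValueError("transversal does not cover G")

# Left multiplication by g permutes the cosets, so sigma is a permutation:
for g in G:
    sigma = [factor(g, i)[0] for i in range(len(transversal))]
    assert sorted(sigma) == list(range(len(transversal)))
    for i in range(len(transversal)):
        s, h = factor(g, i)
        assert compose(transversal[s], h) == compose(g, transversal[i])
```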
Okay, so we’ve got a map f\mapsto\hat{f} that takes H-module morphisms in \hom_H(V,W\!\!\downarrow^G_H) to G-module morphisms in \hom_G(V\!\!\uparrow_H^G,W). But is it an isomorphism? Well, we can go from \hat{f} back to f by just looking at what \hat{f} does on the component

\displaystyle V=1V\subseteq\bigoplus\limits_{i=1}^nt_iV

If we only consider the actions of elements h\in H, they send this component back into itself, and by definition they commute with \hat{f}. That is, the restriction of \hat{f} to this component is an H-intertwinor, and in fact it’s the same as the f we started with.
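As a numerical sanity check (an example of my own, not from the post): taking characters, the two hom-spaces have dimensions \langle\chi_{V\uparrow},\chi_W\rangle_G and \langle\chi_V,\chi_{W\downarrow}\rangle_H, so the adjunction forces these inner products to agree. The sketch below verifies this for G=S_3, H=A_3, V the one-dimensional \omega-representation of H, and W the standard two-dimensional representation of S_3; all helper names are my own.

```python
# Check dim hom_G(V↑, W) = dim hom_H(V, W↓) via character inner products,
# for G = S_3, H = A_3 (an example of my own, not from the post).
import cmath
from itertools import permutations

def compose(p, q):
    """(p∘q)(i) = p(q(i)); permutations stored as tuples of images."""
    return tuple(p[q[i]] for i in range(len(q)))

def inverse(p):
    inv = [0] * len(p)
    for i, j in enumerate(p):
        inv[j] = i
    return tuple(inv)

G = list(permutations(range(3)))
H = [(0, 1, 2), (1, 2, 0), (2, 0, 1)]
transversal = [(0, 1, 2), (1, 0, 2)]

# V: one-dimensional representation of A_3 by cube roots of unity.
omega = cmath.exp(2j * cmath.pi / 3)
X = {H[0]: 1, H[1]: omega, H[2]: omega ** 2}

def induced_char(g):
    """Trace of the induced matrix: sum of X(t^{-1} g t) over diagonal blocks."""
    total = 0
    for t in transversal:
        h = compose(inverse(t), compose(g, t))
        if h in X:
            total += X[h]
    return total

def std_char(g):
    """Character of the standard rep of S_3: fixed points minus one (real)."""
    return sum(1 for i in range(3) if g[i] == i) - 1

# std_char is real, so no conjugation is needed in the inner products.
lhs = sum(induced_char(g) * std_char(g) for g in G) / len(G)
rhs = sum(X[h] * std_char(h) for h in H) / len(H)
assert abs(lhs - rhs) < 1e-9   # both inner products equal 1 here
```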


December 3, 2010 Posted by | Algebra, Group theory, Representation Theory | 2 Comments

Induction and Restriction are Additive Functors

Before we can prove the full version of Frobenius reciprocity, we need to see that induction and restriction are actually additive functors.

First of all, functoriality of restriction is easy. Any intertwinor f:V\to W between G-modules is immediately an intertwinor between the restrictions V\!\!\downarrow^G_H and W\!\!\downarrow^G_H. Indeed, all it has to do is commute with the action of each h\in H\subseteq G on the exact same spaces.

Functoriality of induction is similarly easy. If we have an intertwinor f:V\to W between H-modules, we need to come up with one between \mathbb{C}[G]\otimes_HV and \mathbb{C}[G]\otimes_HW. But the tensor product is functorial in each variable, so it’s straightforward to come up with 1_{\mathbb{C}[G]}\otimes f. The catch is that since we’re taking the tensor product over H in the middle, we have to worry about this map being well-defined. The tensor s\otimes v\in\mathbb{C}[G]\otimes V is equivalent to sh^{-1}\otimes hv. The first gets sent to s\otimes f(v), while the second gets sent to sh^{-1}\otimes f(hv)=sh^{-1}\otimes hf(v). But these are equivalent in \mathbb{C}[G]\otimes_HW, so the map is well-defined.

Next: additivity of restriction. If V and W are G-modules, then so is V\oplus W. The restriction (V\oplus W)\!\!\downarrow^G_H is just the restriction of this direct sum to H, which is clearly the direct sum of the restrictions V\!\!\downarrow^G_H\oplus W\!\!\downarrow^G_H.

Finally we must check that induction is additive. Here, the induced matrices will come in handy. If X and Y are matrix representations of H, then the direct sum is the matrix representation

\displaystyle\left[X\oplus Y\right](h)=\left(\begin{array}{cc}X(h)&0\\{0}&Y(h)\end{array}\right)

And then the induced matrix looks like:

\displaystyle \left[X\oplus Y\right]\!\!\uparrow_H^G(g)=\left(\begin{array}{ccccccc}X(t_1^{-1}gt_1)&0&X(t_1^{-1}gt_2)&0&\cdots&X(t_1^{-1}gt_n)&0\\{0}&Y(t_1^{-1}gt_1)&0&Y(t_1^{-1}gt_2)&\cdots&0&Y(t_1^{-1}gt_n)\\X(t_2^{-1}gt_1)&0&X(t_2^{-1}gt_2)&0&\cdots&X(t_2^{-1}gt_n)&0\\{0}&Y(t_2^{-1}gt_1)&0&Y(t_2^{-1}gt_2)&\cdots&0&Y(t_2^{-1}gt_n)\\\vdots&\vdots&\vdots&\vdots&\ddots&\vdots&\vdots\\X(t_n^{-1}gt_1)&0&X(t_n^{-1}gt_2)&0&\cdots&X(t_n^{-1}gt_n)&0\\{0}&Y(t_n^{-1}gt_1)&0&Y(t_n^{-1}gt_2)&\cdots&0&Y(t_n^{-1}gt_n)\end{array}\right)

Now, it’s not hard to see that we can rearrange the basis to make the matrix look like this:

\displaystyle\left(\begin{array}{cc}X\!\!\uparrow_H^G(g)&0\\{0}&Y\!\!\uparrow_H^G(g)\end{array}\right)
There’s no complicated mixing up of basis elements amongst each other; just rearranging their order is enough. And this is just the direct sum X\!\!\uparrow_H^G\oplus Y\!\!\uparrow_H^G.
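This basis rearrangement can be exhibited concretely. The sketch below (my own example, not from the post; all names hypothetical) takes X and Y to be the two nontrivial one-dimensional representations of A_3 inside S_3, builds the induced matrix of X\oplus Y with the interleaved basis order (t_1X,t_1Y,t_2X,t_2Y), and conjugates by the permutation matrix that sorts the basis to (t_1X,t_2X,t_1Y,t_2Y), landing exactly on X\!\!\uparrow_H^G\oplus Y\!\!\uparrow_H^G.

```python
# Sketch: induced(X⊕Y) is carried to induced(X) ⊕ induced(Y) by a
# basis permutation.  G = S_3, H = A_3 (an example of my own).
import cmath
from itertools import permutations

import numpy as np

def compose(p, q):
    """(p∘q)(i) = p(q(i)); permutations stored as tuples of images."""
    return tuple(p[q[i]] for i in range(len(q)))

def inverse(p):
    inv = [0] * len(p)
    for i, j in enumerate(p):
        inv[j] = i
    return tuple(inv)

G = list(permutations(range(3)))
H = [(0, 1, 2), (1, 2, 0), (2, 0, 1)]
transversal = [(0, 1, 2), (1, 0, 2)]

omega = cmath.exp(2j * cmath.pi / 3)
X = {H[0]: 1, H[1]: omega, H[2]: omega ** 2}
Y = {H[0]: 1, H[1]: omega ** 2, H[2]: omega}
XY = {h: np.diag([X[h], Y[h]]) for h in H}   # the direct sum X⊕Y

def induced(rep, dim, g):
    """Induced matrix: block (i, j) is rep(t_i^{-1} g t_j), or a zero block."""
    n = len(transversal)
    M = np.zeros((n * dim, n * dim), dtype=complex)
    for i, ti in enumerate(transversal):
        for j, tj in enumerate(transversal):
            h = compose(inverse(ti), compose(g, tj))
            if h in rep:
                M[i*dim:(i+1)*dim, j*dim:(j+1)*dim] = rep[h]
    return M

# Reorder the basis (t1X, t1Y, t2X, t2Y) -> (t1X, t2X, t1Y, t2Y):
P = np.eye(4)[[0, 2, 1, 3]]
for g in G:
    direct_sum = np.zeros((4, 4), dtype=complex)
    direct_sum[:2, :2] = induced(X, 1, g)
    direct_sum[2:, 2:] = induced(Y, 1, g)
    assert np.allclose(P @ induced(XY, 2, g) @ P.T, direct_sum)
```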

December 1, 2010 Posted by | Algebra, Group theory, Representation Theory | 1 Comment