The Unapologetic Mathematician

Mathematics for the interested outsider

General Linear Groups — Generally

Monday, we saw that the general linear groups \mathrm{GL}_n(\mathbb{F}) are matrix groups, consisting specifically of those matrices whose columns are linearly independent. But what about more general vector spaces?

Well, we know that every finite-dimensional vector space has a basis, and is thus isomorphic to \mathbb{F}^n, where n is the cardinality of the basis. So given a vector space V with a basis \{f_i\} of cardinality n, we have the isomorphism S:\mathbb{F}^n\rightarrow V defined by S(e_i)=f_i and S^{-1}(f_i)=e_i.

This isomorphism of vector spaces then induces an isomorphism of their automorphism groups. That is, \mathrm{GL}(V)\cong\mathrm{GL}_n(\mathbb{F}). Given an invertible linear transformation T:V\rightarrow V, we can conjugate it by S to get S^{-1}TS:\mathbb{F}^n\rightarrow\mathbb{F}^n. This has inverse S^{-1}T^{-1}S, and so is an element of \mathrm{GL}_n(\mathbb{F}). Thus (not unexpectedly) every invertible linear transformation from a vector space V to itself corresponds to an invertible matrix.

But this assignment depends essentially on the arbitrary choice of the basis \{f_i\} for V. What if we choose a different basis \{\tilde{f}_i\}? Then we get a new transformation \tilde{S} and a new isomorphism of groups T\mapsto\tilde{S}^{-1}T\tilde{S}. But this gives us an inner automorphism of \mathrm{GL}_n(\mathbb{F}). Given a transformation M:\mathbb{F}^n\rightarrow\mathbb{F}^n, we get the transformation
\tilde{S}^{-1}SMS^{-1}\tilde{S}=\left(\tilde{S}^{-1}S\right)M\left(\tilde{S}^{-1}S\right)^{-1}:\mathbb{F}^n\rightarrow\mathbb{F}^n
This composite \tilde{S}^{-1}S sends \mathbb{F}^n to itself, and it has an inverse. Thus changing the basis on V induces an inner automorphism of the matrix group \mathrm{GL}_n(\mathbb{F}).

Now let’s consider a linear transformation T:V\rightarrow V. We have two bases for V, and thus two different matrices — two different elements of \mathrm{GL}_n(\mathbb{F}) — corresponding to T: S^{-1}TS and \tilde{S}^{-1}T\tilde{S}. We get from one to the other by conjugation with \tilde{S}^{-1}S:

\left(\tilde{S}^{-1}S\right)S^{-1}TS\left(\tilde{S}^{-1}S\right)^{-1}=\tilde{S}^{-1}SS^{-1}TSS^{-1}\tilde{S}=\tilde{S}^{-1}T\tilde{S}

And what is this transformation \tilde{S}^{-1}S? How does it act on a basis vector in \mathbb{F}^n? We calculate:
\tilde{S}^{-1}(S(e_j))=\tilde{S}^{-1}(f_j)=\tilde{S}^{-1}(x_j^i\tilde{f}_i)=x_j^i\tilde{S}^{-1}(\tilde{f}_i)=x_j^ie_i
where f_j=x_j^i\tilde{f}_i expresses the vectors of one basis for V in terms of those of the other. That is, the jth column of the matrix X consists of the components of f_j written in terms of the \tilde{f}_i. Similarly, the inverse matrix X^{-1}, with entries \tilde{x}_i^j, writes the \tilde{f}_i in terms of the f_j: \tilde{f}_i=\tilde{x}_i^jf_j.
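To make this concrete, here is a small numerical sketch in Python with NumPy (the example bases and all the names are my own illustration, not part of the argument). If we record each basis as the columns of a matrix in some ambient coordinates, the equation f_j=x_j^i\tilde{f}_i becomes the matrix equation F=\tilde{F}X:

import numpy as np

# Two bases for a 3-dimensional real vector space, recorded as
# the columns of F (the basis {f_i}) and Ft (the basis {f~_i}).
# The particular vectors are arbitrary choices for illustration.
F = np.array([[1., 0., 1.],
              [1., 1., 0.],
              [0., 1., 1.]])
Ft = np.array([[2., 0., 0.],
               [0., 1., 1.],
               [0., 0., 1.]])

# f_j = x_j^i f~_i says F = Ft X, so X = Ft^{-1} F.  Column j of
# X holds the components of f_j in the basis {f~_i}.
X = np.linalg.solve(Ft, F)

# Sanity check: rebuilding the f_j from the f~_i recovers F.
assert np.allclose(Ft @ X, F)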

It is these “change-of-basis” matrices that effect all of our, well, changes of basis. For example, say we have a vector v\in V with components v=v^jf_j. Then we can expand this:

v=v^jf_j=v^k\delta_k^jf_j=v^kx_k^i\tilde{x}_i^jf_j=\left(x_k^iv^k\right)\tilde{f}_i

So our components in the new basis are \tilde{v}^i=x_k^iv^k.

As another example, say that we have a linear transformation T:V\rightarrow V with matrix components t_i^j with respect to the basis \{f_i\}. That is, T(f_i)=t_i^jf_j. Then we can calculate:

T(\tilde{f}_i)=T(\tilde{x}_i^kf_k)=\tilde{x}_i^kT(f_k)=\tilde{x}_i^kt_k^lf_l=\tilde{x}_i^kt_k^lx_l^j\tilde{f}_j

and we have the new matrix components \tilde{t}_i^j=\tilde{x}_i^kt_k^lx_l^j.
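Both rules are easy to verify numerically. Here is a short sketch in Python with NumPy (again my own illustration): writing X for the change-of-basis matrix, the two rules above say \tilde{v}=Xv and \tilde{t}=XtX^{-1}, the same conjugation we saw at the level of groups.

import numpy as np

rng = np.random.default_rng(0)

# A random change-of-basis matrix X (invertible with probability 1),
# plus the components of a vector v and a transformation t in the
# old basis {f_i}.
X = rng.standard_normal((3, 3))
t = rng.standard_normal((3, 3))
v = rng.standard_normal(3)

v_new = X @ v                      # v~^i = x_k^i v^k
t_new = X @ t @ np.linalg.inv(X)   # t~_i^j = x~_i^k t_k^l x_l^j

# Consistency check: apply t to v and convert the result, or convert
# both first and apply t_new; the answers must agree.
assert np.allclose(t_new @ v_new, X @ (t @ v))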

October 22, 2008 Posted by | Algebra, Group Examples, Linear Algebra | 12 Comments

The General Linear Groups

Not just any general linear group \mathrm{GL}(V) for any vector space V, but the particular groups \mathrm{GL}_n(\mathbb{F}). I can’t put LaTeX, or even HTML subscripts, in post titles, so this will have to do.

The general linear group \mathrm{GL}_n(\mathbb{F}) is the automorphism group of the vector space \mathbb{F}^n of n-tuples of elements of \mathbb{F}. That is, it’s the group of all invertible linear transformations sending this vector space to itself. The vector space \mathbb{F}^n comes equipped with a basis \{e_i\}, where e_i has a {1} in the ith place, and {0} elsewhere. And so we can write any such transformation as an n\times n matrix.

Let’s look at the matrix of some invertible transformation T:
\displaystyle\begin{pmatrix}t_1^1&t_2^1&\cdots&t_n^1\\t_1^2&t_2^2&\cdots&t_n^2\\\vdots&\vdots&\ddots&\vdots\\t_1^n&t_2^n&\cdots&t_n^n\end{pmatrix}

How does it act on a basis element? Well, let’s consider its action on e_1:
\displaystyle\begin{pmatrix}t_1^1&t_2^1&\cdots&t_n^1\\t_1^2&t_2^2&\cdots&t_n^2\\\vdots&\vdots&\ddots&\vdots\\t_1^n&t_2^n&\cdots&t_n^n\end{pmatrix}\begin{pmatrix}1\\{0}\\\vdots\\{0}\end{pmatrix}=\begin{pmatrix}t_1^1\\t_1^2\\\vdots\\t_1^n\end{pmatrix}

It just reads off the first column of the matrix of T. Similarly, T(e_i) will read off the ith column of the matrix of T. This works for any linear endomorphism of \mathbb{F}^n: its columns are the images of the standard basis vectors. But as we said last time, an invertible transformation must send a basis to another basis. So the columns of the matrix of T must form a basis for \mathbb{F}^n.
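A quick computational sanity check (my own sketch in Python with NumPy, with an arbitrary matrix standing in for the matrix of T):

import numpy as np

# An arbitrary matrix standing in for the matrix of T.
T = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 0.]])

# The standard basis vector e_i is column i of the identity matrix,
# and applying T to it reads off column i of T.
for i in range(3):
    e_i = np.eye(3)[:, i]
    assert np.array_equal(T @ e_i, T[:, i])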

Checking that they form a basis turns out to be a little easier in the special case we’re in. The vector space has dimension n, and we’ve got n column vectors to consider. If all n are linearly independent, then the column rank of the matrix is n. Then the dimension of the image of T is n, and thus T is surjective.

On the other hand, any vector T(v) in the image of T is a linear combination of the columns of the matrix of T (use the components of v as coefficients). If these columns are linearly independent, then the only combination adding up to the zero vector has all coefficients equal to {0}. And so T(v)=0 implies v=0, and T is injective.

Thus we only need to check that the columns of the matrix of T are linearly independent to know that T is invertible.

Conversely, say we’re given a list of n linearly independent vectors f_i in \mathbb{F}^n. They must be a basis, since any linearly independent set can be completed to a basis, and a basis of \mathbb{F}^n must have exactly n elements, which we already have. Then we can use the f_i as the columns of a matrix. The corresponding transformation T has T(e_i)=f_i, and extends from there by linearity. It sends a basis to a basis, and so must be invertible.

The upshot is that we can consider this group as a group of n\times n matrices: exactly those whose columns form a linearly independent set.
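Computationally, this criterion is just a rank check. Here is a minimal sketch (my own, for real matrices; np.linalg.matrix_rank is NumPy’s built-in rank computation):

import numpy as np

def is_invertible(M):
    # An n x n matrix lies in GL_n exactly when its n columns are
    # linearly independent, i.e. when it has full rank.
    return np.linalg.matrix_rank(M) == M.shape[0]

# Independent columns: an element of GL_2(R).
assert is_invertible(np.array([[1., 1.],
                               [0., 1.]]))

# The second column is twice the first: not invertible.
assert not is_invertible(np.array([[1., 2.],
                                   [2., 4.]]))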

October 20, 2008 Posted by | Algebra, Group Examples, Linear Algebra | 1 Comment

Braid groups

Okay, time for a group I really like.

Imagine you’re playing the shell game. You’re mixing up some shells on the surface of a table, and you can’t lift them up. How can you rearrange them? At first, you might think this is just a permutation group all over again, but not quite. Let’s move two shells around each other, taking a picture of the table each moment, and then stack those pictures like a flip-book movie. I’ve drawn just such a picture.
[Figure: Braid Crossing]

We read this movie from the bottom to the top. The shell on the right moves behind the shell on the left as they switch places. It could also have moved in front of the left shell, though, and that picture would show the paths crossing the other way. We don’t want to consider those two pictures the same.

So why don’t we want to? Because the paths those shells trace out look a lot like the strands of knots! The two dimensions of the table and one of time make a three-dimensional space we can use to embed knots. Since these pictures just have a bunch of strands running up and down, crossing over and under each other as they go, we call them “braids”. In fact, these movies form a group. We compose movies by running them one after another. We always bring the shells back to the n marked spots on the table (though possibly in a different order), so we can always start the next movie with no jumps. We get the identity by just leaving the shells alone. Finally, we can run a movie backwards to invert it.

There’s one such braid group B_n for each number n of shells. The first one, B_1, is trivial since there’s nothing to do: there’s no “braiding” going on with one strand. The second one, B_2, is just a copy of the integers again, counting how many twists like the one pictured above we’ve done. Count the opposite twist as -1. Notice that this is already different from the symmetric groups, where S_2 just has the two moves, “swap the letters” or “leave them alone”.

Beyond here the groups B_n and S_n get more and more different, but they’re also pretty tightly related. If we perform a braiding and then forget which direction we made each crossing we’re just left with a permutation. Clearly every permutation can arise from some braiding, so we have an epimorphism from B_n onto S_n. In fact, this shows up when we try to give a presentation of the braid group.

Recall that the symmetric group has presentation:

\langle s_1,\dots,s_{n-1}\mid s_i^2 (1\leq i\leq n-1), s_is_js_i^{-1}s_j^{-1} (|i-j|\geq 2), s_is_{i+1}s_is_{i+1}^{-1}s_i^{-1}s_{i+1}^{-1} (1\leq i\leq n-2)\rangle

The generator s_i swaps the contents of places i and i+1. The relations mean that swapping twice undoes a swap, that widely spaced swaps can be done in either order, and that s_is_{i+1}s_i=s_{i+1}s_is_{i+1}, a seemingly more confusing relation that’s at least easily verified for swaps. The braid group looks just like this, except now a twist is not its own inverse. So get rid of that first relation:

\langle s_1,\dots,s_{n-1}\mid s_is_js_i^{-1}s_j^{-1} (|i-j|\geq 2), s_is_{i+1}s_is_{i+1}^{-1}s_i^{-1}s_{i+1}^{-1} (1\leq i\leq n-2)\rangle

The fact that we get from the braid group to the symmetric group by adding relations reflects the fact that S_n is a quotient of B_n. It’s interesting to play with this projection and compute its kernel.
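We can at least watch the quotient side of this concretely. Here is a quick Python sketch (my own check, not from the post) that the images of the generators in S_5, the adjacent transpositions, satisfy all three relations of the presentation, including the s_i^2=e that the braid group discards:

def transposition(n, i):
    # The image of the braid generator s_i in S_n: the permutation
    # swapping places i and i+1, stored as a tuple of images (0-indexed).
    p = list(range(n))
    p[i - 1], p[i] = p[i], p[i - 1]
    return tuple(p)

def compose(p, q):
    # (p compose q)(k) = p(q(k)): apply q first, then p.
    return tuple(p[q[k]] for k in range(len(p)))

n = 5
e = tuple(range(n))
s = {i: transposition(n, i) for i in range(1, n)}

# The relation we discard to get the braid group: s_i^2 = e.
assert all(compose(s[i], s[i]) == e for i in s)

# Widely spaced swaps commute: s_i s_j = s_j s_i when |i - j| >= 2.
assert all(compose(s[i], s[j]) == compose(s[j], s[i])
           for i in s for j in s if abs(i - j) >= 2)

# The braid relation: s_i s_{i+1} s_i = s_{i+1} s_i s_{i+1}.
assert all(compose(compose(s[i], s[i + 1]), s[i])
           == compose(compose(s[i + 1], s[i]), s[i + 1])
           for i in range(1, n - 1))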

March 14, 2007 Posted by | Algebra, Group Examples, Group theory, Knot theory | 8 Comments

A difficult exercise

While proctoring the make-up exam for my class, I thought of an exercise related to my group theory posts on direct and free products that should cause even the more experienced mathematicians in the audience a bit of difficulty.

Consider four groups A_1, A_2, B_1, and B_2, and four homomorphisms:

  • f_{1,1}:A_1\rightarrow B_1
  • f_{1,2}:A_1\rightarrow B_2
  • f_{2,1}:A_2\rightarrow B_1
  • f_{2,2}:A_2\rightarrow B_2

Use these to construct two homomorphisms from A_1*A_2 to B_1\times B_2, and show that they’re the same.

March 1, 2007 Posted by | Algebra, Group Examples, Group theory, Universal Properties | 2 Comments

The Airplane Seat solution

And now, the solution to the Airplane Seat problem.

Obviously there’s a 1% probability that the first passenger chooses his own seat, in which case the last one automatically gets his. Then there’s a 1% chance that the first passenger chooses the second passenger’s seat, and then she has a 1/99 chance of choosing her own, or a 100% chance of this all getting way too complicated.

So let’s look for something not quite so obvious. There are various ways of solving the problem, but the one I’m interested in uses the fact that what we’re really looking at is a permutation! Specifically, the first passenger and all the passengers displaced along the way form a single cycle:

(1\,a_2\,a_3\,...\,a_n)

Here a_2 is the seat the first passenger picks, a_3 is the seat the displaced passenger a_2 picks, and so on, until some displaced passenger picks seat 1 and closes the cycle. The sequence of a_i only increases, since when passenger a_i boards, the only empty seats are seat 1 and the seats of passengers who board later. We need the probability that a_n is not 100.

Let’s say that it is. Then we have the cycle (1\,a_2\,a_3\,...\,a_{n-1}\,100). We can stick the transposition (a_{n-1}\,100) after it to give (1\,a_2\,a_3\,...\,a_{n-1}). This is a valid cycle that doesn’t end with 100.

What if we have a cycle that doesn’t end with 100? If we write it as (1\,a_2\,a_3\,...\,a_{n-1}) we can again stick the transposition (a_{n-1}\,100) after it. Now this gives (1\,a_2\,a_3\,...\,a_{n-1}\,100), which does end with 100.

Even better, if we start with any cycle, ending with 100 or not, and do both transformations, we get back the exact same cycle we started with. We say that the two transformations are inverses of each other, which just means that each undoes what the other does. A fancier word for an invertible function between two sets is “bijection”.

So we’ve split all valid sequences into ones where 100 gets his own seat and ones where he doesn’t, and shown that there’s a bijection between them. That means that the two sets have the same size! Even better, matched sequences occur with the same probability, since the paired cycles differ only in whether the last displaced passenger picks seat 1 or seat 100 out of the same set of remaining empty seats. And so in exactly half the cases 100 will get his own seat.

Like Euclid used to say, Q.E.D.
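And for the skeptical, here is a quick Monte Carlo sketch in plain Python (my own check; the seat count and trial count are arbitrary choices): simulating the boarding process directly, the observed fraction hovers around the 1/2 we just derived.

import random

def last_gets_own_seat(n=100):
    # One run of the boarding process; True if passenger n ends
    # up in seat n.
    empty = list(range(1, n + 1))
    empty.remove(random.choice(empty))          # passenger 1 sits anywhere
    for p in range(2, n):                       # passengers 2 through n-1
        if p in empty:
            empty.remove(p)                     # own seat is free
        else:
            empty.remove(random.choice(empty))  # displaced: pick at random
    return empty == [n]                         # the one seat left

trials = 10_000
hits = sum(last_gets_own_seat() for _ in range(trials))
print(hits / trials)  # hovers around 0.5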

February 5, 2007 Posted by | Algebra, Group Examples, Group theory | 13 Comments

The Airplane Seat problem

I thought I’d drop a little problem to think about for a while. If you’ve seen it before, please don’t ruin it for those who haven’t by posting a solution in the comments.

Let’s imagine an airplane with 100 seats. Instead of the cattle-call boarding procedures we know, passengers board the plane one at a time. On this flight, the first passenger has lost his ticket, so he just picks a seat at random to sit in. When the second passenger boards, if her own seat is available she sits in it. If not, she picks a random empty seat. This repeats until the plane fills. What is the probability that the 100th passenger gets to sit in his own seat?

Assume that all the random choices are made with equal probability of picking any currently-empty seat.

February 4, 2007 Posted by | Algebra, Group Examples, Group theory | 3 Comments
