The Unapologetic Mathematician

Mathematics for the interested outsider

Isomorphisms of Vector Spaces

Okay, after that long digression into power series and such, I’m coming back to linear algebra. What we want to talk about now is how two vector spaces can be isomorphic. Of course, this means that they are connected by an invertible linear transformation (one which preserves the addition and scalar multiplication operations):

T:V\rightarrow W
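
Concretely, we can take V=W=\mathbb{R}^2 and let a matrix stand in for T. Here’s a minimal NumPy sketch (the matrix is made up purely for illustration) checking both that the map preserves the operations and that it has an inverse:

```python
import numpy as np

# A hypothetical invertible linear map T: R^2 -> R^2, represented by a matrix.
T = np.array([[2.0, 1.0],
              [1.0, 1.0]])

v, w = np.array([1.0, -3.0]), np.array([0.5, 2.0])
a, b = 3.0, -1.5

# T preserves addition and scalar multiplication: T(a v + b w) = a T(v) + b T(w).
assert np.allclose(T @ (a * v + b * w), a * (T @ v) + b * (T @ w))

# T is invertible, so it is an isomorphism of R^2 with itself.
T_inv = np.linalg.inv(T)
assert np.allclose(T_inv @ (T @ v), v)
```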

First off, to be invertible the kernel of T must be trivial. Otherwise we’d have two distinct vectors in V mapping to the same vector in W, and we wouldn’t be able to tell which one a given image came from in order to invert the map. Similarly, the cokernel of T must be trivial, or we’d have missed some vectors in W, and we couldn’t tell where in V to send them under the inverse map. This tells us that the index of an isomorphism must be zero, and thus that the vector spaces must have the same dimension. It may seem obvious that isomorphic vector spaces have to have the same dimension, but you can’t be too careful.
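
To see the dimension count in action, here’s a small NumPy sketch (using an arbitrary 3-by-2 matrix as an example) computing the dimensions of the kernel and cokernel from the rank, along with the resulting index:

```python
import numpy as np

# A hypothetical 3x2 matrix: a linear map from R^2 into R^3.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

rank = np.linalg.matrix_rank(A)
dim_kernel = A.shape[1] - rank      # nullity: dimension of the kernel
dim_cokernel = A.shape[0] - rank    # dimension of the cokernel

print(dim_kernel, dim_cokernel)     # 0, 1: injective but not surjective

# The index (dim kernel - dim cokernel) is zero exactly when the map can be
# an isomorphism, which forces the two spaces to have the same dimension.
index = dim_kernel - dim_cokernel
print(index)                        # -1 here, so A is not an isomorphism
```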

Next we note that an isomorphism sends bases to bases. That is, if \{e_i\} is a basis for V, then the collection of f_i=T(e_i) will form a basis for W.

Since T is surjective, given any w\in W there is some v\in V with T(v)=w. But v=v^ie_i uniquely (remember the summation convention) because the e_i form a basis. Then w=T(v)=T(v^ie_i)=v^iT(e_i)=v^if_i, and so we have an expression of w as a linear combination of the f_i. The collection \{f_i\} thus spans W.

On the other hand, if we have a linear combination 0=x^if_i, then we can write 0=x^iT(e_i)=T(x^ie_i). Since T is injective we find x^ie_i=0, and thus each x^i=0, since the e_i form a basis. Thus the spanning set \{f_i\} is linearly independent, and thus forms a basis.
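
As a sanity check, here’s a short NumPy sketch (again with a made-up invertible matrix) verifying that the images f_i=T(e_i) of the standard basis vectors are linearly independent and span the space, so that any w can be written as w=x^if_i:

```python
import numpy as np

# A hypothetical invertible map on R^3.
T = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
assert not np.isclose(np.linalg.det(T), 0.0)   # T is invertible

# The images of the standard basis vectors e_i are just the columns of T.
F = T @ np.eye(3)          # columns are f_i = T(e_i)

# They are linearly independent and span R^3: the matrix [f_1 f_2 f_3]
# has full rank, so {f_i} is again a basis.
assert np.linalg.matrix_rank(F) == 3

# Any w in R^3 is a combination w = x^i f_i; solve for the coefficients x.
w = np.array([3.0, -1.0, 2.0])
x = np.linalg.solve(F, w)
assert np.allclose(F @ x, w)
```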

The converse, it turns out, is also true. If \{e_i\} is a basis of V, and \{f_i\} is a basis of W, then the map T defined by T(e_i)=f_i (and extending by linearity) is an isomorphism. Indeed, we can define an inverse straight away: T^{-1}(f_i)=e_i, and extend by linearity. Both composites fix a basis, and since a linear map is determined by its values on a basis, each composite is the identity.
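
A quick NumPy sketch (with two arbitrary bases of \mathbb{R}^2 standing in for \{e_i\} and \{f_i\}) builds the matrix of the map sending e_i to f_i and confirms that its inverse sends each f_i back to e_i:

```python
import numpy as np

# Two hypothetical bases of R^2, stored as the columns of E and F.
E = np.array([[1.0, 1.0],
              [0.0, 1.0]])
F = np.array([[2.0, 0.0],
              [1.0, 3.0]])

# The map defined by T(e_i) = f_i, extended by linearity, has matrix
# T = F E^{-1}: it sends each column of E to the corresponding column of F.
T = F @ np.linalg.inv(E)
assert np.allclose(T @ E, F)

# Its inverse sends each f_i back to e_i, so T is an isomorphism.
T_inv = np.linalg.inv(T)
assert np.allclose(T_inv @ F, E)
```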

The upshot of these facts is that two vector spaces are isomorphic exactly when they have the same dimension. That is, just the same way that the cardinality of a set determines its isomorphism class in the category of sets, the dimension of a vector space determines its isomorphism class in the category of vector spaces.

Now let’s step back and consider what happens in any category and throw away all the morphisms that aren’t invertible. We’re left with a groupoid, and like any groupoid it falls apart into a bunch of “connected” pieces: the isomorphism classes. In this case, the isomorphism classes are given by the dimensions of the vector spaces.

Each of these connected pieces, then, is equivalent (as a groupoid) to the automorphism group of any one of its objects, and all of these automorphism groups are isomorphic to each other. In this case, we have a name for these automorphism groups.

Given any vector space V, all the interesting information about isomorphisms to or from this space can be summed up in the “general linear group” of V, which consists of all invertible linear maps from V to itself. We write this automorphism group as \mathrm{GL}(V).

We have a special name in the case when V is the vector space \mathbb{F}^n of n-tuples of elements of the base field \mathbb{F}. In this case we write the general linear group as \mathrm{GL}(n,\mathbb{F}) or as \mathrm{GL}_n(\mathbb{F}). Since every finite-dimensional vector space over \mathbb{F} is isomorphic to one of these (specifically, the one with n=\dim(V)), we have \mathrm{GL}(V)\cong\mathrm{GL}(n,\mathbb{F}). These particular general linear groups are thus extremely important for understanding isomorphisms of finite-dimensional vector spaces. We’ll investigate these groups as we move forward.
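
As a small illustration of the group structure (a sketch only, using random real matrices as examples), the invertible n\times n matrices are closed under composition, contain the identity, and contain inverses:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_invertible(n):
    """Sample a matrix in GL(n, R); a random real matrix is invertible with probability 1."""
    while True:
        M = rng.normal(size=(n, n))
        if not np.isclose(np.linalg.det(M), 0.0):
            return M

A, B = random_invertible(3), random_invertible(3)

# Closure: the composite of two invertible maps is invertible,
# since det(AB) = det(A) det(B) is nonzero.
assert not np.isclose(np.linalg.det(A @ B), 0.0)

# Identity and inverses: GL(3, R) contains I and A^{-1} for each A.
I = np.eye(3)
assert np.allclose(A @ np.linalg.inv(A), I)
```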


October 17, 2008 - Posted by | Algebra, Linear Algebra

11 Comments »

  1. I’ve just started rereading my Lin Alg book from … a decade ago(?) and I could actually sorta understand most of this despite only having just reached the chapter on matrices. I’ll need to follow those links to “index” and “groupoid”, though (never heard of the latter). Have to check “cokernel”, too.

    Seems to be an alternative way of saying “surjective”.

    Comment by Sili | October 19, 2008 | Reply

  2. A map with a trivial cokernel is surjective, just like a map with a trivial kernel is injective, yes.

    Comment by John Armstrong | October 19, 2008 | Reply

  3. Your hints toward category theory are great in this discussion. As an undergraduate, this discussion is completely accessible to me – thank you. Also, I have looked at the morphisms of vector spaces and modules that you mentioned, and have attempted to attribute their structure preserving qualities to the general module’s underlying algebraic structure (the Abelian group of vectors, and the operator set; a ring or a field). Linear Algebra is a great field, and I look forward to exploring its extensions as I move ahead in analysis in the reals, and later in function spaces.

    Comment by tom | October 20, 2008 | Reply

  4. [...] for any linear endomorphism of : its columns are the images of the standard basis vectors. But as we said last time, an invertible transformation must send a basis to another basis. So the columns of the matrix of [...]

    Pingback by The General Linear Groups « The Unapologetic Mathematician | October 20, 2008 | Reply

  5. [...] We’ve now got the general linear group of all invertible linear maps from a vector space to itself. Incidentally this lives inside the [...]

    Pingback by Group Representations « The Unapologetic Mathematician | October 23, 2008 | Reply

  6. [...] Special Linear Group (and others) We’ve got down the notion of the general linear group of a vector space , including the particular case of the matrix group of the space . We also have [...]

    Pingback by The Special Linear Group (and others) « The Unapologetic Mathematician | September 8, 2009 | Reply

  7. [...] The shears alone generate the special linear group. Can we strip them down any further? And, with this in mind, how many generators does it take to build up the whole general linear group? [...]

    Pingback by How Many Generators? « The Unapologetic Mathematician | September 11, 2009 | Reply

  8. Please give a simple definition of an isomorphism of vector spaces; I really need it.

    Comment by hina | April 20, 2011 | Reply

  9. [...] excerpt from Link: Isomorphisms of Vector Spaces [...]

    Pingback by how two vector spaces can be isomorphic | WeiYao's Blog | April 3, 2012 | Reply

  10. Question: Are bases sent to bases if the vector spaces are infinite-dimensional? Thank you!!

    Comment by Lena | September 17, 2012 | Reply

    • Yes, it’s true (if anyone but me wanted to know :-)

      Comment by Lena | September 18, 2012 | Reply

