# The Unapologetic Mathematician

## General Linear Groups — Generally

Monday, we saw that the general linear groups $\mathrm{GL}_n(\mathbb{F})$ are matrix groups, consisting of those matrices whose columns are linearly independent. But what about more general vector spaces?

Well, we know that every finite-dimensional vector space has a basis, and is thus isomorphic to $\mathbb{F}^n$, where $n$ is the cardinality of the basis. So given a vector space $V$ with a basis $\{f_i\}$ of cardinality $n$, we have the isomorphism $S:\mathbb{F}^n\rightarrow V$ defined by $S(e_i)=f_i$ and $S^{-1}(f_i)=e_i$.

This isomorphism of vector spaces then induces an isomorphism of their automorphism groups. That is, $\mathrm{GL}(V)\cong\mathrm{GL}_n(\mathbb{F})$. Given an invertible linear transformation $T:V\rightarrow V$, we can conjugate it by $S$ to get $S^{-1}TS:\mathbb{F}^n\rightarrow\mathbb{F}^n$. This has inverse $S^{-1}T^{-1}S$, and so is an element of $\mathrm{GL}_n(\mathbb{F})$. Thus (not unexpectedly) every invertible linear transformation from a vector space $V$ to itself is assigned an invertible matrix.

But this assignment depends essentially on the arbitrary choice of the basis $\{f_i\}$ for $V$. What if we choose a different basis $\{\tilde{f}_i\}$? Then we get a new isomorphism $\tilde{S}$ and a new isomorphism of groups $T\mapsto\tilde{S}^{-1}T\tilde{S}$. The two assignments differ by an inner automorphism of $\mathrm{GL}_n(\mathbb{F})$. Indeed, a matrix $M:\mathbb{F}^n\rightarrow\mathbb{F}^n$ corresponds to the transformation $SMS^{-1}$ of $V$, whose matrix with respect to the new basis is
$\tilde{S}^{-1}SMS^{-1}\tilde{S}=\left(\tilde{S}^{-1}S\right)M\left(\tilde{S}^{-1}S\right)^{-1}:\mathbb{F}^n\rightarrow\mathbb{F}^n$
This composite $\tilde{S}^{-1}S$ sends $\mathbb{F}^n$ to itself, and it has an inverse. Thus changing the basis on $V$ induces an inner automorphism of the matrix group $\mathrm{GL}_n(\mathbb{F})$.

Now let’s consider a linear transformation $T:V\rightarrow V$. We have two bases for $V$, and thus two different matrices — two different elements of $\mathrm{GL}_n(\mathbb{F})$ — corresponding to $T$: $S^{-1}TS$ and $\tilde{S}^{-1}T\tilde{S}$. We get from one to the other by conjugation with $\tilde{S}^{-1}S$:

$\left(\tilde{S}^{-1}S\right)S^{-1}TS\left(\tilde{S}^{-1}S\right)^{-1}=\tilde{S}^{-1}SS^{-1}TSS^{-1}\tilde{S}=\tilde{S}^{-1}T\tilde{S}$
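This identity is easy to check numerically. Here is a small sanity check using numpy, taking $V$ concretely to be $\mathbb{R}^3$ so that $S$ and $\tilde{S}$ are just invertible matrices (the random matrices below are illustrative choices, invertible with probability one):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

# S and S_t play the roles of the isomorphisms S, S~ : F^n -> V,
# concretely invertible matrices whose columns are the two bases of R^3.
S = rng.standard_normal((n, n))
S_t = rng.standard_normal((n, n))
T = rng.standard_normal((n, n))  # a transformation of V = R^3

C = np.linalg.inv(S_t) @ S  # the composite S~^{-1} S

# Conjugating the matrix of T in the first basis by S~^{-1} S
# yields the matrix of T in the second basis.
lhs = C @ np.linalg.inv(S) @ T @ S @ np.linalg.inv(C)
rhs = np.linalg.inv(S_t) @ T @ S_t
print(np.allclose(lhs, rhs))  # True
```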

And what is this transformation $\tilde{S}^{-1}S$? How does it act on a basis vector in $\mathbb{F}^n$? We calculate:
$\tilde{S}^{-1}(S(e_j))=\tilde{S}^{-1}(f_j)=\tilde{S}^{-1}(x_j^i\tilde{f}_i)=x_j^i\tilde{S}^{-1}(\tilde{f}_i)=x_j^ie_i$
where $f_j=x_j^i\tilde{f}_i$ expresses the vectors in one basis for $V$ in terms of those of the other. That is, the $j$th column of the matrix $X$ consists of the components of $f_j$ written in terms of the $\tilde{f}_i$. Similarly, the inverse matrix $X^{-1}$, with entries $\tilde{x}_i^j$, writes the $\tilde{f}_i$ in terms of the $f_j$: $\tilde{f}_i=\tilde{x}_i^jf_j$.
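To make the construction of $X$ concrete, here is a sketch in numpy with two small made-up bases of $\mathbb{R}^2$, stored as the columns of matrices `F` and `F_t`. Since column $j$ of $X$ holds the components of $f_j$ in the $\tilde{f}$ basis, the bases satisfy $F=\tilde{F}X$:

```python
import numpy as np

# Two bases of V = R^2, stored as the columns of F and F_t.
F = np.array([[1.0, 1.0],
              [0.0, 1.0]])      # columns: f_1, f_2
F_t = np.array([[2.0, 0.0],
                [1.0, 1.0]])    # columns: f~_1, f~_2

# Column j of X holds the components of f_j in the f~ basis,
# so F = F_t @ X, i.e. X = F_t^{-1} F.
X = np.linalg.solve(F_t, F)

# Check: reassembling each f_j from the f~_i via X recovers F.
print(np.allclose(F_t @ X, F))  # True
```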

It is these “change-of-basis” matrices that effect all of our, well, changes of basis. For example, say we have a vector $v\in V$ with components $v=v^jf_j$. Then we can expand this:

$v=v^jf_j=v^k\delta_k^jf_j=v^kx_k^i\tilde{x}_i^jf_j=\left(x_k^iv^k\right)\tilde{f}_i$

So our components in the new basis are $\tilde{v}^i=x_k^iv^k$.
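Continuing the sketch above (same assumed bases `F` and `F_t`), we can verify that the transformed components $\tilde{v}^i=x_k^iv^k$ really do describe the same vector of $V$:

```python
import numpy as np

F = np.array([[1.0, 1.0], [0.0, 1.0]])    # columns: f_1, f_2
F_t = np.array([[2.0, 0.0], [1.0, 1.0]])  # columns: f~_1, f~_2
X = np.linalg.solve(F_t, F)               # change-of-basis matrix

v = np.array([3.0, -1.0])  # components v^j in the f basis
v_t = X @ v                # components v~^i = x_k^i v^k in the f~ basis

# Both component vectors assemble to the same element of V.
print(np.allclose(F @ v, F_t @ v_t))  # True
```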

As another example, say that we have a linear transformation $T:V\rightarrow V$ with matrix components $t_i^j$ with respect to the basis $\{f_i\}$. That is, $T(f_i)=t_i^jf_j$. Then we can calculate:

$T(\tilde{f}_i)=T(\tilde{x}_i^kf_k)=\tilde{x}_i^kT(f_k)=\tilde{x}_i^kt_k^lf_l=\tilde{x}_i^kt_k^lx_l^j\tilde{f}_j$

and we have the new matrix components $\tilde{t}_i^j=\tilde{x}_i^kt_k^lx_l^j$.
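In matrix form, with column $i$ of the matrix $A$ holding the components $t_i^j$ of $T(f_i)$, the component formula $\tilde{t}_i^j=\tilde{x}_i^kt_k^lx_l^j$ is just conjugation: $\tilde{A}=XAX^{-1}$. A numerical check, again with the same assumed bases:

```python
import numpy as np

F = np.array([[1.0, 1.0], [0.0, 1.0]])    # columns: f_1, f_2
F_t = np.array([[2.0, 0.0], [1.0, 1.0]])  # columns: f~_1, f~_2
X = np.linalg.solve(F_t, F)               # change-of-basis matrix

# Matrix of T in the f basis: column i holds the components t_i^j of T(f_i).
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Matrix of T in standard coordinates, then its matrix in the f~ basis.
M = F @ A @ np.linalg.inv(F)
A_t = np.linalg.inv(F_t) @ M @ F_t

# This agrees with conjugating A by the change-of-basis matrix.
print(np.allclose(A_t, X @ A @ np.linalg.inv(X)))  # True
```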

October 22, 2008
