# The Unapologetic Mathematician

## Upper-Triangular Matrices

Until further notice, I’ll be assuming that the base field $\mathbb{F}$ is algebraically closed, like the complex numbers $\mathbb{C}$.

What does this assumption buy us? It says that the characteristic polynomial of a linear transformation $T$ is — like any nonconstant polynomial over an algebraically closed field — guaranteed to have a root. Thus any linear transformation $T$ has an eigenvalue $\lambda_1$, as well as a corresponding eigenvector $e_1$ satisfying

$T(e_1)=\lambda_1e_1$
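As a quick numerical sanity check (using numpy, with a small matrix chosen purely for illustration), we can exhibit such a pair $\lambda_1$, $e_1$ over $\mathbb{C}$:

```python
import numpy as np

# A sample transformation on C^3; the matrix is chosen just for illustration.
T = np.array([[2, 1, 0],
              [0, 3, 1],
              [1, 0, 1]], dtype=complex)

# Over C the characteristic polynomial always has a root, so there is
# always at least one eigenvalue/eigenvector pair.
eigenvalues, eigenvectors = np.linalg.eig(T)
lam1 = eigenvalues[0]
e1 = eigenvectors[:, 0]

# T(e_1) = lambda_1 e_1, up to floating-point error.
assert np.allclose(T @ e1, lam1 * e1)
```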

So let’s pick an eigenvector $e_1$ and take the subspace $\mathbb{F}e_1\subseteq V$ it spans. We can form the quotient space $V/\mathbb{F}e_1$, and $T$ descends to a linear map on it. Why? Because if we take two representatives $v,w\in V$ of the same vector in the quotient space, then $w=v+ce_1$ for some scalar $c$. Then we find

$T(w)=T(v+ce_1)=T(v)+cT(e_1)=T(v)+c\lambda_1e_1$

which represents the same vector as $T(v)$.
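Here is a small numerical check of this well-definedness (numpy again; the matrix, $v$, and $c$ are arbitrary choices of mine): shifting a representative by $ce_1$ shifts its image by $c\lambda_1 e_1$, which is again a multiple of $e_1$ and so vanishes in the quotient.

```python
import numpy as np

T = np.array([[2, 1, 0], [0, 3, 1], [1, 0, 1]], dtype=complex)
lam, V = np.linalg.eig(T)
lam1, e1 = lam[0], V[:, 0]

v = np.array([1, 2, 3], dtype=complex)  # an arbitrary representative
c = 5.0
w = v + c * e1                          # another representative of the same class

# T(w) - T(v) = c * lam1 * e1: a multiple of e1, hence zero in V / F e1.
assert np.allclose(T @ w - T @ v, c * lam1 * e1)
```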

Now the map induced by $T$ on $V/\mathbb{F}e_1$ is another linear endomorphism of a vector space over an algebraically closed field, so its characteristic polynomial must have a root, giving an eigenvalue $\lambda_2$ with associated eigenvector $e_2$. But let’s be careful: does this mean that $e_2$ is an eigenvector of $T$? Not quite. All we know is that

$T(e_2)=\lambda_2e_2+c_{1,2}e_1$

since vectors in the quotient space are only defined up to multiples of $e_1$.

We can proceed like this, pulling off one vector $e_i$ after another. Each time we find

$T(e_i)=\lambda_ie_i+c_{i-1,i}e_{i-1}+c_{i-2,i}e_{i-2}+\dots+c_{1,i}e_1$

The image of $e_i$ under $T$ is $\lambda_i$ times $e_i$ itself, plus a linear combination of the earlier vectors. Further, each $e_i$ is linearly independent of the ones that came before: if it weren’t, it would represent the zero vector in its quotient space, and eigenvectors are nonzero by definition. This procedure only grinds to a halt when the number of vectors equals the dimension of $V$, for then the quotient space is trivial, and the linearly independent collection $\{e_i\}$ spans $V$. That is, we’ve come up with a basis.
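The whole procedure can be sketched recursively in numpy: pick an eigenvector, extend it to a basis, read off the induced map on the quotient as the lower-right block, and recurse. This is a numerical illustration under floating-point arithmetic, not a production algorithm; the matrix and function name are my own.

```python
import numpy as np

def triangularizing_basis(T):
    """Columns e_1, ..., e_d of the returned matrix realize the procedure
    above: T maps each e_i into the span of e_1, ..., e_i."""
    d = T.shape[0]
    if d == 1:
        return np.eye(1, dtype=complex)
    # An eigenvector e_1 exists because C is algebraically closed.
    _, V = np.linalg.eig(T)
    e1 = V[:, [0]]
    # Extend e_1 to a basis of C^d (the columns of Q, via a complete QR).
    Q, _ = np.linalg.qr(e1, mode='complete')
    # In this basis T is block-triangular: [[lam_1, *], [0, T_quot]],
    # where T_quot is the map induced on the quotient C^d / C e_1.
    B = Q.conj().T @ T @ Q
    P = np.eye(d, dtype=complex)
    P[1:, 1:] = triangularizing_basis(B[1:, 1:])  # recurse on the quotient
    return Q @ P

T = np.array([[2, 1, 0], [0, 3, 1], [1, 0, 1]], dtype=complex)
P = triangularizing_basis(T)
U = np.linalg.inv(P) @ T @ P
assert np.allclose(np.tril(U, -1), 0, atol=1e-8)  # below the diagonal: zeros
```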

So, what does $T$ look like in this basis? Look at the expansion above. We can set $t_i^j=c_{i,j}$ for all $i<j$. When $i=j$ we set $t_i^i=\lambda_i$. And in the remaining cases, where $i>j$, we set $t_i^j=0$. That is, the matrix looks like

$\displaystyle\begin{pmatrix}\lambda_1&&*\\&\ddots&\\{0}&&\lambda_d\end{pmatrix}$

where the star above the diagonal indicates unknown matrix entries, and the zero below the diagonal indicates that all the entries in that region are zero. We call such a matrix “upper-triangular”, since its only nonzero entries are on or above the diagonal. What we’ve shown here is that over an algebraically closed field, any linear transformation has a basis with respect to which its matrix is upper-triangular. This is an important first step towards classifying these transformations.
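In numerical practice this is exactly what the complex Schur decomposition computes: a unitary basis in which the matrix is upper-triangular. As a cross-check (assuming scipy is available; the matrix is again just an example of mine):

```python
import numpy as np
from scipy.linalg import schur

# Any complex square matrix is unitarily similar to an upper-triangular one:
# T = Z U Z^H with Z unitary and U upper-triangular (Schur decomposition).
T = np.array([[2, 1, 0], [0, 3, 1], [1, 0, 1]], dtype=complex)
U, Z = schur(T, output='complex')

assert np.allclose(Z @ U @ Z.conj().T, T)  # similar to T
assert np.allclose(np.tril(U, -1), 0)      # upper-triangular
```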

February 2, 2009 - Posted by | Algebra, Linear Algebra

1. [...] a vector space over an algebraically-closed field has a basis with respect to which its matrix is upper-triangular. That is, it looks [...]

Pingback by The Determinant of an Upper-Triangular Matrix « The Unapologetic Mathematician | February 3, 2009 | Reply

2. [...] pick a basis and associate a matrix to each of these linear transformations. It turns out that the upper-triangular matrices form a [...]

Pingback by The Algebra of Upper-Triangular Matrices « The Unapologetic Mathematician | February 5, 2009 | Reply

3. [...] Even better than upper-triangular matrices are diagonal matrices. These ones look [...]

Pingback by Diagonal Matrices « The Unapologetic Mathematician | February 9, 2009 | Reply

4. [...] matrix is upper-triangular, and so we can just read off its eigenvalues from the diagonal: two copies of the eigenvalue . We [...]

Pingback by Repeated Eigenvalues « The Unapologetic Mathematician | February 11, 2009 | Reply

5. [...] generally, consider a strictly upper-triangular matrix, all of whose diagonal entries are zero as [...]

Pingback by Generalized Eigenvectors « The Unapologetic Mathematician | February 16, 2009 | Reply

6. [...] capture the right notion. In that example, the -eigenspace has dimension , but it seems from the upper-triangular matrix that the eigenvalue should have multiplicity [...]

Pingback by The Multiplicity of an Eigenvalue « The Unapologetic Mathematician | February 19, 2009 | Reply

7. [...] to is the multiplicity of , which is the number of times shows up on the diagonal of an upper-triangular matrix for . Since the total number of diagonal entries is , we see that the dimensions of all the [...]

Pingback by Jordan Normal Form « The Unapologetic Mathematician | March 4, 2009 | Reply

8. [...] polynomial had a root. Applying this to the characteristic polynomial of a linear transformation, we found that it must have a root, which would by definition be an eigenvalue of the transformation. There [...]

Pingback by Real Invariant Subspaces « The Unapologetic Mathematician | March 31, 2009 | Reply

9. [...] Upper-Triangular Matrices Over an algebraically closed field we can always find an upper-triangular matrix for any linear endomorphism. Over the real numbers we’re not quite so lucky, but we can come [...]

Pingback by Almost Upper-Triangular Matrices « The Unapologetic Mathematician | April 1, 2009 | Reply

10. [...] be a linear map from to itself. Further, let be a basis with respect to which the matrix of is upper-triangular. It turns out that we can also find an orthonormal basis which also gives us an upper-triangular [...]

Pingback by Upper-Triangular Matrices and Orthonormal Bases « The Unapologetic Mathematician | May 8, 2009 | Reply

11. no need of such material

Comment by shasha | May 21, 2009 | Reply

12. Oh I’m so sorry that I chose to cover a topic you see as unnecessary. I’ll be sure to run all my future topics by you first.

Comment by John Armstrong | May 21, 2009 | Reply

• Useful article. Many books just prove this by induction without any explanation

Comment by vish | October 26, 2013 | Reply

13. Surely in the first equation you meant to write
T(e_1) = lambda_1 * e_1

Comment by A Khan | June 17, 2009 | Reply

14. Yes, sorry. Thanks for catching that.

Comment by John Armstrong | June 17, 2009 | Reply

15. [...] with a complex transformation we’re done. We can pick a basis so that the matrix for is upper-triangular, and then its determinant is the product of its eigenvalues. Since the eigenvalues are all [...]

Pingback by The Determinant of a Positive-Definite Transformation « The Unapologetic Mathematician | August 3, 2009 | Reply

16. [...] Self-Adjoint Transformation has an Eigenvector Okay, this tells us nothing in the complex case, but for real transformations we have no reason to assume that a given [...]

Pingback by Every Self-Adjoint Transformation has an Eigenvector « The Unapologetic Mathematician | August 12, 2009 | Reply

17. [...] nilpotent transformation, so all of its eigenvalues are . Specifically, we want those that are also upper-triangular. Thus the matrices we’re talking about have everywhere below the diagonal and all on the [...]

Pingback by Subgroups Generated by Shears « The Unapologetic Mathematician | August 28, 2009 | Reply

18. [...] in a given row is to the right of the leftmost nonzero entry in the row above it. For example, an upper-triangular matrix is in row echelon form. We put a matrix into row echelon form by a method called [...]

Pingback by Row Echelon Form « The Unapologetic Mathematician | September 1, 2009 | Reply

19. [...] the difference between and is some scalar multiple of . On the other hand, remember how we found upper-triangular matrices before. This time we peeled off one vector and the remaining transformation was the identity on the [...]

Pingback by A Lemma on Reflections « The Unapologetic Mathematician | January 19, 2010 | Reply