The Unapologetic Mathematician

Mathematics for the interested outsider

Nilpotent Transformations I

For today I want to consider a single transformation N:V\rightarrow V whose only eigenvalue is {0}. That is, N^n=0 for all sufficiently large powers n. We’ve seen that d=\dim(V) is always a large enough power to check: N is nilpotent if and only if N^d=0. We’ve also seen that N has an upper-triangular matrix with all zeroes along the diagonal, reflecting the single eigenvalue {0} with multiplicity d. This sort of “nil-potent” transformation (because a power of it is the null transformation) is especially interesting because if we take any transformation, restrict it to a generalized eigenspace, and subtract the eigenvalue times the identity transformation, what’s left is nilpotent. This will be important soon.
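As a quick sanity check, here is a minimal numerical sketch (my own illustration, not from the post; it assumes NumPy and uses an arbitrary strictly upper-triangular matrix) confirming that d applications always suffice:

    # A minimal sketch: any strictly upper-triangular matrix is nilpotent,
    # and d = dim(V) applications always bring it to zero.
    import numpy as np

    d = 4
    rng = np.random.default_rng(0)
    N = np.triu(rng.standard_normal((d, d)), k=1)  # zeroes on and below the diagonal

    # N^d is exactly the zero matrix: every product of d strictly
    # upper-triangular factors vanishes.
    print(np.allclose(np.linalg.matrix_power(N, d), 0))  # True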

The essential thing about N is that its kernel gets bigger and bigger as we raise it to higher and higher powers, until it swallows up the whole vector space. This much we knew, but we’re going to follow it closely, so we can describe exactly how it slips away. What we need first is a lemma: Let k be the power where N^k first equals the zero transformation. If U\subseteq V is a subspace that intersects the kernel of N^{k-1} trivially — so N doesn’t kill off anything in U until the kth (and last) iteration — then we can build the subspace U+NU+N^2U+...+N^{k-1}U. This subspace is invariant under N, since applying N just pushes everything down the line. The lemma will assert that there is another invariant subspace W so that

\displaystyle V=(U+NU+...+N^{k-1}U)\oplus W
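Before diving into the proof, here is a small illustration of the setup (again my own sketch, assuming NumPy; the particular matrix and the choice of U are assumptions made for the example). It shows the kernels of the powers N^j growing until they fill V, and the subspace U+NU+N^2U being invariant:

    # One chain e3 -> e2 -> e1 -> 0 plus a stray direction e4 that N kills,
    # so N^3 = 0 but N^2 != 0, giving k = 3 on a 4-dimensional space.
    import numpy as np

    N = np.zeros((4, 4))
    N[0, 1] = 1.0  # e2 -> e1
    N[1, 2] = 1.0  # e3 -> e2

    # Watch the kernel swallow the space one dimension at a time.
    for j in range(1, 4):
        ker_dim = 4 - np.linalg.matrix_rank(np.linalg.matrix_power(N, j))
        print(f"dim Ker(N^{j}) = {ker_dim}")  # 2, then 3, then 4

    # U = span(e3) meets Ker(N^2) trivially, so u, Nu, N^2u stay independent.
    u = np.array([0.0, 0.0, 1.0, 0.0])
    S = np.column_stack([u, N @ u, N @ N @ u])
    print(np.linalg.matrix_rank(S))  # 3
    # Invariance: adjoining the images N(S) adds nothing new to the span.
    print(np.linalg.matrix_rank(np.column_stack([S, N @ S])))  # still 3

In this toy case W=\mathrm{span}(e_4) would do in the lemma; the proof below shows how to find such a complement in general.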

We’ll proceed by induction on k. If k=1 then N itself is the zero transformation, and the sum collapses to U alone. Every subspace of V is invariant under N, and so any complement of U will serve as W.

Now, let’s say that k>1, and let U\subseteq V be a subspace so that U\cap\mathrm{Ker}(N^{k-1})=0. Then some of V is taken up by U, and some by the kernel (with no overlap). We can find another subspace U'\subseteq V so that V=U'\oplus U\oplus\mathrm{Ker}(N^{k-1}), and write U''=U'\oplus U.

We can see that NU''\subseteq\mathrm{Ker}(N^{k-1}), since k applications of N are sufficient to kill off every vector. We can also see that NU''\cap\mathrm{Ker}(N^{k-2})=0, because if anything in U'' were killed off by 1+(k-2) applications of N it would be in \mathrm{Ker}(N^{k-1}), contradicting the direct sum above. Now if we restrict to \mathrm{Ker}(N^{k-1}), we’re all set to invoke the inductive hypothesis. In this case, \mathrm{Ker}(N^{k-1}) plays the role of V, NU'' plays the role of U, and the restriction of N (whose (k-1)th power is already zero) plays the role of N, so the induction applies with a strictly smaller exponent. The inductive hypothesis gives us a subspace W'\subseteq\mathrm{Ker}(N^{k-1}) that’s invariant under N and satisfies

\displaystyle\mathrm{Ker}(N^{k-1})=(NU''+N^2U''+...+N^{k-1}U'')\oplus W'
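For concreteness, here is a hedged check of the two facts feeding this inductive step (my own illustration; the two-chain matrix and the choices of U and U' are assumptions made for the example, not the post’s). Take N built from two chains e_3\rightarrow e_2\rightarrow e_1\rightarrow 0 and e_6\rightarrow e_5\rightarrow e_4\rightarrow 0 in \mathbb{R}^6, so k=3; with U=\mathrm{span}(e_3) and U'=\mathrm{span}(e_6) we get U''=\mathrm{span}(e_3,e_6):

    import numpy as np

    # Two 3-step chains: e2 -> e1, e3 -> e2 and e5 -> e4, e6 -> e5.
    N = np.zeros((6, 6))
    for col, row in [(1, 0), (2, 1), (4, 3), (5, 4)]:
        N[row, col] = 1.0

    e = np.eye(6)
    U2 = np.column_stack([e[:, 2], e[:, 5]])  # basis for U'' = span(e3, e6)

    # Fact 1: N(U'') lands inside Ker(N^{k-1}) = Ker(N^2).
    print(np.allclose(np.linalg.matrix_power(N, 2) @ (N @ U2), 0))  # True

    # Fact 2: N(U'') meets Ker(N^{k-2}) = Ker(N) trivially, i.e. N is
    # injective on N(U''), so one more application preserves the rank.
    print(np.linalg.matrix_rank(N @ (N @ U2)))  # 2, the same as dim N(U'')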

We can set

\displaystyle W=W'+U'+NU'+...+N^{k-1}U'

to get a subspace of V that is invariant under N (since W' and the rest of the sum are each separately invariant). This will be the subspace we need, which we must now check. First, note that

\displaystyle\begin{aligned}V&=U'+U+\mathrm{Ker}(N^{k-1})\\&=U'+U+(NU''+...+N^{k-1}U'')+W'\\&=U+NU+...+N^{k-1}U+U'+NU'+...+N^{k-1}U'+W'\\&=(U+NU+...+N^{k-1}U)+W\end{aligned}

but is the sum of W and the rest direct? We need to show that their intersection is trivial.

Take elements u_0 through u_{k-1} of U, elements u'_0 through u'_{k-1} of U', and w'\in W', and suppose that

\displaystyle u_0+Nu_1+...+N^{k-1}u_{k-1}=u'_0+Nu'_1+...+N^{k-1}u'_{k-1}+w'

Applying N^{k-1} to each side of this equation we find N^{k-1}(u_0-u'_0)=0, since everything else gets killed off (including w', which lies in \mathrm{Ker}(N^{k-1})). That is, u_0-u'_0\in\mathrm{Ker}(N^{k-1}), and by the directness of V=U'\oplus U\oplus\mathrm{Ker}(N^{k-1}) this can only happen if u_0=u'_0=0. Then we’re left with

\displaystyle Nu_1+...+N^{k-1}u_{k-1}=Nu'_1+...+N^{k-1}u'_{k-1}+w'

which would imply

\displaystyle N(u_1-u'_1)+...+N^{k-1}(u_{k-1}-u'_{k-1})=w'

The left side lies in NU''+...+N^{k-1}U'', so by the directness of the above decomposition of \mathrm{Ker}(N^{k-1}) both sides must be zero. In particular w'=0 and N(u_1-u'_1)+...+N^{k-1}(u_{k-1}-u'_{k-1})=0. Applying N^{k-2} to this kills every term but the first, leaving N^{k-1}(u_1-u'_1)=0; since u_1-u'_1\in U'' and U''\cap\mathrm{Ker}(N^{k-1})=0, we conclude u_1=u'_1. Repeating with successively lower powers of N gives u_i=u'_i for every i, and since U\cap U'=0 this means everything in sight is zero. So the intersection is trivial and the sum is direct.
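To see the conclusion land in a concrete case (my own sanity check, continuing the illustrative two-chain example from above, where W' happens to be zero), the lemma’s complement works out to W=U'+NU'+N^2U'=\mathrm{span}(e_4,e_5,e_6):

    import numpy as np

    # The same two-chain N on R^6: e3 -> e2 -> e1 -> 0 and e6 -> e5 -> e4 -> 0.
    N = np.zeros((6, 6))
    for col, row in [(1, 0), (2, 1), (4, 3), (5, 4)]:
        N[row, col] = 1.0

    e = np.eye(6)
    u, up = e[:, 2], e[:, 5]                       # u spans U, up spans U'
    A = np.column_stack([u, N @ u, N @ N @ u])     # U + NU + N^2U
    W = np.column_stack([up, N @ up, N @ N @ up])  # the complement W

    # The sum is direct and fills out V: six independent columns in all.
    print(np.linalg.matrix_rank(np.column_stack([A, W])))  # 6
    # Both summands are invariant: N maps each span back into itself.
    print(np.linalg.matrix_rank(np.column_stack([A, N @ A])))  # still 3
    print(np.linalg.matrix_rank(np.column_stack([W, N @ W])))  # still 3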

So there it is. It’s sort of messy, but at the end of the day we can start with a subspace U that doesn’t disappear for as long as possible and use N to march it around until it dies. Then the rest of V can be made up by another invariant subspace. Tomorrow we’ll see what we can do with this.

February 26, 2009 - Posted by | Algebra, Linear Algebra
