The Unapologetic Mathematician

Mathematics for the interested outsider

The Multiplicity of an Eigenvalue

We would like to define the multiplicity of an eigenvalue \lambda of a linear transformation T as the number of linearly independent eigenvectors associated with it. That is, as the dimension of the kernel of T-\lambda1_V. Unfortunately, we saw that when we have repeated eigenvalues, this doesn’t always capture the right notion. In that example, the 1-eigenspace has dimension 1, but the upper-triangular matrix suggests that the eigenvalue 1 should have multiplicity 2.
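Here’s a quick numerical check of that mismatch. This is just a sketch in Python; the matrix below is my own choice of a 2×2 example with a repeated diagonal entry, not necessarily the one from the earlier post:

```python
import numpy as np

# Upper-triangular matrix whose diagonal shows the eigenvalue 1 twice.
T = np.array([[1.0, 1.0],
              [0.0, 1.0]])
d = T.shape[0]

# The 1-eigenspace is Ker(T - 1_V); its dimension is d minus the rank.
dim_eigenspace = d - np.linalg.matrix_rank(T - 1.0 * np.eye(d))
print(dim_eigenspace)  # 1, even though 1 appears twice on the diagonal
```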

Indeed, we saw that if the entries along the diagonal of an upper-triangular matrix are \lambda_1,...,\lambda_d, then the characteristic polynomial is

\displaystyle\prod\limits_{k=1}^d(\lambda-\lambda_k)

Then we can use our definition of multiplicity for roots of polynomials to see that a given value of \lambda has multiplicity equal to the number of times it shows up on the diagonal of an upper-triangular matrix.

It turns out that generalized eigenspaces do capture this notion, and we have a way of calculating them to boot! That is, I’m asserting that the multiplicity of an eigenvalue \lambda is both the number of times that \lambda shows up on the diagonal of any upper-triangular matrix for T, and the number of independent generalized eigenvectors with eigenvalue \lambda — which is \dim\mathrm{Ker}\left((T-\lambda1_V)^{\dim(V)}\right).
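Here’s a small numerical sketch of this assertion, with an upper-triangular matrix of my own choosing:

```python
import numpy as np

# Upper-triangular: 0 appears twice on the diagonal and 2 appears once.
T = np.array([[0.0, 1.0, 5.0],
              [0.0, 0.0, 3.0],
              [0.0, 0.0, 2.0]])
d = T.shape[0]

for lam in (0.0, 2.0):
    # Generalized eigenspace: the kernel of (T - lam*1_V)^dim(V).
    P = np.linalg.matrix_power(T - lam * np.eye(d), d)
    dim_generalized = d - np.linalg.matrix_rank(P)
    diagonal_count = int(np.sum(np.isclose(np.diag(T), lam)))
    print(lam, dim_generalized, diagonal_count)
# Prints 0.0 2 2 and then 2.0 1 1: the two counts agree.
```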

So, let’s fix a vector space V of finite dimension d over an algebraically closed field \mathbb{F}. Pick a linear transformation T:V\rightarrow V and a basis \left\{e_i\right\} with respect to which the matrix of T is upper-triangular. We know such a basis will exist because we’re working over an algebraically closed base field. I’ll prove the assertion for the eigenvalue \lambda=0 — that the number of copies of 0 on the diagonal of the matrix is the dimension of the kernel of T^d — since for any other eigenvalue we just replace T with T-\lambda1_V and do the exact same thing.

We’ll prove this statement by induction on the dimension of V. The base case is easy: if \dim(V)=1, then the kernel of T has dimension 1 if the upper-triangular matrix is \begin{pmatrix}0\end{pmatrix}, and dimension 0 otherwise.

For the inductive step, we’re interested in the subspace spanned by the basis vectors e_1 through e_{d-1}. Let’s call this subspace U. Now we can write out the matrix of T:

\displaystyle\begin{pmatrix}\lambda_1&&*&t_1^d\\&\ddots&&\vdots\\&&\lambda_{d-1}&t_{d-1}^d\\{0}&&&\lambda_d\end{pmatrix}

We can see that every vector in U — linear combinations of e_1 through e_{d-1} — lands back in U. Meanwhile T(e_d)=\lambda_de_d+\bar{u}, where the components of \bar{u}\in U are given in the last column. The fact that U is invariant under the action of T means that we can restrict T to that subspace, getting the transformation T|_U:U\rightarrow U. Its matrix with respect to the obvious basis is

\displaystyle\begin{pmatrix}\lambda_1&&*\\&\ddots&\\{0}&&\lambda_{d-1}\end{pmatrix}

The dimension of U is less than that of V, so we can use our inductive hypothesis to conclude that 0 shows up on the diagonal of this matrix \dim\left(\mathrm{Ker}\left((T|_U)^{d-1}\right)\right) times. But we saw yesterday that the sequence of kernels of powers of T|_U has stabilized by this point (since U has dimension d-1), so this is also \dim\left(\mathrm{Ker}\left((T|_U)^d\right)\right). Now, the last diagonal entry \lambda_d of T is either zero or it isn’t. If \lambda_d\neq0, we want to show that

\displaystyle\dim\left(\mathrm{Ker}(T^d)\right)=\dim\left(\mathrm{Ker}\left((T|_U)^d\right)\right)

On the other hand, if \lambda_d=0, we want to show that

\displaystyle\dim\left(\mathrm{Ker}(T^d)\right)=\dim\left(\mathrm{Ker}\left((T|_U)^d\right)\right)+1
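Before pushing on, the stabilization fact we invoked above (that the kernels of powers of a transformation stop growing by the power equal to the dimension of the space) is easy to check numerically. A sketch, with a nilpotent matrix of my own choosing:

```python
import numpy as np

# A strictly upper-triangular (hence nilpotent) matrix on a 3-dimensional space.
T = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])
d = T.shape[0]

# dim Ker(T^k) can grow with k, but it stops growing by k = dim(V).
for k in range(1, 2 * d + 1):
    Pk = np.linalg.matrix_power(T, k)
    print(k, d - np.linalg.matrix_rank(Pk))
# Prints kernel dimensions 1, 2, 3, 3, 3, 3: constant from k = 3 onward.
```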

The inclusion-exclusion principle tells us that

\displaystyle\begin{aligned}\dim\left(U+\mathrm{Ker}(T^d)\right)&=\dim(U)+\dim\left(\mathrm{Ker}(T^d)\right)-\dim\left(U\cap\mathrm{Ker}(T^d)\right)\\&=(d-1)+\dim\left(\mathrm{Ker}(T^d)\right)-\dim\left(\mathrm{Ker}\left((T|_U)^d\right)\right)\end{aligned}

The left-hand side is the dimension of a subspace of V, so it can be at most d. This makes the difference of kernel dimensions on the right at most one, and it’s at least zero, since we also see that

\displaystyle\mathrm{Ker}\left((T|_U)^d\right)=U\cap\mathrm{Ker}(T^d)\subseteq\mathrm{Ker}(T^d)
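That identity between \mathrm{Ker}\left((T|_U)^d\right) and U\cap\mathrm{Ker}(T^d) is also easy to check numerically. A sketch, with another matrix of my own choosing:

```python
import numpy as np

# Upper-triangular T on a 3-dimensional V; U is spanned by e_1 and e_2.
T = np.array([[0.0, 1.0, 4.0],
              [0.0, 2.0, 5.0],
              [0.0, 0.0, 0.0]])
d = T.shape[0]

P = np.linalg.matrix_power(T, d)               # T^d on all of V
PU = np.linalg.matrix_power(T[:-1, :-1], d)    # (T|_U)^d on U

dim_ker_restricted = (d - 1) - np.linalg.matrix_rank(PU)
# U ∩ Ker(T^d): kernel vectors of T^d whose last coordinate also vanishes,
# i.e. the kernel of T^d with an extra row pinning the e_d-coordinate to zero.
constrained = np.vstack([P, np.eye(d)[-1]])
dim_intersection = d - np.linalg.matrix_rank(constrained)

print(dim_ker_restricted == dim_intersection)  # True
```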

So if \lambda_d\neq0 we need to show that every vector in \mathrm{Ker}(T^d) actually lies in U, so that the difference in dimensions is zero. On the other hand, if \lambda_d=0 we need to find a vector in \mathrm{Ker}(T^d) that’s not in U, so that the difference in dimensions is one.

The first case is easier. Any vector in V but not in U can be written uniquely as u+xe_d for some nonzero scalar x\in\mathbb{F} and some vector u\in U. When we apply the transformation T, we get T(u)+x\bar{u}+x\lambda_de_d. Since \lambda_d\neq0, the coefficient x\lambda_d of e_d is again nonzero. No matter how many times we apply T, the coefficient of e_d stays nonzero, so the result can never be the zero vector. Thus the kernel of T^d is completely contained in U, and we conclude that \mathrm{Ker}(T^d)=\mathrm{Ker}\left((T|_U)^d\right).
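Here’s a numerical sketch of this first case, with a matrix of my own choosing; every vector in the kernel of T^d turns out to have no e_d-component:

```python
import numpy as np

# Upper-triangular with a nonzero entry (lambda_d = 3) in the last diagonal slot.
T = np.array([[0.0, 1.0, 4.0],
              [0.0, 0.0, 2.0],
              [0.0, 0.0, 3.0]])
d = T.shape[0]

# A basis of Ker(T^d) from the SVD: the rows of Vt whose singular values vanish.
P = np.linalg.matrix_power(T, d)
_, s, Vt = np.linalg.svd(P)
null_basis = Vt[np.sum(s > 1e-10):]

# Every kernel vector has last coordinate zero, so Ker(T^d) lies inside U.
print(np.allclose(null_basis[:, -1], 0.0))  # True
```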

In the second case, let’s look for a vector of the form u-e_d. We want to choose u\in U so that T^d(u)=T^d(e_d), which will force T^d(u-e_d)=0. Since \lambda_d=0, the first application of T gives T(e_d)=\bar{u}\in U. Thus

\displaystyle T^d(e_d)=T^{d-1}(\bar{u})\in\mathrm{Im}\left(\left(T|_U\right)^{d-1}\right)

But the dimension of U is d-1, and so by this point the sequence of images of powers of T|_U has stabilized! That is,

\displaystyle T^d(e_d)\in\mathrm{Im}\left(\left(T|_U\right)^{d-1}\right)\subseteq\mathrm{Im}\left(\left(T|_U\right)^d\right)

and so we can find a u\in U so that T^d(u)=T^d(e_d). This gives us a vector u-e_d in the kernel of T^d that doesn’t lie in U, and the inductive step is complete.
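Here’s a numerical sketch of that construction, with a matrix of my own choosing; we solve for u and check that u-e_d lies in the kernel of T^d but not in U:

```python
import numpy as np

# Upper-triangular with lambda_d = 0 in the last diagonal slot.
T = np.array([[2.0, 1.0, 4.0],
              [0.0, 3.0, 5.0],
              [0.0, 0.0, 0.0]])
d = T.shape[0]
e_d = np.array([0.0, 0.0, 1.0])

# Solve T^d u = T^d e_d for u in U, i.e. with the last coordinate fixed at zero.
P = np.linalg.matrix_power(T, d)
u, *_ = np.linalg.lstsq(P[:, :-1], P @ e_d, rcond=None)
v = np.append(u, 0.0) - e_d                  # the vector u - e_d

print(np.allclose(P @ v, 0.0))               # True: v lies in Ker(T^d)
print(v[-1])                                 # -1.0: v does not lie in U
```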

As a final remark, notice that the only place we really used the fact that \mathbb{F} is algebraically closed is when we picked a basis that would make T upper-triangular. Everything still goes through as long as we have an upper-triangular matrix, but a given linear transformation may have no such matrix.

February 19, 2009 Posted by | Algebra, Linear Algebra | 8 Comments

