Determining Eigenvalues
Yesterday, we defined eigenvalues, eigenvectors, and eigenspaces. But we didn’t talk about how to actually find them (though one commenter decided to jump the gun a bit). It turns out that determining the eigenspace for any given eigenvalue is the same sort of problem as determining the kernel.
Let’s say we’ve got a linear endomorphism $T:V\rightarrow V$ and a scalar value $\lambda$. We want to determine the subspace of $V$ consisting of those eigenvectors $v$ satisfying the equation

$T(v)=\lambda v$
First, let’s adjust the right hand side. Instead of thinking of the scalar product of $\lambda$ and $v$, we can write it as the action of the transformation $\lambda1_V$, where $1_V$ is the identity transformation on $V$. That is, we have the equation

$T(v)=\left(\lambda1_V\right)(v)$
Now we can do some juggling to combine these two linear transformations being evaluated at the same vector $v$:

$T(v)-\left(\lambda1_V\right)(v)=\left(T-\lambda1_V\right)(v)=0$
And we find that the $\lambda$-eigenspace of $T$ is the kernel $\mathrm{Ker}\left(T-\lambda1_V\right)$.
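As a quick check on this description, suppose $T$ acts on $\mathbb{R}^2$ by the matrix $\begin{pmatrix}2&1\\0&2\end{pmatrix}$ (just a sample matrix; nothing about this choice is special) and take $\lambda=2$. Then

$T-2\,1_V=\begin{pmatrix}2&1\\0&2\end{pmatrix}-\begin{pmatrix}2&0\\0&2\end{pmatrix}=\begin{pmatrix}0&1\\0&0\end{pmatrix}$

and the kernel of this transformation consists of the vectors whose second component vanishes. So the $2$-eigenspace of $T$ is the line spanned by $\begin{pmatrix}1\\0\end{pmatrix}$.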
Now, as I stated yesterday, most of these eigenspaces will be trivial, just as the kernel may be trivial. The interesting stuff happens when $\mathrm{Ker}\left(T-\lambda1_V\right)$ is nontrivial. In this case, we’ll call $\lambda$ an eigenvalue of the transformation $T$ (thus the eigenvalues of a transformation are those scalars $\lambda$ which correspond to nonzero eigenvectors). So how can we tell whether or not a kernel is trivial? Well, we know that the kernel of an endomorphism is trivial if and only if the endomorphism is invertible. And the determinant provides a test for invertibility!
So we can take the determinant $\det\left(T-\lambda1_V\right)$ and consider it as a function of $\lambda$. If we get the value $0$, then the $\lambda$-eigenspace of $T$ is nontrivial, and $\lambda$ is an eigenvalue of $T$. Then we can use other tools to actually determine the eigenspace if we need to.
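For instance (again with a small sample matrix, chosen only for illustration), if $T$ acts on $\mathbb{R}^2$ by the matrix $\begin{pmatrix}2&1\\1&2\end{pmatrix}$, then

$\det\left(T-\lambda1_V\right)=\det\begin{pmatrix}2-\lambda&1\\1&2-\lambda\end{pmatrix}=(2-\lambda)^2-1=(\lambda-1)(\lambda-3)$

This function of $\lambda$ vanishes exactly at $\lambda=1$ and $\lambda=3$, so those are the eigenvalues of $T$, and computing the kernels $\mathrm{Ker}\left(T-1_V\right)$ and $\mathrm{Ker}\left(T-3\,1_V\right)$ would give the corresponding eigenspaces.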