The Unapologetic Mathematician

Mathematics for the interested outsider

The Characteristic Polynomial of a Real Linear Transformation

We continue working over the field \mathbb{R} of real numbers. Again, let T:V\rightarrow V be a linear transformation from a real vector space V of dimension d to itself. We want to find the characteristic polynomial of this linear transformation.

When we had an algebraically closed field, this was easy. We took an upper-triangular matrix, and then the determinant was just the product down the diagonal. This gave one factor of the form (\lambda-\lambda_j) for each diagonal entry \lambda_j, which established that the diagonal entries of an upper-triangular matrix were exactly the eigenvalues of the linear transformation.
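As a quick numerical sanity check (not part of the original argument; the helper names here are my own), we can verify for a small example that the characteristic polynomial of an upper-triangular matrix is the product of the factors \lambda-\lambda_j read off the diagonal:

```python
def det(m):
    """Determinant by cofactor expansion along the first row (fine for small matrices)."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

def char_poly_at(t, lam):
    """Evaluate det(lam*I - T) at a specific scalar lam."""
    n = len(t)
    return det([[(lam if i == j else 0) - t[i][j] for j in range(n)] for i in range(n)])

# An upper-triangular matrix with diagonal entries 2, 3, 5
T = [[2, 7, 1],
     [0, 3, 4],
     [0, 0, 5]]

# det(lam*I - T) agrees with (lam-2)(lam-3)(lam-5) at every sample point
for lam in [0, 1, 2, 10]:
    assert char_poly_at(T, lam) == (lam - 2) * (lam - 3) * (lam - 5)
```

The off-diagonal entries 7, 1, 4 never show up: any permutation that picks one of them is forced to pick a zero below the diagonal as well.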

Now we don’t always have an upper-triangular matrix, but we can always find a matrix that’s almost upper-triangular. That is, one that looks like

\displaystyle\begin{pmatrix}A_1&&*\\&\ddots&\\0&&A_n\end{pmatrix}
where the blocks A_j are all either 1\times1 matrices

\displaystyle A_j=\begin{pmatrix}\lambda_j\end{pmatrix}

or 2\times2 matrices

\displaystyle A_j=\begin{pmatrix}a_j&b_j\\c_j&d_j\end{pmatrix}

In this latter case, we define \tau_j to be the trace a_j+d_j, and \delta_j to be the determinant a_jd_j-b_jc_j. We insist that \tau_j^2<4\delta_j; otherwise the block has real eigenvalues, and we could find another basis which breaks A_j up into two 1\times1 blocks. Let’s go a step further and insist that all the 2\times2 blocks show up first, followed by all the 1\times1 blocks.
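Here’s a small illustrative check of that condition (again my own sketch, not from the post itself): the discriminant \tau_j^2-4\delta_j tells us whether a 2\times2 block genuinely needs to stay together.

```python
def is_irreducible_block(a, b, c, d):
    """A 2x2 block must stay intact exactly when tau^2 < 4*delta,
    i.e. when its characteristic polynomial has no real roots."""
    tau = a + d            # trace
    delta = a * d - b * c  # determinant
    return tau * tau < 4 * delta

# Rotation by 90 degrees: tau = 0, delta = 1, so 0 < 4 -- no real eigenvalues
assert is_irreducible_block(0, -1, 1, 0)

# A matrix with real eigenvalues 1 and 2: tau = 3, delta = 2, and 9 >= 8,
# so it splits into two 1x1 blocks in a suitable basis
assert not is_irreducible_block(1, 0, 0, 2)
```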

Now we can start calculating the determinant of \lambda I_V-T, summing over permutations. Just like we saw with an upper-triangular matrix, if we have a 1\times1 block in the lower-right we have to choose the rightmost entry in the bottom row, or the whole term will be zero. So we start racking up factors (\lambda-\lambda_j) just like before. Each 1\times1 block, then, gives us a root of the characteristic polynomial, which is an eigenvalue. So far everything is the same as in the upper-triangular case.

Once we get to the 2\times2 blocks we have to be a bit more careful. We have two choices of a nonzero entry in the lowest row: \lambda-d_j or -c_j. But if we choose \lambda-d_j then we can only choose \lambda-a_j on the next row up to have a chance of a nonzero term. On the other hand, if we choose -c_j on the lowest row we are forced to choose -b_j next. The choice between these two is independent of any other choices we might make in calculating the determinant. The first always gives a factor of (\lambda-a_j)(\lambda-d_j) to the term corresponding to that permutation, while the second always gives a factor of (-b_j)(-c_j) to its term. These permutations (no matter what other choices we might make) differ by exactly one swap, and so they enter the determinant with opposite signs.

Now we can collect together all the permutations where we make one choice in block A_j, and all the permutations where we make the other choice. From the first collection we can factor out (\lambda-a_j)(\lambda-d_j), and from the second we can factor out (-b_j)(-c_j). What remains after we pull these factors out is the same in either case, so the upshot is that the 2\times2 block A_j contributes a factor of (\lambda-a_j)(\lambda-d_j)-(-b_j)(-c_j) to the determinant. Some calculation simplifies this:

\displaystyle(\lambda-a_j)(\lambda-d_j)-(-b_j)(-c_j)=\lambda^2-(a_j+d_j)\lambda+(a_jd_j-b_jc_j)=\lambda^2-\tau_j\lambda+\delta_j
which is a quadratic factor with no real roots (since we assumed that \tau_j^2<4\delta_j).
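We can spot-check this simplification numerically (my own sketch; the function names are mine). The factor read directly off the block agrees with \lambda^2-\tau_j\lambda+\delta_j at every sample point, and the negative discriminant confirms there are no real roots:

```python
def quadratic_factor(a, b, c, d, lam):
    """Evaluate lam^2 - tau*lam + delta for the 2x2 block [[a, b], [c, d]]."""
    tau, delta = a + d, a * d - b * c
    return lam * lam - tau * lam + delta

def block_factor(a, b, c, d, lam):
    """Evaluate (lam - a)(lam - d) - (-b)(-c), the factor read off the block."""
    return (lam - a) * (lam - d) - b * c

# An irreducible block: tau = 4, delta = 9, so tau^2 - 4*delta = -20 < 0
a, b, c, d = 2, -5, 1, 2
assert (a + d) ** 2 < 4 * (a * d - b * c)

# The two expressions agree identically
for lam in [-2, 0, 1, 3]:
    assert quadratic_factor(a, b, c, d, lam) == block_factor(a, b, c, d, lam)
```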

But a factor of the characteristic polynomial of this form is exactly what we defined to be an eigenpair. That is, just as eigenvalues (roots of the characteristic polynomial) correspond to one-dimensional invariant subspaces, so too do eigenpairs (irreducible quadratic factors of the characteristic polynomial) correspond to two-dimensional invariant subspaces. The 2\times2 blocks that show up along the diagonal of the almost upper-triangular matrix give rise to the eigenpairs of T.
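Putting it all together, here is one last numerical sketch (my own construction, under the conventions above): an almost upper-triangular matrix with one 2\times2 rotation block and one 1\times1 block, whose characteristic polynomial factors into the corresponding quadratic and linear pieces.

```python
def determinant(m):
    """Determinant by cofactor expansion along the first row."""
    n = len(m)
    if n == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * determinant([r[:j] + r[j + 1:] for r in m[1:]])
               for j in range(n))

def charpoly_at(t, lam):
    """Evaluate det(lam*I - T) at the scalar lam."""
    n = len(t)
    return determinant([[(lam if i == j else 0) - t[i][j] for j in range(n)]
                        for i in range(n)])

# A 2x2 rotation block (eigenpair, factor lam^2 + 1) followed by
# a 1x1 block (eigenvalue 3, factor lam - 3); the upper-right 5, 2
# are arbitrary entries above the block diagonal.
M = [[0, -1, 5],
     [1,  0, 2],
     [0,  0, 3]]

# The characteristic polynomial is (lam^2 + 1)(lam - 3)
for lam in [-1, 0, 2, 4]:
    assert charpoly_at(M, lam) == (lam * lam + 1) * (lam - 3)
```

The entries above the block diagonal drop out of the determinant entirely, just as the permutation argument predicts.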


April 2, 2009 - Posted by | Algebra, Linear Algebra


  1. Hi John.

    Given your approach to the determinant, did you think about the following result:
    If $U\leq V$ is $T$-invariant, and $\overline T$ is the induced map on $V/U$, then $\det T =(\det T|_U)(\det\overline T)$. This is easy from you definition if you choose a basis for $U$ and then extend. It also immediately implies that if you have any upper-triangular block matrix, then the determinant is the product of the determinants of the blocks on the diagonal. I feel that would simplify the cases of upper-diagonal and your calculations above, and is more in-keeping with the basis-free ideology.

    Just a thought,

    Comment by Andrew Hubery | April 3, 2009 | Reply

  2. Sure, that’s essentially what I’m doing. But I’m trying to keep an eye on the matrix as well, if only because readers may be familiar with the usual statement of Jordan normal form (and analogous results) in terms of matrices.

    Comment by John Armstrong | April 3, 2009 | Reply

  3. [...] all of that handled we turn to calculate the characteristic polynomial of , only to find that it’s the product of the characteristic polynomials of all the blocks . [...]

    Pingback by The Multiplicity of an Eigenpair « The Unapologetic Mathematician | April 8, 2009 | Reply

  4. I don’t find it, help me!: Gaines FJ, Thompson RC. Sets of nearly triangular matrices. Duke Math J. 1968;35(3):441–54

    Comment by triphuong | September 19, 2013 | Reply
