## Real Invariant Subspaces

Okay, there’s been a bit of a gap while things have gotten hectic around here, but let’s try and get back to work.

When we worked over an algebraically closed field, every polynomial had a root. Applying this to the characteristic polynomial of a linear transformation, we found that it must have a root, which would by definition be an eigenvalue of the transformation. There would then be an eigenvector, which gave a one-dimensional invariant subspace. But when we look at real vector spaces we might not have any one-dimensional invariant subspaces. However, if we don't, we will be sure to have a *two*-dimensional invariant subspace.

So let’s start with a linear transformation $T$ from a real vector space of dimension $n$ to itself. Pick any vector $v$ and construct the sequence of images $Tv$, $T^2v$, and so on up to $T^nv$. Together with the original vector, these $n+1$ vectors cannot be linearly independent, since there are more than $n$ of them. Thus we have a linear relation

$$c_0v+c_1Tv+c_2T^2v+\dots+c_nT^nv=0$$

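The dependence argument above can be sketched numerically. In this illustration (my own, not from the post), $T$ is a rotation of the plane by a right angle, $v$ is an arbitrary vector, and a null-space vector of the stacked images recovers the coefficients of the linear relation:

```python
import numpy as np

# T rotates the plane by 90 degrees and so has no real eigenvalues;
# v is an arbitrary starting vector.  (Example matrix and vector are
# my own choices for illustration.)
T = np.array([[0.0, -1.0],
              [1.0,  0.0]])
v = np.array([1.0, 2.0])
n = T.shape[0]

# Stack v, Tv, ..., T^n v as columns.  That is n+1 = 3 vectors in a
# 2-dimensional space, so the columns must be linearly dependent.
K = np.column_stack([np.linalg.matrix_power(T, j) @ v for j in range(n + 1)])

# A null-space vector of K gives coefficients c_0, ..., c_n with
# c_0 v + c_1 Tv + ... + c_n T^n v = 0.
_, _, Vt = np.linalg.svd(K)
c = Vt[-1]

print(np.allclose(K @ c, 0))  # the relation really does hold
```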
We can regard this as a polynomial in $T$ applied to $v$, and the polynomial has real coefficients. We can factor it into linear and irreducible quadratic factors to write

$$c(T-\lambda_1)\dots(T-\lambda_m)(T^2+a_1T+b_1)\dots(T^2+a_kT+b_k)v=0$$

Note that either $m$ or $k$ could be zero, in which case there are no factors of that form. Also, note that we have no reason to believe that this linear combination has anything to do with the characteristic polynomial, so this factorization is *not* necessarily giving us eigenvalues or eigenpairs.
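This real factorization is easy to carry out numerically: each real root gives a linear factor, and each conjugate pair of complex roots combines into a quadratic factor with real coefficients. A small sketch (the polynomial and variable names are my own illustration):

```python
import numpy as np

# p(x) = x^3 - x^2 + x - 1 = (x - 1)(x^2 + 1), an example polynomial.
coeffs = [1.0, -1.0, 1.0, -1.0]
roots = np.roots(coeffs)

linear = []      # real roots lambda, giving factors (x - lambda)
quadratic = []   # pairs (a, b), giving factors (x^2 + a x + b)
for r in roots:
    if abs(r.imag) < 1e-8:
        linear.append(r.real)
    elif r.imag > 0:  # count each conjugate pair once
        # (x - r)(x - conj(r)) = x^2 - 2 Re(r) x + |r|^2
        quadratic.append((-2.0 * r.real, abs(r) ** 2))

print(linear)     # one real root near 1.0
print(quadratic)  # one quadratic factor, near x^2 + 1
```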

All that we can conclude is that at least one of these factors is not injective. If it’s a linear factor $(T-\lambda_j)$, then all the factors after that point act on $v$ to give a nonzero vector $w$ satisfying

$$(T-\lambda_j)w=0$$

This gives a one-dimensional invariant subspace: the span of $w$. On the other hand, if one of the quadratic factors $(T^2+a_jT+b_j)$ is not injective, then all the factors after that point act on $v$ to give a nonzero vector $w$ satisfying

$$(T^2+a_jT+b_j)w=0$$

which we can rewrite as $T(Tw)=-a_jTw-b_jw$. This shows that the vectors $w$ and $Tw$ span a two-dimensional invariant subspace, since each basis vector is sent to a linear combination of the two under the action of $T$.
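The whole procedure can be run end to end on a concrete example (my own construction, not from the post): let $T$ act on $\mathbb{R}^4$ as two independent plane rotations, so it has no real eigenvalues at all. Applying one block's quadratic factor to $v$ produces the vector $w$, and the plane spanned by $w$ and $Tw$ is invariant:

```python
import numpy as np

def rot(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

# T is block diagonal: rotation by 1 radian on the first plane and by
# 2 radians on the second, so T has no real eigenvalues.
T = np.zeros((4, 4))
T[:2, :2] = rot(1.0)
T[2:, 2:] = rot(2.0)
v = np.array([1.0, 0.0, 1.0, 0.0])
I = np.eye(4)

# Rotation by angle t satisfies x^2 - 2 cos(t) x + 1 = 0, so the factor
# belonging to the second block is q(T) = T^2 - 2 cos(2.0) T + I.
# Applying it to v kills the second block's component, leaving w in the
# first plane.
q = T @ T - 2.0 * np.cos(2.0) * T + I
w = q @ v

# Check that T(Tw) is a linear combination of w and Tw, so the plane
# span{w, Tw} is a two-dimensional invariant subspace.
Tw = T @ w
TTw = T @ Tw
B = np.column_stack([w, Tw])
coeffs, *_ = np.linalg.lstsq(B, TTw, rcond=None)
print(np.allclose(B @ coeffs, TTw))  # T(Tw) stays in span{w, Tw}
```

Here the fit is exact because $w$ lies in the first plane, where $T$ satisfies $T^2-2\cos(1)T+I=0$, so $T(Tw)=2\cos(1)\,Tw-w$.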

Thus we can always find an invariant subspace of dimension one or two. It’s not quite as neat as over the complex numbers, but it’s something we can work with.

Can you prove this result by using the structure theorem for finitely generated modules over principal ideal domains? V is a module over R[X], so V= R[X]/p_1 + … R[X]/p_r (direct sum with the p_i prime powers) and each summand is invariant. But the dimension of the summands might be greater than two if the p_i are proper powers. How does one resolve that?

Comment by Zygmund | March 31, 2009 |

Zygmund, what you’re moving towards is a result I’m also moving towards, but in a more explicit and accessible way. I’d suggest going back and comparing what you’re saying to the decomposition I gave of a complex vector space, and why we needed generalized eigenvectors.

Comment by John Armstrong | March 31, 2009 |

[…] able to find an eigenvector — a one-dimensional invariant subspace — but we know that we can find either a one-dimensional or a two-dimensional invariant subspace . Just like before we get an […]

Pingback by Almost Upper-Triangular Matrices « The Unapologetic Mathematician | April 1, 2009 |

[…] Mathematics In my linear algebra class, we talked about eigenvectors and eigenspaces. This blog post is about a cousin of eigenvectors: invariant subspaces. […]

Pingback by 2 April 2009 « blueollie | April 2, 2009 |

how can u tell that

u, Tu

span a two dimensional subspace? I mean, why cant they be linearly dependent?

Comment by Rafael | March 10, 2010 |

Because, Rafael, if they were linearly dependent then we’d have $Tu=\lambda u$ for some $\lambda$, and be in the first case, not the second.

Comment by John Armstrong | March 10, 2010 |