The Unapologetic Mathematician

Mathematics for the interested outsider

Diagonal Matrices

Even better than upper-triangular matrices are diagonal matrices. These look like

\displaystyle\begin{pmatrix}\lambda_1&&{0}\\&\ddots&\\{0}&&\lambda_d\end{pmatrix}

Each basis vector is an eigenvector, with the eigenvalues listed down the diagonal. It’s straightforward to show that the sum and product of two diagonal matrices are themselves diagonal. Thus, diagonal matrices form a further subalgebra inside the algebra of upper-triangular matrices. This algebra is just a direct sum of copies of \mathbb{F}, with multiplication defined component-by-component.
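To make the component-by-component multiplication concrete, here is a minimal sketch in plain Python (the helper names are mine, not the post's): a diagonal matrix is stored as just the list of its diagonal entries, and sums and products are computed entrywise, exactly as in a direct sum of copies of \mathbb{F}.

```python
def diag_add(a, b):
    """Sum of two diagonal matrices, each given as its list of diagonal entries."""
    return [x + y for x, y in zip(a, b)]

def diag_mul(a, b):
    """Product of two diagonal matrices is again diagonal: entries multiply componentwise."""
    return [x * y for x, y in zip(a, b)]

A = [2, 3, 5]  # represents diag(2, 3, 5)
B = [1, 4, 6]  # represents diag(1, 4, 6)

print(diag_add(A, B))  # [3, 7, 11]
print(diag_mul(A, B))  # [2, 12, 30]
```

Note that nothing here ever touches the off-diagonal zeros; the whole algebra lives in the d diagonal slots.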

Diagonal matrices are especially nice because it’s really easy to see how they act on vectors. Given a diagonal matrix D, break a vector v into its components: v=\sum v^ie_i. Then multiply each component by the corresponding eigenvalue: D(v)=\sum(\lambda_iv^i)e_i. And you’re done! Composing a diagonal matrix with another matrix is also easy. To find DT, just multiply each row of T by the corresponding eigenvalue. To find TD, multiply each column of T by the corresponding eigenvalue.
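These three operations can be sketched in a few lines of plain Python (function names are mine, chosen for illustration). A diagonal matrix is again just its list of eigenvalues; a general matrix is a list of rows.

```python
def apply_diag(lambdas, v):
    """D(v): multiply each component v^i by its eigenvalue lambda_i."""
    return [l * x for l, x in zip(lambdas, v)]

def diag_times(lambdas, T):
    """DT: scale the i-th row of T by lambda_i."""
    return [[l * t for t in row] for l, row in zip(lambdas, T)]

def times_diag(T, lambdas):
    """TD: scale the j-th column of T by lambda_j."""
    return [[l * t for l, t in zip(lambdas, row)] for row in T]

lambdas = [2, 3]          # D = diag(2, 3)
v = [5, 7]
T = [[1, 1], [1, 1]]

print(apply_diag(lambdas, v))  # [10, 21]
print(diag_times(lambdas, T))  # [[2, 2], [3, 3]]  -- rows scaled
print(times_diag(T, lambdas))  # [[2, 3], [2, 3]]  -- columns scaled
```

Applying D costs only d multiplications, versus d^2 for a general matrix, which is one reason diagonal form is so desirable.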

So, if we can find a basis for our vector space consisting only of eigenvectors for the transformation T, then with respect to that basis the matrix of T is diagonal. This is as good as we can hope for, and a lot of linear algebra comes down to determining when we can do this.
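Here is a small worked instance of that idea (the matrix is my own example, not from the post): T = [[2, 1], [1, 2]] has eigenvectors (1, 1) and (1, -1) with eigenvalues 3 and 1, so changing to that basis makes its matrix diagonal.

```python
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

T = [[2, 1], [1, 2]]
P = [[1, 1], [1, -1]]               # columns are the eigenvectors (1,1) and (1,-1)
P_inv = [[0.5, 0.5], [0.5, -0.5]]   # inverse of the change-of-basis matrix

# The matrix of T in the eigenvector basis: P^{-1} T P
D = matmul(P_inv, matmul(T, P))
print(D)  # [[3.0, 0.0], [0.0, 1.0]] -- the eigenvalues, down the diagonal
```

When no such basis of eigenvectors exists, the best we can do in general is the upper-triangular form discussed earlier.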

February 9, 2009 Posted by | Algebra, Linear Algebra | 4 Comments
