Even better than upper-triangular matrices are diagonal matrices. These ones look like

$$\begin{pmatrix}\lambda_1&0&\cdots&0\\0&\lambda_2&\cdots&0\\\vdots&\vdots&\ddots&\vdots\\0&0&\cdots&\lambda_n\end{pmatrix}$$
Each basis vector is an eigenvector, with the eigenvalues $\lambda_i$ listed down the diagonal. It’s straightforward to show that the sum and product of two diagonal matrices are themselves diagonal. Thus, diagonal matrices form a further subalgebra inside the algebra of upper-triangular matrices. This algebra is just a direct sum of $n$ copies of the base field $\mathbb{F}$, with multiplication defined component-by-component.
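As a quick numerical sanity check of the closure claim, here is a minimal NumPy sketch (the particular diagonal entries are made up for illustration): the sum and product of two diagonal matrices stay diagonal, and their diagonals simply add and multiply componentwise.

```python
import numpy as np

# Two diagonal matrices, built from their diagonal entries.
a = np.diag([1.0, 2.0, 3.0])
b = np.diag([4.0, 5.0, 6.0])

s = a + b   # sum
p = a @ b   # matrix product

# Both results have zero off-diagonal entries...
assert np.allclose(s, np.diag(np.diag(s)))
assert np.allclose(p, np.diag(np.diag(p)))

# ...and the diagonals combine component-by-component.
assert np.allclose(np.diag(s), [5.0, 7.0, 9.0])
assert np.allclose(np.diag(p), [4.0, 10.0, 18.0])
```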
Diagonal matrices are especially nice because it’s really easy to see how they act on vectors. Given a diagonal matrix $D$, break a vector $v$ into its components $v_i$. Multiply each component by the corresponding eigenvalue: $(Dv)_i = \lambda_i v_i$. And you’re done! Composing a diagonal matrix with another matrix $M$ is also easy. To find $DM$, just multiply each row of $M$ by the corresponding eigenvalue. To find $MD$, multiply each column of $M$ by the corresponding eigenvalue.
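These actions can be verified directly in NumPy (the particular eigenvalues and matrix here are arbitrary examples): applying $D$ to a vector scales each component, $DM$ scales the rows of $M$, and $MD$ scales the columns.

```python
import numpy as np

lam = np.array([2.0, 3.0, 5.0])   # eigenvalues down the diagonal
D = np.diag(lam)
v = np.array([1.0, -1.0, 4.0])
M = np.arange(9.0).reshape(3, 3)

# D acting on a vector scales each component by its eigenvalue.
assert np.allclose(D @ v, lam * v)

# DM scales each *row* of M by the corresponding eigenvalue...
assert np.allclose(D @ M, lam[:, None] * M)

# ...while MD scales each *column*.
assert np.allclose(M @ D, lam[None, :] * M)
```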
So, if we can find a basis for our vector space consisting only of eigenvectors for the transformation $T$, then with respect to that basis the matrix of $T$ is diagonal. This is as good as we can hope for, and a lot of linear algebra comes down to determining when we can do this.
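For a concrete instance of this, here is a short sketch (the matrix is a made-up example; a real symmetric matrix is used since it is guaranteed to have a basis of eigenvectors): collecting the eigenvectors as the columns of a change-of-basis matrix $P$, the matrix $P^{-1}AP$ of the transformation in the new basis is diagonal, with the eigenvalues down the diagonal.

```python
import numpy as np

# A symmetric matrix, so an eigenvector basis is guaranteed to exist.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh returns eigenvalues w and eigenvectors as the columns of P.
w, P = np.linalg.eigh(A)

# In the eigenvector basis, the matrix of the transformation is diagonal.
D = np.linalg.inv(P) @ A @ P
assert np.allclose(D, np.diag(w))
```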