When will a linear transformation T : V → V have a diagonal matrix? We know that this happens exactly when we can find a basis of our vector space V so that each basis vector is an eigenvector of T. But when does this happen? The full answer can be complicated, but we can describe one sufficient condition right away.
First, remember that we can always compute the characteristic polynomial of T. This will be a polynomial of degree n — the dimension of V. Further, we know that a field element λ is an eigenvalue of T if and only if λ is a root of the characteristic polynomial. Since the degree of the polynomial is n, we can expect to find no more than n distinct roots of the polynomial — no more than n distinct eigenvalues of T. I want to consider what happens in the generic case, where T has exactly n distinct eigenvalues.
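As a quick numerical illustration (a NumPy sketch; the particular 3×3 matrix is an arbitrary choice, not from the discussion above), we can compute the coefficients of the characteristic polynomial and check that its roots agree with the eigenvalues:

```python
import numpy as np

# An example 3x3 matrix; being upper triangular, its eigenvalues
# are visibly the diagonal entries 2, 3, and 5 -- all distinct.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])

# Coefficients of the characteristic polynomial, highest degree first.
# Its degree equals dim V = 3.
coeffs = np.poly(A)
print(coeffs)  # (t-2)(t-3)(t-5) = t^3 - 10t^2 + 31t - 30

# The roots of the characteristic polynomial are exactly the eigenvalues.
roots = np.sort(np.roots(coeffs))
eigs = np.sort(np.linalg.eigvals(A))
print(np.allclose(roots, eigs))
```

Here the degree-3 polynomial has three distinct roots, so we are in the generic case described above.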
Now we have n distinct field elements λ₁, …, λₙ, and for each one we can pick a vector vᵢ so that Tvᵢ = λᵢvᵢ. I say that v₁, …, vₙ form a basis of V. If we can show that they are linearly independent, then these n vectors span a subspace of V of dimension n, which must be all of V. And we know, by an earlier lemma, that a collection of eigenvectors corresponding to distinct eigenvalues must be linearly independent! Thus the vᵢ form a basis for V consisting of eigenvectors of T. With respect to this basis, the matrix of T is diagonal, with λ₁, …, λₙ on the diagonal.
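We can watch this change of basis happen numerically (a NumPy sketch; the symmetric 2×2 matrix is just a convenient example with two distinct eigenvalues). Collecting the eigenvectors as the columns of a matrix P, conjugating by P expresses T in the eigenvector basis:

```python
import numpy as np

# Example matrix with two distinct eigenvalues, 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lams, P = np.linalg.eig(A)    # columns of P are the eigenvectors v_i
D = np.linalg.inv(P) @ A @ P  # the matrix of T in the eigenvector basis

# D is diagonal, with the eigenvalues on the diagonal.
print(np.round(D, 10))
```

The off-diagonal entries of D vanish (up to floating-point roundoff) precisely because each column of P is an eigenvector.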
This is good, but notice that there are still plenty of things that can go wrong. It's entirely possible that two (or more!) of the eigenvalues coincide. Worse, we could be working over a field that isn't algebraically closed, so the characteristic polynomial may have no roots at all, even counting multiplicity. But still, in the generic case we've got a diagonal matrix with respect to a well-chosen basis.
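Both failure modes are easy to exhibit numerically (again a NumPy sketch; the shear and rotation matrices are standard textbook examples, chosen here for illustration):

```python
import numpy as np

# Repeated eigenvalue: the shear [[1,1],[0,1]] has characteristic
# polynomial (t-1)^2, so eigenvalue 1 with multiplicity two -- but its
# eigenspace is only one-dimensional, so no eigenvector basis exists.
S = np.array([[1.0, 1.0],
              [0.0, 1.0]])
lams, P = np.linalg.eig(S)
print(np.linalg.matrix_rank(P))  # 1: the "two" eigenvectors are parallel

# No roots in the field: a 90-degree rotation of R^2 has characteristic
# polynomial t^2 + 1, which has no real roots at all.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])
print(np.linalg.eigvals(R))  # only complex eigenvalues, +i and -i
```

In the first case the eigenvalues are not distinct and the eigenvectors fail to span; in the second, working over the reals, there are no eigenvalues to begin with.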