Let’s now consider a single inner-product space $V$ and a linear transformation $T:V\rightarrow V$. Its adjoint is another linear transformation $T^*:V\rightarrow V$. This opens up the possibility that $T^*$ might be the same transformation as $T$. If this happens, we say that $T$ is “self-adjoint”. It then satisfies the adjoint relation

$$\langle T(v),w\rangle=\langle v,T(w)\rangle$$
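As a quick numerical sanity check of the relation $\langle T(v),w\rangle=\langle v,T(w)\rangle$, here is a minimal sketch using NumPy (the matrix and vectors are made up for illustration; `np.vdot` implements the complex inner product, conjugating its first slot):

```python
import numpy as np

# A sample self-adjoint matrix: it equals its own conjugate transpose.
T = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])

rng = np.random.default_rng(0)
v = rng.standard_normal(2) + 1j * rng.standard_normal(2)
w = rng.standard_normal(2) + 1j * rng.standard_normal(2)

# np.vdot conjugates its first argument, matching <v, w> = v* w.
lhs = np.vdot(T @ v, w)   # <T(v), w>
rhs = np.vdot(v, T @ w)   # <v, T(w)>
print(np.isclose(lhs, rhs))  # True, since T is self-adjoint
```

For a non-self-adjoint matrix the two sides would generally disagree.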
What does this look like in terms of matrices? Since we only have one vector space we only need to pick one orthonormal basis $\{e_i\}$. Then we get a matrix with entries

$$t_{ij}=\langle e_i,T(e_j)\rangle$$

Self-adjointness then tells us that

$$t_{ij}=\langle e_i,T(e_j)\rangle=\langle T(e_i),e_j\rangle=\overline{\langle e_j,T(e_i)\rangle}=\overline{t_{ji}}$$
That is, the matrix of a self-adjoint transformation is its own conjugate transpose. We have a special name for this sort of matrix — “Hermitian” — even though it’s exactly equivalent to self-adjointness as a linear transformation. If we’re just working over a real vector space we don’t have to bother with conjugation. In that case we just say that the matrix is symmetric.
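This condition is easy to test mechanically. A minimal sketch in NumPy (the matrices here are invented examples, not anything from above): the Hermitian condition is $t_{ij}=\overline{t_{ji}}$, i.e. the matrix equals its conjugate transpose, and over the reals this reduces to plain symmetry.

```python
import numpy as np

# A Hermitian matrix: each entry equals the conjugate of its mirror entry.
H = np.array([[1.0, 2 + 1j],
              [2 - 1j, 5.0]])
print(np.allclose(H, H.conj().T))  # True

# Over a real vector space, conjugation does nothing, so the
# condition reduces to ordinary symmetry.
S = np.array([[1.0, 4.0],
              [4.0, 2.0]])
print(np.allclose(S, S.T))  # True
```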
Over a one-dimensional complex vector space, the matrix of a linear transformation is simply a single complex number $z$. If the transformation is to be self-adjoint, we must have $z=\bar{z}$, and so $z$ must be a real number. In this sense, the operation of taking the conjugate transpose of a complex matrix (or the simple transpose of a real matrix) extends the idea of conjugating a complex number. Self-adjoint matrices, then, are analogous to real numbers.
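The one-dimensional case can be sketched directly (a small illustration, not from the original post): a $1\times 1$ complex matrix is its own conjugate transpose exactly when its single entry is real.

```python
import numpy as np

# Test the 1x1 case for one non-real and one real entry.
results = {}
for entry in (3 + 2j, 3 + 0j):
    M = np.array([[entry]])
    results[entry] = bool(np.allclose(M, M.conj().T))
print(results)  # only the real entry gives a self-adjoint matrix
```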