An important special case of a linear system is a set of homogeneous equations. All this means (in this case) is that the right-hand side of each of the equations is zero.
In matrix notation (using the summation convention), we have the equation $a^i_jx^j=0$. Remember that this is actually a collection of equations, one for each value of the index $i$. And in our more abstract notation we write $Tx=0$, where the right-hand side is the zero vector in $W$.
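For concreteness, here is a small system of this shape (an illustrative example, not one fixed by the discussion above): two equations ($i=1,2$) in three unknowns ($j=1,2,3$), each with zero on the right-hand side.

```latex
\begin{aligned}
x^1 + 2x^2 - \phantom{4}x^3 &= 0 \\
3x^1 - \phantom{2}x^2 + 4x^3 &= 0
\end{aligned}
```

Here the coefficients $a^i_j$ are read off directly: $a^1_1=1$, $a^1_2=2$, $a^1_3=-1$, and so on for the second row.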
So what is a solution of this system? It’s a vector $x\in V$ that gets sent to $0$ by the linear transformation $T$. But a vector that gets sent to the zero vector is exactly one in the kernel $\mathrm{Ker}(T)$. So solving the homogeneous system is equivalent to determining the kernel of the linear transformation $T$.
We don’t yet have any tools for making this determination, but we can say some things about the set of solutions. For one thing, they form a subspace of $V$. That is, the sum of any two solutions is again a solution, and a constant multiple of any solution is again a solution. We’re interested, then, in finding linearly independent solutions, because from them we can construct more solutions without redundancy.
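The closure properties are easy to check numerically. The sketch below uses an illustrative matrix and two solutions found by hand (both the matrix and the vectors are my own examples, not taken from the text), and verifies that sums and scalar multiples of solutions still solve the system.

```python
# Hypothetical 2x3 homogeneous system A x = 0, chosen for illustration.
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6]])  # second row is twice the first, so rank 1

# Two particular solutions found by hand: A * v = 0 for each.
v1 = Matrix([2, -1, 0])
v2 = Matrix([3, 0, -1])
assert A * v1 == Matrix([0, 0])
assert A * v2 == Matrix([0, 0])

# Closure under addition and under scalar multiplication:
assert A * (v1 + v2) == Matrix([0, 0])   # sum of solutions is a solution
assert A * (5 * v1) == Matrix([0, 0])    # scalar multiple is a solution
```

This is just the linearity of $T$ at work: $T(v_1+v_2)=Tv_1+Tv_2=0$ and $T(cv_1)=cTv_1=0$.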
A maximal collection of linearly independent solutions will be a basis for the subspace of solutions, that is, for the kernel of the linear map. As such, the number of solutions in any maximal collection will be the dimension of this subspace, which we called the nullity of the linear transformation $T$. The rank-nullity theorem then tells us that we have a relationship between the number of independent solutions to the system (the nullity), the number of variables in the system (the dimension of $V$), and the rank of $T$, which we will also call the rank of the system: $\mathrm{rank}(T)+\mathrm{nullity}(T)=\dim(V)$. Thus if we can learn ways to find the rank of the system then we can determine the number of independent solutions.
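The rank-nullity relationship can be seen concretely. In this sketch (the matrix is an assumed example of mine), SymPy's `nullspace` returns exactly a maximal linearly independent set of solutions, and its size plus the rank accounts for every variable.

```python
# Illustrative 3x4 homogeneous system: 3 equations, 4 variables.
from sympy import Matrix

A = Matrix([[1, 0, 2, 1],
            [0, 1, 1, 1],
            [1, 1, 3, 2]])  # third row = first row + second row

rank = A.rank()
kernel_basis = A.nullspace()   # a basis for the solution subspace
nullity = len(kernel_basis)

# rank + nullity = number of variables (columns of A)
assert rank + nullity == A.cols
assert rank == 2 and nullity == 2
```

So here the system has rank $2$, and any maximal independent collection of solutions has $4-2=2$ vectors in it.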