Since the components of the differential are given by partial derivatives, and partial derivatives (like all single-variable derivatives) are linear, it’s straightforward to see that the differential operator is linear as well. That is, if $f$ and $g$ are two functions, both of which are differentiable at a point $x$, and $a$ and $b$ are real constants, then the linear combination $af+bg$ is also differentiable at $x$, and the differential is given by

$\displaystyle d(af+bg)(x)=a\,df(x)+b\,dg(x)$
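Linearity is easy to check numerically. Here is a minimal sketch using central finite differences; the particular functions, constants, and point are made-up illustrations, and `num_grad` is a helper defined here, not anything from the text:

```python
import numpy as np

def num_grad(func, x, h=1e-6):
    """Approximate the gradient of a real-valued func at x by central differences."""
    x = np.asarray(x, dtype=float)
    grad = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        grad[i] = (func(x + e) - func(x - e)) / (2 * h)
    return grad

# Arbitrary differentiable functions and an arbitrary point
f = lambda x: x[0] ** 2 + np.sin(x[1])
g = lambda x: x[0] * x[1]
a, b = 2.0, -3.0
x = np.array([1.0, 0.5])

lhs = num_grad(lambda x: a * f(x) + b * g(x), x)   # d(af + bg) at x
rhs = a * num_grad(f, x) + b * num_grad(g, x)      # a df(x) + b dg(x)
assert np.allclose(lhs, rhs, atol=1e-6)
```

The two gradients agree up to finite-difference error, as the linearity formula predicts.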
There’s not usually a product defined for function values in $\mathbb{R}^m$, so there’s not usually any analogue of the product rule, and certainly none of the quotient rule; we can ignore those for now.
But we do have a higher-dimensional analogue of the chain rule. If we have a function $f$ defined on some open region $S\subseteq\mathbb{R}^n$ and taking values in $\mathbb{R}^m$, and another function $g$ defined on a region of $\mathbb{R}^m$ that contains the image $f(S)$ and taking values in $\mathbb{R}^p$, then we can compose them to get a single function $h=g\circ f$ defined by $h(x)=g(f(x))$. And if $f$ is differentiable at a point $x_0$ and $g$ is differentiable at the image point $y_0=f(x_0)$, then the composite function $h$ is differentiable at $x_0$.
First of all, what should the differential be? Remember that the differential $df(x_0)$ is a linear transformation that takes displacements $t\in\mathbb{R}^n$ from the point $x_0$ and turns them into displacements $df(x_0)t\in\mathbb{R}^m$ from the point $y_0=f(x_0)$. Then the differential $dg(y_0)$ is a linear transformation that takes displacements from the point $y_0$ and turns them into displacements from the point $z_0=g(y_0)=h(x_0)$. Putting these together, we have a composite linear transformation $dg(y_0)df(x_0)$ that takes displacements from the point $x_0$ and turns them into displacements from the point $z_0$. I assert that this composite transformation is exactly the differential of the composite function: $dh(x_0)=dg(f(x_0))df(x_0)$.
Just as a sanity check, what happens when we look at single-variable real-valued functions? In this case, $df(x_0)$ and $dg(y_0)$ are both linear transformations from one-dimensional spaces to other one-dimensional spaces. That is, they’re represented as $1\times1$ matrices that just multiply by their single real entry. So the composite of the two transformations is given by the $1\times1$ matrix whose single entry is the product of the two matrices’ single entries. In other words, in one variable the differentials look like single real numbers $f'(x_0)$ and $g'(f(x_0))$, and their composite is given by multiplication: $h'(x_0)=g'(f(x_0))f'(x_0)$. This is exactly the one-variable chain rule. To understand multiple variables we have to move from products of real numbers to compositions of linear transformations, which will be products of real matrices.
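The one-variable case can be checked the same way: the derivative of the composite, computed by finite differences, matches the product of the two “$1\times1$ matrices.” A quick sketch with illustrative choices of $f$, $g$, and $x_0$:

```python
import numpy as np

f = np.sin
g = np.exp
fp = np.cos   # f' = cos, since f = sin
gp = np.exp   # g' = exp, since g = exp

x0 = 0.7
h = 1e-6
# Derivative of the composite g(f(x)) at x0, by central differences
num = (g(f(x0 + h)) - g(f(x0 - h))) / (2 * h)
# Product of the two "1x1 matrices": g'(f(x0)) * f'(x0)
prod = gp(f(x0)) * fp(x0)
assert abs(num - prod) < 1e-6
```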
Okay, so let’s verify that $dg(y_0)df(x_0)$ does indeed act as a differential for $h$. It’s clearly a linear transformation between the appropriate two spaces of displacements, $\mathbb{R}^n$ and $\mathbb{R}^p$. What we need to verify is that it gives a good approximation. That is, for every $\epsilon>0$ there is a $\delta>0$ so that if $\lVert t\rVert<\delta$ we have

$\displaystyle\lVert h(x_0+t)-h(x_0)-dg(y_0)df(x_0)t\rVert<\epsilon\lVert t\rVert$
First of all, since $g$ is differentiable at $y_0$, given $\epsilon>0$ there is a $\delta_g$ so that if $\lVert s\rVert<\delta_g$ we have

$\displaystyle\lVert g(y_0+s)-g(y_0)-dg(y_0)s\rVert<\epsilon\lVert s\rVert$
Now since $f$ is differentiable it satisfies a Lipschitz condition. We showed that this works for real-valued functions, but extending the result to vector values is very straightforward. That is, there is some radius $r$ and a constant $M$ so that if $\lVert t\rVert<r$ we have the inequality $\lVert f(x_0+t)-f(x_0)\rVert<M\lVert t\rVert$. That is, $f$ cannot stretch displacements by more than a factor of $M$ as long as the displacements are small enough.
Now $r$ may be smaller than $\frac{\delta_g}{M}$ already, but just in case let’s shrink it until it is. Then for $\lVert t\rVert<r$ we know that

$\displaystyle\lVert f(x_0+t)-f(x_0)\rVert<M\lVert t\rVert<Mr\leq\delta_g$

so we can use this difference $s=f(x_0+t)-f(x_0)$ as a displacement from $y_0$. We find

$\displaystyle\begin{aligned}\lVert h(x_0+t)-h(x_0)-dg(y_0)\left[f(x_0+t)-f(x_0)\right]\rVert&=\lVert g(f(x_0+t))-g(f(x_0))-dg(y_0)\left[f(x_0+t)-f(x_0)\right]\rVert\\&<\epsilon\lVert f(x_0+t)-f(x_0)\rVert<\epsilon M\lVert t\rVert\end{aligned}$
Now we’re going to find a constant $N$ and a radius $\rho$ so that

$\displaystyle\lVert dg(y_0)\left[f(x_0+t)-f(x_0)-df(x_0)t\right]\rVert<N\epsilon\lVert t\rVert$

whenever $\lVert t\rVert<\rho$. Once this is established, we are done: the triangle inequality combines this with the previous estimate to give

$\displaystyle\lVert h(x_0+t)-h(x_0)-dg(y_0)df(x_0)t\rVert<(M+N)\epsilon\lVert t\rVert$

Given an $\bar{\epsilon}$ we can set $\epsilon=\frac{\bar{\epsilon}}{M+N}$ and let $\delta$ be the smaller of the two resulting radii $r$ and $\rho$. Within this smaller radius, the desired inequality will hold.
To get this result, we choose orthonormal coordinates on the space $\mathbb{R}^m$. We can then use these coordinates to write

$\displaystyle dg(y_0)\left[f(x_0+t)-f(x_0)-df(x_0)t\right]=\sum\limits_{i=1}^m\left[D_ig\right](y_0)\left[f^i(x_0+t)-f^i(x_0)-df^i(x_0)t\right]$
But since each of the several $f^i$ is differentiable we can pick our radius $\rho$ so that all of the inequalities

$\displaystyle\left\lvert f^i(x_0+t)-f^i(x_0)-df^i(x_0)t\right\rvert<\epsilon\lVert t\rVert$

hold for $\lVert t\rVert<\rho$. Then we let $N$ be $m$ times the largest of the magnitudes $\lVert\left[D_ig\right](y_0)\rVert$ of the component partial derivatives, and we’re done.
Thus when $f$ is differentiable at $x_0$ and $g$ is differentiable at $f(x_0)$, the composite $h=g\circ f$ is differentiable at $x_0$, and the differential of the composite function is given by

$\displaystyle dh(x_0)=dg(f(x_0))\circ df(x_0)$

the composite of the differentials, considered as linear transformations.
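In matrix terms, this says the Jacobian of the composite is the matrix product of the two Jacobians. A numerical sketch, with made-up maps $f:\mathbb{R}^2\to\mathbb{R}^3$ and $g:\mathbb{R}^3\to\mathbb{R}^2$ and a finite-difference `jacobian` helper defined here for illustration:

```python
import numpy as np

def jacobian(func, x, h=1e-6):
    """Approximate the Jacobian matrix of func at x by central differences."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(func(x), dtype=float)
    J = np.zeros((y.size, x.size))
    for j in range(x.size):
        e = np.zeros_like(x)
        e[j] = h
        J[:, j] = (np.asarray(func(x + e)) - np.asarray(func(x - e))) / (2 * h)
    return J

# Arbitrary differentiable maps: f: R^2 -> R^3, g: R^3 -> R^2
f = lambda x: np.array([x[0] * x[1], np.sin(x[0]), x[1] ** 2])
g = lambda y: np.array([y[0] + y[1] * y[2], np.exp(y[0])])
h_comp = lambda x: g(f(x))   # the composite g∘f

x0 = np.array([0.3, -1.2])
dh = jacobian(h_comp, x0)                          # Jacobian of the composite
composite = jacobian(g, f(x0)) @ jacobian(f, x0)   # product of the Jacobians
assert np.allclose(dh, composite, atol=1e-5)
```

The $2\times3$ Jacobian of $g$ times the $3\times2$ Jacobian of $f$ matches the $2\times2$ Jacobian of the composite, up to finite-difference error.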