The Inverse Function Theorem
At last we come to the theorem that I promised. Let $f:S\rightarrow\mathbb{R}^n$ be continuously differentiable on an open region $S\subseteq\mathbb{R}^n$, and let $T=f(S)$. If the Jacobian determinant $J_f(a)\neq0$ at some point $a\in S$, then there is a uniquely determined function $g$ and two open sets $X\subseteq S$ and $Y\subseteq T$ so that
- $a\in X$, and $f(a)\in Y$
- $Y=f(X)$
- $f$ is injective on $X$
- $g$ is defined on $Y$, $g(Y)=X$, and $g(f(x))=x$ for all $x\in X$
- $g$ is continuously differentiable on $Y$
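To see what the theorem does and does not promise, here is a quick example of my own (not from the discussion above): the map $f(x,y)=(e^x\cos y,\,e^x\sin y)$, the real form of the complex exponential. Its Jacobian determinant is

```latex
\[
  J_f(x,y)
  = \det\begin{pmatrix} e^x\cos y & -e^x\sin y \\ e^x\sin y & e^x\cos y \end{pmatrix}
  = e^{2x}\cos^2 y + e^{2x}\sin^2 y
  = e^{2x} \neq 0
\]
```

so the theorem provides a local inverse around every point. But since $f(x,y+2\pi)=f(x,y)$, the map is not globally injective: the open sets $X$ and $Y$ genuinely must be taken small enough, and cannot always be the whole domain and image.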
The Jacobian determinant $J_f(x)$ is continuous as a function of $x$, so there is some neighborhood $N_1$ of $a$ so that the Jacobian is nonzero within $N_1$. Our second lemma tells us that there is a smaller neighborhood $N\subseteq N_1$ on which $f$ is injective. We pick some closed ball $\overline{K}\subseteq N$ centered at $a$, with interior $K$, and use our first lemma to find that $f(\overline{K})$ must contain an open neighborhood $Y$ of $f(a)$. Then we define $X=K\cap f^{-1}(Y)$, which is open since both $K$ and $f^{-1}(Y)$ are (the latter by the continuity of $f$). Since $f$ is injective on the compact set $\overline{K}$, it has a uniquely-defined continuous inverse $g$ on $Y$. This establishes the first four of the conditions of the theorem.
Now the hard part is showing that $g$ is continuously differentiable on $Y$. To this end, like we did in our second lemma, we define the function

$\displaystyle H(z_1,\dots,z_n)=\det\left(D_jf_i(z_i)\right)$
along with a neighborhood $N_2$ of $a$ so that as long as all the $z_i$ are within $N_2$ this function is nonzero. Without loss of generality we can go back and choose our earlier neighborhood $N$ so that $N\subseteq N_2$, and thus that $\overline{K}\subseteq N_2$.
To show that the partial derivative $D_jg_i$ exists at a point $y\in Y$, we consider the difference quotient

$\displaystyle\frac{g_i(y+he_j)-g_i(y)}{h}$
with $y+he_j$ also in $Y$ for sufficiently small nonzero $h$. Then writing $x=g(y)$ and $x'=g(y+he_j)$ we find $f(x')-f(x)=he_j$. The mean value theorem then tells us that

$\displaystyle\delta_{kj}=\frac{f_k(x')-f_k(x)}{h}=\sum\limits_{l=1}^nD_lf_k(\xi_k)\frac{x'_l-x_l}{h}$

for some $\xi_k$ on the segment from $x$ to $x'$ (no summation on $k$). As usual, $\delta_{kj}$ is the Kronecker delta.
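Writing $x=g(y)$ and $x'=g(y+he_j)$ as above, with $\xi_k$ the intermediate points supplied by the mean value theorem, the system can be spelled out in matrix form (my notation for the expanded version):

```latex
\[
  \begin{pmatrix}
    D_1f_1(\xi_1) & \cdots & D_nf_1(\xi_1) \\
    \vdots        & \ddots & \vdots        \\
    D_1f_n(\xi_n) & \cdots & D_nf_n(\xi_n)
  \end{pmatrix}
  \begin{pmatrix} \frac{x'_1-x_1}{h} \\ \vdots \\ \frac{x'_n-x_n}{h} \end{pmatrix}
  = e_j,
  \qquad
  \frac{x'_i-x_i}{h}
  = \frac{\det M_i}{\det\bigl(D_lf_k(\xi_k)\bigr)}
\]
```

where the right-hand equation is Cramer's rule, and $M_i$ is the coefficient matrix with its $i$-th column replaced by $e_j$. Both numerator and denominator are determinants built entirely from partial derivatives of $f$.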
This is a linear system of equations in the unknowns $\frac{x'_l-x_l}{h}$, which has a unique solution since the determinant of its matrix, $\det\left(D_lf_k(\xi_k)\right)$, is nonzero, all the $\xi_k$ lying within $\overline{K}$. We use Cramer's rule to solve it, and get an expression for our difference quotient as a quotient of two determinants. This is why we want the form of the solution given by Cramer's rule, and not one from a more computationally-efficient method like Gaussian elimination.
As $h$ approaches zero, continuity of $g$ tells us that $x'$ approaches $x$, and thus so do all of the $\xi_k$. Therefore the determinant in the denominator of Cramer's rule approaches $J_f(x)\neq0$ in the limit, and thus the limits of the solutions given by Cramer's rule actually do exist.
This establishes that the partial derivative $D_jg_i$ exists at each point of $Y$. Further, since we found the limit of the difference quotient by Cramer's rule, we have an expression for $D_jg_i$ given by the quotient of two determinants, each of which only involves the partial derivatives of $f$, which are themselves all continuous. Therefore the partial derivatives of $g$ not only exist but are in fact continuous.
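The conclusion can be checked numerically. Below is a small Python sketch of my own (using the complex-exponential map as an assumed example, not anything from the original argument): it computes the Jacobian of the local inverse two ways, once by Cramer's rule applied to the Jacobian of $f$, and once by difference quotients of an explicit local inverse, and confirms that the two agree.

```python
import math

# Illustrative example (my choice, not from the post): for
# f(x, y) = (e^x cos y, e^x sin y) the Jacobian determinant is
# e^{2x} != 0, so f has a local C^1 inverse g near any point.
# Near points with u > 0 an explicit inverse is
# g(u, v) = (log sqrt(u^2 + v^2), atan2(v, u)).

def f(x, y):
    return (math.exp(x) * math.cos(y), math.exp(x) * math.sin(y))

def g(u, v):
    return (0.5 * math.log(u * u + v * v), math.atan2(v, u))

def jacobian_f(x, y):
    ex = math.exp(x)
    return [[ex * math.cos(y), -ex * math.sin(y)],
            [ex * math.sin(y),  ex * math.cos(y)]]

def inverse_2x2(m):
    # Invert a 2x2 matrix by Cramer's rule -- the same solution
    # method used in the proof above.
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    return [[ m[1][1] / det, -m[0][1] / det],
            [-m[1][0] / det,  m[0][0] / det]]

def jacobian_g_numeric(u, v, h=1e-6):
    # Central difference quotients approximating Dg at (u, v).
    cols = []
    for du, dv in ((h, 0.0), (0.0, h)):
        p = g(u + du, v + dv)
        q = g(u - du, v - dv)
        cols.append(((p[0] - q[0]) / (2 * h), (p[1] - q[1]) / (2 * h)))
    # cols holds the columns of the Jacobian; transpose into rows.
    return [[cols[0][0], cols[1][0]],
            [cols[0][1], cols[1][1]]]

x, y = 0.3, 0.7
u, v = f(x, y)
exact = inverse_2x2(jacobian_f(x, y))   # Dg(f(x)) = Df(x)^{-1}
approx = jacobian_g_numeric(u, v)
for i in range(2):
    for j in range(2):
        assert abs(exact[i][j] - approx[i][j]) < 1e-5
```

The assertion passing reflects exactly the identity the theorem delivers: the derivative of the inverse is the inverse of the derivative.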
[…] The Implicit Function Theorem II Okay, today we’re going to prove the implicit function theorem. We’re going to think of our function as taking an $n$-dimensional vector $x$ and an $m$-dimensional vector $t$ and giving back an $n$-dimensional vector $f(x;t)$. In essence, what we want to do is see how this output vector must change as we change $t$, and then undo that by making a corresponding change in $x$. And to do that, we need to know how changing the output changes $x$, at least in a neighborhood of our starting point. That is, we’ve got to invert a function, and we’ll need to use the inverse function theorem. […]
Pingback by The Implicit Function Theorem II « The Unapologetic Mathematician | November 20, 2009
[…] all of these cases, we know that the inverse function exists because of the inverse function theorem. Here the Jacobian determinant is simply the derivative $f'(x)$, which we’re assuming is everywhere […]
Pingback by Change of Variables in Multiple Integrals I « The Unapologetic Mathematician | January 5, 2010
[…] assume that $f$ is injective and that the Jacobian determinant $J_f$ is everywhere nonzero on $S$. The inverse function theorem tells us that we can define a continuously differentiable inverse $g$ on all of the image […]
Pingback by Change of Variables in Multiple Integrals II « The Unapologetic Mathematician | January 6, 2010
[…] the inverse function theorem from multivariable calculus: if $f$ is a map defined on an open region $S\subseteq\mathbb{R}^n$, and if the Jacobian of $f$ has […]
Pingback by The Inverse Function Theorem « The Unapologetic Mathematician | April 14, 2011