The Unapologetic Mathematician

Mathematics for the interested outsider

Another Existence Proof

I’d like to go back and give a different proof that the Picard iteration converges — one which is closer to the spirit of Newton’s method. In that case, we proved that Newton’s method converged by showing that the derivative of the iterating function was less than one at the desired solution, making it an attracting fixed point.

In this case, however, we don’t have a derivative because our iteration runs over functions rather than numbers. We will replace it with a similar construction called the “functional derivative”, which is a fundamental part of the “calculus of variations”. I’m not really going to go too deep into this field right now, and I’m not going to prove the analogous result that a small functional derivative means an attracting fixed point, but it’s a nice exercise and introduction anyway.

So, we start with the Picard iteration again:

\displaystyle P[v](t)=a+\int\limits_0^tF(v(s))\,ds
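As a quick numerical aside (not part of the argument), we can watch this operator converge for a concrete choice of F. The choices below are assumptions for illustration: F(x) = x with a = 1, so the fixed point of P is the solution of x' = x with x(0) = 1, namely e^t.

```python
import numpy as np

def picard_step(F, a, v, ts):
    """One application of the Picard operator
    P[v](t) = a + integral_0^t F(v(s)) ds,
    approximated on the grid ts by cumulative trapezoidal sums."""
    integrand = F(v)
    dt = np.diff(ts)
    # cumulative trapezoid: the integral from ts[0] up to each ts[i]
    cum = np.concatenate(([0.0], np.cumsum(dt * (integrand[:-1] + integrand[1:]) / 2)))
    return a + cum

# Illustrative example: F(x) = x, a = 1, fixed point is exp(t)
ts = np.linspace(0.0, 0.5, 501)
v = np.full_like(ts, 1.0)   # initial guess v_0(t) = a
for _ in range(20):
    v = picard_step(lambda x: x, 1.0, v, ts)

print(np.max(np.abs(v - np.exp(ts))))   # tiny after 20 iterations
```

After twenty iterations the iterate is indistinguishable from e^t up to quadrature error, which is the convergence the rest of the post is about to explain.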

We consider what happens when we add an adjustment to v:

\displaystyle\begin{aligned}P[v+h](t)&=a+\int\limits_0^tF(v(s)+h(s))\,ds\\&\approx a+\int\limits_0^tF(v(s))+dF(v(s))h(s)\,ds\\&=a+\int\limits_0^tF(v(s))\,ds+\int\limits_0^tdF(v(s))h(s)\,ds\\&=P[v](t)+\int\limits_0^tdF(v(s))h(s)\,ds\end{aligned}
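The approximation in the second line can be sanity-checked numerically: the error of the first-order expansion should shrink quadratically as h shrinks. Here is a sketch under assumed choices (F = sin, so that dF(v(s)) is just multiplication by cos(v(s)), and an arbitrary smooth v and h):

```python
import numpy as np

def cumtrapz(y, ts):
    # cumulative trapezoid integral of y over the grid ts, starting at 0
    dt = np.diff(ts)
    return np.concatenate(([0.0], np.cumsum(dt * (y[:-1] + y[1:]) / 2)))

# Assumed example: F = sin, whose derivative dF(x) acts as cos(x)
ts = np.linspace(0.0, 1.0, 401)
v = np.cos(ts)              # some base curve
h = 0.1 * np.sin(3 * ts)    # a smooth variation

def remainder(h):
    # P[v+h] - P[v] minus the first-order term from the expansion above
    exact = cumtrapz(np.sin(v + h), ts) - cumtrapz(np.sin(v), ts)
    linear = cumtrapz(np.cos(v) * h, ts)
    return np.max(np.abs(exact - linear))

# Halving h should cut the remainder by roughly a factor of 4
print(remainder(h) / remainder(h / 2))
```

The ratio comes out close to 4, as expected for an error that is second order in the size of the variation.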

We call the small change the “variation” of v, and we write \delta v=h. Similarly, we call the difference between P[v+\delta v] and P[v] the variation of P and write \delta P. It turns out that controlling the size of the variation \delta v gives us some control over the size of the variation \delta P. To wit, if \lVert\delta v\rVert_\infty\leq d then we find

\displaystyle\begin{aligned}\left\lVert\int\limits_0^tdF(v(s))\delta v(s)\,ds\right\rVert&\leq\int\limits_0^t\lVert dF(v(s))\delta v(s)\rVert\,ds\\&\leq\int\limits_0^t\lVert dF(v(s))\rVert_\text{op}\lVert\delta v(s)\rVert\,ds\\&\leq d\int\limits_0^t\lVert dF(v(s))\rVert_\text{op}\,ds\end{aligned}

Now our proof that F is locally Lipschitz involved showing that there’s a neighborhood of a where we can bound \lVert dF(x)\rVert_\text{op} by K. Again we can pick a small enough c so that \lvert s\rvert\leq c implies that v(s) stays within this neighborhood, and also such that cK<1. And then we conclude that \lVert\delta P\rVert_\infty\leq cKd<d, which we can also write as

\displaystyle\frac{\delta P}{\delta v}<1
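We can also watch this contraction happen numerically. The setup below is an assumed example: F = sin, for which \lVert dF(x)\rVert_\text{op}=\lvert\cos x\rvert\leq K=1 everywhere, and c = 1/2 so that cK = 1/2 < 1. Perturbing v by a random variation h and comparing the resulting variation in P should give a ratio of at most cK:

```python
import numpy as np

def picard(F, a, v, ts):
    # P[v](t) = a + integral_0^t F(v(s)) ds via cumulative trapezoid sums
    integrand = F(v)
    dt = np.diff(ts)
    cum = np.concatenate(([0.0], np.cumsum(dt * (integrand[:-1] + integrand[1:]) / 2)))
    return a + cum

# Assumed example: F = sin, so K = 1 works globally; pick c = 0.5, cK = 0.5 < 1
c = 0.5
ts = np.linspace(0.0, c, 401)
rng = np.random.default_rng(0)

v = np.cos(ts)                              # some base curve
h = 0.1 * rng.standard_normal(ts.shape)     # the variation delta v
d = np.max(np.abs(h))                       # its sup norm

dP = picard(np.sin, 0.0, v + h, ts) - picard(np.sin, 0.0, v, ts)
ratio = np.max(np.abs(dP)) / d
print(ratio)   # bounded by cK = 0.5
```

The observed ratio sits below 1/2, matching the bound \lVert\delta P\rVert_\infty\leq cK\lVert\delta v\rVert_\infty derived above.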

Now, admittedly this argument is a bit handwavy as it stands. Still, it does go to show the basic idea of the technique, and it’s a nice little introduction to the calculus of variations.

May 10, 2011 - Posted by | Analysis, Calculus of Variations, Differential Equations

