# The Unapologetic Mathematician

## Analytic Functions

Okay, we know that power series define functions, and that the functions so defined have derivatives, which have power series expansions. And thus these derivatives have derivatives themselves, and so on. Thus a function defined by a power series in a given disk is actually infinitely differentiable within that same disk.

What about the other way? Say we have a function $f$ with arbitrarily high derivatives at the point $z_0$. We know that if this function has a power series about $z_0$, then the only possible sequence of coefficients is given by the formula

$\displaystyle a_k=\frac{f^{(k)}(z_0)}{k!}$

But does this sequence actually give a power series expansion of $f$? That is, does the (in)famous “Taylor series”

$\displaystyle\sum\limits_{k=0}^\infty\frac{f^{(k)}(z_0)}{k!}(z-z_0)^k$

converge to the function $f$ in any neighborhood of $z_0$? If so, we’ll call the function “analytic” at $z_0$.
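Before looking at the counterexample below, it may help to see the question in action for a function where the answer *is* “yes”. Here’s a small Python sketch (the function name `taylor_partial_sum` is mine, not from the post) that builds partial sums of the Taylor series for $e^x$ at $z_0=0$, where every derivative is ${1}$, and checks that they converge to the function:

```python
import math

def taylor_partial_sum(f_derivs, z0, z, n):
    """Sum the first n+1 terms of the Taylor series about z0.

    f_derivs(k) should return the k-th derivative of f at z0.
    """
    return sum(f_derivs(k) * (z - z0) ** k / math.factorial(k)
               for k in range(n + 1))

# For f(x) = e^x, every derivative at 0 is 1, and the Taylor series
# really does converge to the function: e^x is analytic at 0.
x = 1.5
approx = taylor_partial_sum(lambda k: 1.0, 0.0, x, 20)
print(approx, math.exp(x))  # the two values agree to many digits
```

The question is whether this always works, and the rest of the post shows that it doesn’t.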

So, are all infinitely-differentiable functions analytic? That is, is every function with arbitrarily high derivatives at $z_0$ actually the limit of its Taylor series near $z_0$? Well, the fact that we have a special name should give a hint that the answer isn’t always “yes”.

We’ve been working with complex power series, but let’s specialize now to real power series. That is, all the coefficients are real, we center them around real points, and they converge within a real disk — an interval — of a given radius.

Now in this context we can consider the function defined by $f(x)=e^{-x^{-2}}$ away from $x=0$. It’s straightforward to calculate

$\displaystyle\lim\limits_{x\rightarrow0}e^{-x^{-2}}=0$

And if we define $f(0)=0$ it turns out to even be differentiable there. The derivative turns out to be $\left(2x^{-3}\right)e^{-x^{-2}}$. And we can also calculate

$\displaystyle\lim\limits_{x\rightarrow0}2x^{-3}e^{-x^{-2}}=0$
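A quick numerical sketch of these two limits (the function names are mine; this just patches in the value ${0}$ at $x=0$ as described above):

```python
import math

def f(x):
    # f(x) = exp(-1/x^2) away from 0, patched with f(0) = 0
    return 0.0 if x == 0 else math.exp(-x ** -2)

def f_prime(x):
    # f'(x) = 2 x^{-3} exp(-1/x^2) away from 0, patched with f'(0) = 0
    return 0.0 if x == 0 else 2 * x ** -3 * math.exp(-x ** -2)

for x in (0.5, 0.2, 0.1):
    print(x, f(x), f_prime(x))  # both columns shrink rapidly toward 0
```

Already at $x=0.1$ the function value is $e^{-100}$, which is smaller than $10^{-43}$, and multiplying by $2x^{-3}=2000$ barely dents it.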

And so on. The $n$th derivative will be $P_n(x^{-1})e^{-x^{-2}}$, where $P_n$ is a polynomial. We can calculate

$\frac{d}{dx}P_n(x^{-1})e^{-x^{-2}}=\left(P'_n(x^{-1})(-x^{-2})+P_n(x^{-1})(2x^{-3})\right)e^{-x^{-2}}$

That is, we can set $P_0=1$ and $P_{n+1}=2X^3P_n-X^2P_n'$, and thus recursively define a sequence of polynomials.
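The recursion $P_0=1$, $P_{n+1}=2X^3P_n-X^2P_n'$ is easy to carry out by hand for small $n$, but here’s a sketch that mechanizes it, representing a polynomial as its list of coefficients $[c_0, c_1, \dots]$ in $X$ (all the helper names are mine):

```python
def poly_derivative(p):
    # derivative of sum c_k X^k is sum k c_k X^{k-1}
    return [k * c for k, c in enumerate(p)][1:] or [0]

def shift(p, m):
    # multiply a polynomial by X^m
    return [0] * m + p

def poly_sub(p, q):
    n = max(len(p), len(q))
    p = p + [0] * (n - len(p))
    q = q + [0] * (n - len(q))
    return [a - b for a, b in zip(p, q)]

def P(n):
    # P_0 = 1, and P_{n+1} = 2 X^3 P_n - X^2 P_n'
    p = [1]
    for _ in range(n):
        p = poly_sub(shift([2 * c for c in p], 3), shift(poly_derivative(p), 2))
    return p

print(P(1))  # [0, 0, 0, 2]            i.e. 2 X^3
print(P(2))  # [0, 0, 0, 0, -6, 0, 4]  i.e. 4 X^6 - 6 X^4
```

The first step recovers the derivative computed above, since $f'(x)=2x^{-3}e^{-x^{-2}}$ means $P_1=2X^3$, and differentiating once more gives $P_2=4X^6-6X^4$.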

Thus each derivative is a polynomial in $x^{-1}$ multiplied by $e^{-x^{-2}}$, and in the limit as $x\rightarrow0$ the latter clearly wins the race: the exponential decays faster than any power of $x^{-1}$ can grow. For each derivative we can fill in the “gap” at $x=0$ by defining $f^{(n)}(0)=0$.

But now what happens when we set up the Taylor series around $x_0=0$? The series is

$\displaystyle\sum\limits_{k=0}^\infty\frac{f^{(k)}(0)}{k!}x^k=\sum\limits_{k=0}^\infty\frac{0}{k!}x^k$

Which clearly converges to the constant function ${0}$. That is, the Taylor series of this function at $x_0=0$ converges to nothing like the function itself. This function is infinitely differentiable at ${0}$, but it is not analytic there.
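To see the failure concretely, here’s a sketch comparing the function against its Taylor series at a point near zero (function names are mine; the “Taylor series” is just the zero series the post derives):

```python
import math

def f(x):
    # the patched function: exp(-1/x^2) for x != 0, and 0 at x = 0
    return 0.0 if x == 0 else math.exp(-x ** -2)

def taylor_at_zero(x, n):
    # every f^{(k)}(0) is 0, so every partial sum is identically 0
    return sum(0.0 * x ** k for k in range(n + 1))

x = 0.5
print(f(x))                    # about 0.0183: the function is not zero here
print(taylor_at_zero(x, 100))  # 0.0: the Taylor series misses it entirely
```

No matter how many terms we take, or how close to ${0}$ we look, the series gives ${0}$ while the function doesn’t.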

There are a lot of theorems about what conditions on an infinitely-differentiable function make it analytic, but I’m going to leave them alone for now.

September 27, 2008 Posted by | Analysis, Calculus