## The Taylor Series of the Exponential Function

Sorry for the lack of a post yesterday, but I was really tired after this weekend.

So what functions might we try finding a power series expansion for? Polynomials would be boring, because they already *are* power series that cut off after a finite number of terms. What other interesting functions do we have?

Well, one that’s particularly nice is the exponential function $\exp(x)$. We know that this function is its own derivative, and so it has infinitely many derivatives. In particular, $\exp(0)=1$, $\exp'(0)=1$, $\exp''(0)=1$, …, $\exp^{(n)}(0)=1$, and so on.

So we can construct the Taylor series at $x=0$. The coefficient formula tells us

$$a_n=\frac{\exp^{(n)}(0)}{n!}=\frac{1}{n!}$$

which gives us the series

$$\sum\limits_{n=0}^{\infty}\frac{x^n}{n!}$$
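As a quick numerical sanity check (a sketch, not part of the argument), we can compare partial sums of this series against Python’s built-in `math.exp`:

```python
import math

def exp_partial_sum(x, n_terms):
    """Partial sum sum_{n=0}^{n_terms-1} x^n / n! of the Taylor series."""
    total = 0.0
    term = 1.0  # the n = 0 term: x^0 / 0! = 1
    for n in range(n_terms):
        total += term
        term *= x / (n + 1)  # next term: multiply by x / (n + 1)
    return total

# The partial sums approach exp(x) rapidly:
for x in (1.0, -2.0, 5.0):
    print(x, exp_partial_sum(x, 40), math.exp(x))
```

Thirty or forty terms already agree with `math.exp` to near machine precision for moderate $x$, which is what the factorial in the denominator buys us.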

We use the ratio test to calculate the radius of convergence. We calculate

$$\lim\limits_{n\to\infty}\left|\frac{x^{n+1}/(n+1)!}{x^n/n!}\right|=\lim\limits_{n\to\infty}\frac{|x|}{n+1}=0$$
Thus the series converges absolutely no matter what value we pick for $x$: the radius of convergence is infinite, and the series converges everywhere.
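To see the ratio test at work numerically (an illustration, with $x=10$ chosen arbitrarily), the ratio of consecutive terms is $|x|/(n+1)$, which drops below one as soon as $n+1>|x|$ and then tends to zero:

```python
def term_ratio(x, n):
    """Ratio of consecutive Taylor terms |x^(n+1)/(n+1)!| / |x^n/n!| = |x|/(n+1)."""
    return abs(x) / (n + 1)

x = 10.0
# Even for a largish fixed x, the ratio falls below 1 and keeps shrinking:
for n in (0, 9, 99, 999):
    print(n, term_ratio(x, n))
```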

But does this series converge back to the exponential function? Taylor’s Theorem tells us that

$$\exp(x)=\sum\limits_{k=0}^{n}\frac{x^k}{k!}+R_n(x)$$

where there is some $\xi_n$ between $0$ and $x$ so that $R_n(x)=\frac{\exp(\xi_n)}{(n+1)!}x^{n+1}$.

Now the derivative of $\exp$ is $\exp$ again, and $\exp$ takes only positive values. And so we know that $\exp$ is everywhere increasing. What does this mean? Well, if $x\geq0$ then $\xi_n\leq x$, and so $\exp(\xi_n)\leq\exp(x)$. On the other hand, if $x<0$ then $\xi_n<0$, and so $\exp(\xi_n)<\exp(0)=1$. Either way, we have some uniform bound $M$ on $\exp(\xi_n)$ no matter what the $\xi_n$ are; for instance, $M=\max\{1,\exp(x)\}$ works.

So now we know $|R_n(x)|\leq M\frac{|x|^{n+1}}{(n+1)!}$. And it’s not too hard to see (though I can’t seem to find it in my archives) that $n!$ grows much faster than $x^n$ for any fixed $x$. Basically, the idea is that each time $n$ increases you’re multiplying by $\frac{|x|}{n+1}$, which eventually gets less than, and stays less than, one. The upshot is that the remainder term must converge to $0$ for any fixed $x$, and so the series indeed converges to the function $\exp(x)$.
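The shrinking remainder bound can be checked numerically too (a sketch, using $M=\exp(|x|)$, which dominates the bound $\max\{1,\exp(x)\}$ from the argument above):

```python
import math

def remainder_bound(x, n):
    """Upper bound M * |x|^(n+1) / (n+1)! on the Taylor remainder R_n(x),
    with M = exp(|x|) serving as the uniform bound on exp(xi_n)."""
    M = math.exp(abs(x))
    return M * abs(x) ** (n + 1) / math.factorial(n + 1)

# For a fixed x, the bound collapses to 0 as n grows:
for n in (5, 10, 20, 40):
    print(n, remainder_bound(3.0, n))
```

The bound briefly grows while $|x|/(n+1)>1$, but once $n$ passes $|x|$ each new factor is below one and the decay is faster than geometric.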