The Unapologetic Mathematician

Mathematics for the interested outsider

The Exponential Series

What is it that makes the exponential what it is? We defined it as the inverse of the logarithm, which in turn we defined by integrating \frac{1}{x}. But the important thing we immediately showed is that it satisfies the exponential property.

But now we know the Taylor series of the exponential function at {0}:

\displaystyle\exp(x)=\sum\limits_{k=0}^\infty\frac{x^k}{k!}

In fact, we can work out the series around any other point the same way. Since all the derivatives are the exponential function back again, we find

\displaystyle\exp(x)=\sum\limits_{k=0}^\infty\frac{\exp(x_0)}{k!}(x-x_0)^k

We could also write this by expanding around a and writing the relation as a series in the displacement b=x-a:

\displaystyle\exp(a+b)=\sum\limits_{l=0}^\infty\frac{\exp(a)}{l!}b^l

Then we can expand out the \exp(a) part as a series itself:

\displaystyle\exp(a+b)=\sum\limits_{l=0}^\infty\left(\sum\limits_{k=0}^\infty\frac{a^k}{k!}\right)\frac{b^l}{l!}

But then (with our usual handwaving about rearranging series) we can pull out the inner series since it doesn’t depend on the outer summation variable at all:

\displaystyle\exp(a+b)=\left(\sum\limits_{k=0}^\infty\frac{a^k}{k!}\right)\left(\sum\limits_{l=0}^\infty\frac{b^l}{l!}\right)

And these series are just the series defining \exp(a) and \exp(b), respectively. That is, we have shown the exponential property \exp(a+b)=\exp(a)\exp(b) directly from the series expansion.

That is, whatever function the power series \sum\limits_{k=0}^\infty\frac{x^k}{k!} defines, it satisfies the exponential property. In a sense, the fact that the inverse of this function turns out to be the logarithm is a big coincidence. But it’s a coincidence we’ll tease out tomorrow.

For now I’ll note that this important exponential property follows directly from the series. And we can write down the series anywhere we can add, subtract, multiply, divide (at least by nonzero integers), and talk about convergence. That is, the exponential series makes sense in any topological ring where we can divide by nonzero integers (any topological \mathbb{Q}-algebra, for instance). For example, we can define the exponential of complex numbers by the series

\displaystyle\exp(z)=\sum\limits_{k=0}^\infty\frac{z^k}{k!}

Finally, this series will have the exponential property as above, so long as the ring is commutative (like it is for the complex numbers). In more general rings there’s a generalized version of the exponential property, but I’ll leave that until we eventually need to use it.
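If you like to see the exponential property fall out of the series numerically, here is a quick illustrative sketch in Python (the helper name and truncation level are my own choices, not part of the argument): we sum the first few dozen terms of the series for complex arguments and check that \exp(a+b) and \exp(a)\exp(b) agree up to truncation and rounding error.

```python
# Illustrative only: partial sums of the exponential series for complex z,
# and a numerical check of the exponential property exp(a+b) = exp(a) exp(b).

def exp_series(z, terms=40):
    """Partial sum of sum_{k=0}^{terms-1} z^k / k!."""
    total = 0.0 + 0.0j
    term = 1.0 + 0.0j            # the k = 0 term, z^0/0!
    for k in range(terms):
        total += term
        term *= z / (k + 1)      # next term: multiply by z/(k+1)
    return total

a, b = 1.0 + 2.0j, -0.5 + 0.3j
print(abs(exp_series(a + b) - exp_series(a) * exp_series(b)))   # essentially zero
```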

October 8, 2008 Posted by | Analysis, Calculus, Power Series | 3 Comments

The Taylor Series of the Exponential Function

Sorry for the lack of a post yesterday, but I was really tired after this weekend.

So what functions might we try finding a power series expansion for? Polynomials would be boring, because they already are power series that cut off after a finite number of terms. What other interesting functions do we have?

Well, one that’s particularly nice is the exponential function \exp. We know that this function is its own derivative, and so it has infinitely many derivatives. In particular, \exp(0)=1, \exp'(0)=1, \exp''(0)=1, …, \exp^{(n)}(0)=1, and so on.

So we can construct the Taylor series at {0}. The coefficient formula tells us

\displaystyle a_k=\frac{\exp^{(k)}(0)}{k!}=\frac{1}{k!}

which gives us the series

\displaystyle\sum\limits_{k=0}^\infty\frac{x^k}{k!}

We use the ratio test to calculate the radius of convergence. We calculate

\displaystyle\limsup\limits_{k\rightarrow\infty}\left|\frac{\frac{x^{k+1}}{(k+1)!}}{\frac{x^k}{k!}}\right|=\limsup\limits_{k\rightarrow\infty}\left|\frac{x^{k+1}k!}{x^k(k+1)!}\right|=\limsup\limits_{k\rightarrow\infty}\left|\frac{x}{(k+1)}\right|=0

Thus the series converges absolutely no matter what value we pick for x. The radius of convergence is thus infinite, and the series converges everywhere.

But does this series converge back to the exponential function? Taylor’s Theorem tells us that

\displaystyle\exp(x)=\left(\sum\limits_{k=0}^n\frac{x^k}{k!}\right)+R_n(x)

where there is some \xi_n between {0} and x so that R_n(x)=\frac{\exp(\xi_n)x^{n+1}}{(n+1)!}.

Now the derivative of \exp is \exp again, and \exp takes only positive values. And so we know that \exp is everywhere increasing. What does this mean? Well, if x\leq0 then \xi_n\leq0, and so \exp(\xi_n)\leq\exp(0)=1. On the other hand, if x\geq0 then \xi_n\leq x, and so \exp(\xi_n)\leq\exp(x). Either way, we have a bound M on \exp(\xi_n) which depends on x but not on n.

So now we know |R_n(x)|\leq\frac{M|x|^{n+1}}{(n+1)!}. And it’s not too hard to see (though I can’t seem to find it in my archives) that n! grows much faster than |x|^n for any fixed x. Basically, the idea is that each step multiplies the term by \frac{x}{n+1}, which eventually gets (and stays) smaller than \frac{1}{2} in absolute value, so from then on the terms shrink at least geometrically. The upshot is that the remainder term R_n(x) must converge to {0} for any fixed x, and so the series indeed converges to the function \exp(x).
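Here’s a small numerical sketch of that remainder estimate (illustrative only, not part of the argument): for a fixed x the distance between \exp(x) and the n-th partial sum stays below M|x|^{n+1}/(n+1)!, and both quantities collapse to zero as n grows.

```python
# Illustrative check of the remainder bound |R_n(x)| <= M |x|^{n+1} / (n+1)!,
# here with x >= 0 so that M = exp(x) works as the bound on exp(xi_n).
import math

x = 3.0
M = math.exp(x)
partial, term = 0.0, 1.0                 # running partial sum and current term x^k/k!
for n in range(15):
    partial += term                      # partial now includes terms 0 through n
    term *= x / (n + 1)
    remainder = abs(math.exp(x) - partial)
    bound = M * x ** (n + 1) / math.factorial(n + 1)
    print(n, remainder, bound)           # remainder stays below the bound; both shrink to 0
```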

October 7, 2008 Posted by | Analysis, Calculus, Power Series | 5 Comments

Inverses of Power Series

Now that we know how to compose power series, we can invert them. But against expectations I’m talking about multiplicative inverses instead of compositional ones.

More specifically, say we have a power series expansion

\displaystyle p(z)=\sum\limits_{n=0}^\infty p_nz^n

within the radius r, and such that p(0)=p_0\neq0. Then there is some radius \delta within which the reciprocal has a power series expansion

\displaystyle\frac{1}{p(z)}=\sum\limits_{n=0}^\infty q_nz^n

In particular, we have q_0=\frac{1}{p_0}.

In the proof we may assume that p_0=1 — we can just divide the series through by p_0 — and so p(0)=1. We can set

\displaystyle P(z)=1+\sum\limits_{n=1}^\infty\left|p_nz^n\right|

within the radius r. Since we know that P(0)=1, continuity tells us that there’s \delta so that |z|<\delta implies |P(z)-1|<1.

Now we set

\displaystyle f(z)=\frac{1}{1-z}=\sum\limits_{n=0}^\infty z^n
\displaystyle g(z)=1-p(z)=\sum\limits_{n=1}^\infty -p_nz^n

And then, since |z|<\delta gives \sum\limits_{n=1}^\infty\left|p_nz^n\right|=P(z)-1<1, and {1} is the radius of convergence of the series for f, we can use the composition theorem to find a power series expansion of f\left(g(z)\right)=\frac{1}{p(z)}.

It’s interesting to note that you might expect a reciprocal formula to follow from the multiplication formula. Set the product of p(z) and an undetermined q(z) to the power series 1+0z+0z^2+..., and get an infinite sequence of algebraic conditions determining q_n in terms of the p_i. Showing that these can all be solved is possible, but it’s easier to come around the side like this.
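For the curious, here is a sketch of that degree-by-degree approach (my own illustration, with a hypothetical helper name): setting the product p(z)q(z) equal to {1} and reading off one coefficient at a time gives a simple recursion for the q_n.

```python
# Illustrative sketch: solve p(z) q(z) = 1 degree by degree.  The degree-n
# condition sum_{k=0}^{n} p_k q_{n-k} = 0 (for n >= 1) determines q_n once
# q_0, ..., q_{n-1} are known, provided p_0 != 0.

def reciprocal_coeffs(p, N):
    """First N coefficients q_n of 1/p(z), given the coefficients p_n with p[0] != 0."""
    q = [1.0 / p[0]]
    for n in range(1, N):
        s = sum(p[k] * q[n - k] for k in range(1, min(n, len(p) - 1) + 1))
        q.append(-s / p[0])
    return q

# Example: p(z) = 1 - z, whose reciprocal is the geometric series 1 + z + z^2 + ...
print(reciprocal_coeffs([1.0, -1.0], 6))   # [1.0, 1.0, 1.0, 1.0, 1.0, 1.0]
```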

September 24, 2008 Posted by | Analysis, Calculus, Power Series | Leave a comment

Composition of Power Series

Now that we can take powers of functions defined by power series, and expand those powers as power series within the same radii, we’re all set to compose functions defined by power series!

Let’s say we have two power series expansions about z=0:

\displaystyle f(z)=\sum\limits_{n=0}^\infty a_nz^n

within the radius r, and

\displaystyle g(z)=\sum\limits_{n=0}^\infty b_nz^n

within the radius R.

Now let’s take a z_1 with |z_1|<R and \sum\limits_{n=0}^\infty\left|b_nz_1^n\right|<r. Then at any such point we have a power series expansion for the composite:

\displaystyle f\left(g(z)\right)=\sum\limits_{n=0}^\infty c_nz^n.

The coefficients c_n are defined as follows: first, define b_n(k) to be the coefficient of z^n in the expansion of g(z)^k, then we set

\displaystyle c_n=\sum\limits_{k=0}^\infty a_kb_n(k)
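To make the formula concrete, here is a small illustrative computation (truncated, and with helper names of my own choosing): we build the coefficients b_n(k) of g(z)^k by repeated multiplication and then assemble the c_n.

```python
# Illustrative, truncated computation of the composite's coefficients
# c_n = sum_k a_k b_n(k), where b_n(k) is the z^n coefficient of g(z)^k.
import math

def cauchy_product(u, v, N):
    """First N coefficients of the product of the series with coefficients u and v."""
    return [sum(u[i] * v[n - i] for i in range(n + 1)) for n in range(N)]

def compose_coeffs(a, b, N):
    """First N coefficients of f(g(z)), with f and g given by coefficient lists a and b."""
    b = (b + [0.0] * N)[:N]                   # pad or truncate g's coefficients to degree N
    c = [0.0] * N
    power = [1.0] + [0.0] * (N - 1)           # coefficients b_n(0) of g(z)^0 = 1
    for k in range(len(a)):
        for n in range(N):
            c[n] += a[k] * power[n]           # add the a_k b_n(k) contribution
        power = cauchy_product(power, b, N)   # move on to the coefficients of g(z)^{k+1}
    return c

# Example: f(w) = exp(w) (truncated) composed with g(z) = 2z should give exp(2z),
# whose coefficients are 2^n / n!.
a = [1.0 / math.factorial(k) for k in range(15)]
print(compose_coeffs(a, [0.0, 2.0], 6))       # ~ [1, 2, 2, 1.333..., 0.666..., 0.266...]
```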

To show this, first note that the hypothesis on z_1 assures that |g(z_1)|<r, so we can write

\displaystyle f\left(g(z_1)\right)=\sum\limits_{k=0}^\infty a_kg(z_1)^k=\sum\limits_{k=0}^\infty\sum\limits_{n=0}^\infty a_kb_n(k)z_1^n

If we are allowed to exchange the order of summation, then formally the result follows. To justify this (at least as well as we’ve been justifying such rearrangements recently) we need to show that

\displaystyle\sum\limits_{k=0}^\infty\sum\limits_{n=0}^\infty\left|a_kb_n(k)z_1^n\right|=\sum\limits_{k=0}^\infty\left|a_k\right|\sum\limits_{n=0}^\infty\left|b_n(k)z_1^n\right|

converges. But remember that each of the coefficients b_n(k) is itself a finite sum, so we find

\displaystyle\left|b_n(k)\right|\leq\sum\limits_{m_1+...+m_k=n}\left|b_{m_1}\right|...\left|b_{m_k}\right|

On the other hand, in parallel with our computation last time we find that

\displaystyle\left(\sum\limits_{n=0}^\infty\left|b_n\right|z^n\right)^k=\sum\limits_{n=0}^\infty B_n(k)z^n

where

\displaystyle B_n(k)=\sum\limits_{m_1+...+m_k=n}\left|b_{m_1}\right|...\left|b_{m_k}\right|

So we find

\displaystyle\begin{aligned}\sum\limits_{k=0}^\infty\left|a_k\right|\sum\limits_{n=0}^\infty\left|b_n(k)z_1^n\right|\leq\sum\limits_{k=0}^\infty\left|a_k\right|\sum\limits_{n=0}^\infty B_n(k)\left|z_1^n\right|\\=\sum\limits_{k=0}^\infty\left|a_k\right|\left(\sum\limits_{n=0}^\infty\left|b_nz_1^n\right|\right)^k\end{aligned}

which must then converge, since \sum\limits_{n=0}^\infty\left|b_nz_1^n\right|<r and the series defining f converges absolutely within the radius r.

Breathe!

September 24, 2008 Posted by | Analysis, Calculus, Power Series | 1 Comment

Products of Power Series

Formally, we defined the product of two power series to be the series you get when you multiply out all the terms and collect terms of the same degree. Specifically, consider the series \sum\limits_{n=0}^\infty a_nz^n and \sum\limits_{n=0}^\infty b_nz^n. Their product will be the series \sum\limits_{n=0}^\infty c_nz^n, where the coefficients are defined by

\displaystyle c_n=\sum\limits_{k+l=n}a_kb_l=\sum\limits_{k=0}^na_kb_{n-k}

Now if the series converge within radii R_a and R_b, respectively, it wouldn’t make sense for the product of the functions to be anything but whatever this series converges to. But in what sense, and where, does it converge?

Like when we translated power series, I’m going to sort of wave my hands here, motivating it by the fact that absolute convergence makes things nice.

Let’s take a point z_1 inside both of the radii of convergence. Then we know that the series \sum\limits_{n=0}^\infty a_nz_1^n and \sum\limits_{n=0}^\infty b_nz_1^n both converge absolutely. We want to consider the product of these limits

\displaystyle\left(\sum\limits_{k=0}^\infty a_kz_1^k\right)\left(\sum\limits_{l=0}^\infty b_lz_1^l\right)

Since the limit of the first sequence converges, we’ll just take it as a constant and distribute it over the other:

\displaystyle\sum\limits_{l=0}^\infty\left(\sum\limits_{k=0}^\infty a_kz_1^k\right)b_lz_1^l

And now we’ll just distribute each b_lz_1^l across the sum it appears with:

\displaystyle\sum\limits_{l=0}^\infty\left(\sum\limits_{k=0}^\infty a_kb_lz_1^{k+l}\right)

And now we’ll use the fact that all the series in sight converge absolutely to rearrange this sum, adding up all the terms of the same total degree at once, and pull out factors of z_1^n:

\displaystyle\sum\limits_{n=0}^\infty\left(\sum\limits_{k+l=n}a_kb_lz_1^n\right)=\sum\limits_{n=0}^\infty\left(\sum\limits_{k+l=n}a_kb_l\right)z_1^n
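As a quick illustrative check of the coefficient formula (not part of the argument, and the helper name is hypothetical), squaring the geometric series term by term produces the expansion of \frac{1}{(1-z)^2}:

```python
# Illustrative: the Cauchy product c_n = sum_{k=0}^{n} a_k b_{n-k}.

def product_coeffs(a, b):
    """Coefficients of the product series, up to the length of the shorter list."""
    N = min(len(a), len(b))
    return [sum(a[k] * b[n - k] for k in range(n + 1)) for n in range(N)]

# (sum z^n)(sum z^n) should be sum (n+1) z^n, the expansion of 1/(1-z)^2.
a = [1.0] * 8
print(product_coeffs(a, a))   # [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
```

Squaring like this is also the simplest instance (p=2) of the powers formula that follows.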

As a special case, we can work out powers of power series. Say that f(z)=\sum\limits_{n=0}^\infty a_n(z-z_0)^n within a radius of R. Then within the same radius of R we have

\displaystyle f(z)^p=\sum\limits_{n=0}^\infty c_n(p)(z-z_0)^n

where the coefficients are defined by

\displaystyle c_n(p)=\sum\limits_{m_1+m_2+...+m_p=n}a_{m_1}a_{m_2}...a_{m_p}

September 22, 2008 Posted by | Analysis, Calculus, Power Series | 2 Comments

Uniqueness of Power Series Expansions

Sorry for the delay. Grading.

Now we have power series expansions of functions around various points, and within various radii of convergence. We even have formulas to relate expansions about nearby points. But when we move from one point to a nearby point, the resulting series is only guaranteed to converge in a disk contained within the original disk. But then moving back to the original point we are only guaranteed convergence in an even smaller disk. Something seems amiss here.

Let’s look closely at the power series expansion about a given point z_0:

\displaystyle f(z)=\sum\limits_{n=0}^\infty a_n(z-z_0)^n

converging for |z-z_0|<r. We know that this function has a derivative, which again has a power series expansion about z_0:

\displaystyle f'(z)=\sum\limits_{n=1}^\infty na_n(z-z_0)^{n-1}

converging in the same radius. And so on, we find arbitrarily high derivatives

\displaystyle f^{(k)}(z)=\sum\limits_{n=k}^\infty\frac{n!}{(n-k)!}a_n(z-z_0)^{n-k}

Now, we can specialize this by evaluating at the central point to find f^{(k)}(z_0)=k!a_k. That is, we have the formula for the power series coefficients:

\displaystyle a_k=\frac{f^{(k)}(z_0)}{k!}

This formula specifies the sequence of coefficients of a power series expansion of a function about a point uniquely in terms of the derivatives of the function at that point. That is, there is at most one power series expansion of any function about a given point.
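Here’s a tiny sketch of that formula in action (purely illustrative): differentiating a truncated series formally k times and evaluating at the center returns k!a_k, so dividing by k! recovers the coefficient we started with.

```python
# Illustrative: formal differentiation recovers the coefficients via a_k = f^(k)(z_0)/k!.
import math

def derive(coeffs):
    """Coefficients of the formal derivative: a_n (z-z_0)^n becomes n a_n (z-z_0)^{n-1}."""
    return [n * c for n, c in enumerate(coeffs)][1:]

a = [3.0, -1.0, 0.5, 2.0, -4.0]          # arbitrary illustrative coefficients
for k in range(len(a)):
    d = a
    for _ in range(k):
        d = derive(d)
    # evaluating at z = z_0 keeps only the constant term, which is f^(k)(z_0) = k! a_k
    print(k, d[0] / math.factorial(k))   # prints a_k back out
```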

So in our first example, of moving away from a point and back, the resulting series has the same coefficients we started with. Thus even though we were only assured that the series would converge in a much smaller disk, it actually converges in a larger disk than our formula guaranteed. In fact, this happens a lot: moving from one point to another we actually break new ground and “analytically continue” the function to a larger domain.

That is, we now have two overlapping disks, and each one contains points the other misses. Each disk has a power series expansion of a function. These expansions agree on the parts of the disks that overlap, so it doesn’t matter which rule we use to compute the function in that region. We thus have expanded the domain of our function by choosing different points about which to expand a power series.

September 18, 2008 Posted by | Analysis, Calculus, Power Series | 1 Comment

Derivatives of Power Series

The uniform convergence of a power series establishes that the function it represents must be continuous. Not only that, but it turns out that the limiting function must be differentiable.

A side note here: we define the derivative of a complex function by exactly the same limit of a difference quotient as before. There’s a lot to be said about derivatives of complex functions, but we’ll set the rest aside until later.

Now, to be specific: if the power series \sum\limits_{n=0}^\infty a_n(z-z_0)^n converges for |z-z_0|<r to a function f(z), then f has a derivative f', which itself has a power series expansion

\displaystyle f'(z)=\sum\limits_{n=1}^\infty na_n(z-z_0)^{n-1}

which converges within the same radius r.

Given a point z_1 within r of z_0, we can expand f as a power series about z_1:

\displaystyle f(z)=\sum\limits_{k=0}^\infty b_k(z-z_1)^k

convergent within some radius R of z_1. Then for z in this smaller disk of convergence we have

\displaystyle\frac{f(z)-f(z_1)}{z-z_1}=b_1+\sum\limits_{k=1}^\infty b_{k+1}(z-z_1)^k

by manipulations we know to work for series. The series on the right converges to a continuous function, each term of which vanishes at z=z_1; continuity then tells us that this sum goes to {0} as z approaches z_1, so the difference quotient approaches b_1. Thus f'(z_1) exists and equals b_1. But our formula for b_1 tells us

\displaystyle f'(z_1)=b_1=\sum\limits_{n=1}^\infty\binom{n}{1}a_n(z_1-z_0)^{n-1}=\sum\limits_{n=1}^\infty na_n(z_1-z_0)^{n-1}

Finally, we can apply the root test again. The n-th roots we need are now \sqrt[n]{n}\sqrt[n]{|a_n|}. Since the first factor goes to {1}, the limit superior is the same as for the original series for f: \frac{1}{r}. Thus the derived series has the same radius of convergence.

Notice now that we can apply the exact same reasoning to f'(z), and find that it has a derivative f''(z), which has a power series expansion

\displaystyle f''(z)=\sum\limits_{n=2}^\infty n(n-1)a_n(z-z_0)^{n-2}

which again converges within the same radius. And so on, we determine that the limiting function of the power series has derivatives of arbitrarily large orders.
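A quick numerical illustration (mine, not the post’s): for f(z)=\frac{1}{1-z}=\sum z^n, evaluating a truncation of the derived series \sum nz^{n-1} at a point inside the disk lands right on f'(z)=\frac{1}{(1-z)^2}.

```python
# Illustrative check that the term-by-term derivative of sum z^n converges to 1/(1-z)^2.

def eval_series(coeffs, z):
    """Evaluate the truncated power series sum coeffs[n] * z^n."""
    return sum(c * z ** n for n, c in enumerate(coeffs))

N = 60
a = [1.0] * N                                # coefficients of 1/(1-z)
derived = [n * a[n] for n in range(1, N)]    # coefficients n a_n, read as a series in z^{n-1}
z = 0.4 + 0.2j                               # well inside the radius of convergence 1
print(eval_series(derived, z))               # ~ 1/(1-z)^2
print(1.0 / (1.0 - z) ** 2)
```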

September 17, 2008 Posted by | Analysis, Calculus, Power Series | 3 Comments

Translating Power Series

So we know that we can have two power series expansions of the same function about different points. How are they related? An important step in this direction is given by the following theorem.

Suppose that the power series \sum\limits_{n=0}^\infty a_n(z-z_0)^n converges for |z-z_0|<r, and that it represents the function f(z) in some open subset S of this disk. Then for every point z_1\in S there is some open disk around z_1 of radius R contained in S, in which f has a power series expansion

\displaystyle f(z)=\sum\limits_{k=0}^\infty b_k(z-z_1)^k

where

\displaystyle b_k=\sum\limits_{n=k}^\infty\binom{n}{k}a_n(z_1-z_0)^{n-k}

The proof is almost straightforward. We expand

\displaystyle\begin{aligned}f(z)=\sum\limits_{n=0}^\infty a_n(z-z_0)^n=\sum\limits_{n=0}^\infty a_n(z-z_1+z_1-z_0)^n\\=\sum\limits_{n=0}^\infty a_n\sum_{k=0}^n\binom{n}{k}(z-z_1)^k(z_1-z_0)^{n-k}\end{aligned}

Now we need to interchange the order of summation. Strictly speaking, we haven’t established a condition that will allow us to make this move. However, I hope you’ll find it plausible that if this double series converges absolutely, we can adjust the order of summations freely. Indeed, we’ve seen examples of other rearrangements that all go through as soon as the convergence is absolute.

Now we consider the absolute values

\displaystyle\begin{aligned}\sum_{n=0}^\infty|a_n|\sum\limits_{k=0}^n\binom{n}{k}|z-z_1|^k|z_1-z_0|^{n-k}=\sum_{n=0}^\infty|a_n|(|z-z_1|+|z_1-z_0|)^n\\=\sum_{n=0}^\infty|a_n|(z_2-z_0)^n\end{aligned}

Where we set z_2=z_0+|z-z_1|+|z_1-z_0|. But then |z_2-z_0|<R+|z_1-z_0|\leq r, where the last inequality holds because the disk around z_1 of radius R fits within S, which fits within the disk of radius r around z_0. And so this series of absolute values must converge, and we’ll take it on faith for the moment (to be shored up when we attack double series more thoroughly) that we can now interchange the order of summations.

\displaystyle f(z)=\sum\limits_{k=0}^\infty\left(\sum_{n=k}^\infty a_n\binom{n}{k}(z_1-z_0)^{n-k}\right)(z-z_1)^k

This result allows us to recenter our power series expansions, but it only assures that the resulting series will converge in a disk which is contained within the original disk of convergence, so we haven’t necessarily gotten anything new. Yet.
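Here is a small illustrative computation of the recentering formula (truncated, with names of my own choosing): starting from the geometric series about {0} and recentering at z_1=-\frac{1}{2} reproduces the coefficients \left(\frac{2}{3}\right)^{k+1} we’d expect for \frac{1}{1-z}.

```python
# Illustrative, truncated computation of b_k = sum_{n>=k} C(n,k) a_n (z_1 - z_0)^{n-k}.
from math import comb

def recenter(a, shift):
    """Coefficients about z_1 = z_0 + shift, truncated to len(a) terms."""
    N = len(a)
    return [sum(comb(n, k) * a[n] * shift ** (n - k) for n in range(k, N))
            for k in range(N)]

a = [1.0] * 80                 # 1/(1-z) = sum z^n, expanded about 0
b = recenter(a, -0.5)          # re-expand about z_1 = -1/2
print(b[:4])                   # ~ [2/3, (2/3)^2, (2/3)^3, (2/3)^4]

z = -0.3                       # a point near the new center
print(sum(bk * (z + 0.5) ** k for k, bk in enumerate(b)), 1.0 / (1.0 - z))
```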

September 16, 2008 Posted by | Analysis, Calculus, Power Series | 5 Comments

Power Series Expansions

Up to this point we’ve been talking about power series like \sum\limits_{n=0}^\infty c_nz^n, where “power” refers to powers of z. This led us to show that when we evaluate a power series, the result converges in a disk centered at {0}. But what’s so special about zero?

Indeed, we could just as well write a series like \sum\limits_{n=0}^\infty c_n(z-z_0)^n for any point z_0. The result is just like picking up our original power series and carrying it over a bit. In particular, it still converges — and within the same radius — but now in a disk centered at z_0.

So when we have an equation like f=\sum\limits_{n=0}^\infty c_n(z-z_0)^n, where the given series converges within the radius R, we say that the series “represents” f in the disk of convergence. Alternately, we call the series itself a “power series expansion” of f about z_0.

For example, consider the series \sum\limits_{n=0}^\infty\left(\frac{2}{3}\right)^{n+1}\left(z+\frac{1}{2}\right)^n. A simple application of the root test tells us that this series converges in the disk \left|z+\frac{1}{2}\right|<\frac{3}{2}, of radius \frac{3}{2} about the point z_0=-\frac{1}{2}. Some algebra shows us that if we multiply this series by 1-z=\frac{3}{2}-\left(z+\frac{1}{2}\right) we get {1}. Thus the series is a power series expansion of \frac{1}{1-z} about z_0=-\frac{1}{2}.
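If you want to see this numerically (an illustration of my own, not part of the argument), summing a truncation of the series at a few points inside the disk \left|z+\frac{1}{2}\right|<\frac{3}{2} matches \frac{1}{1-z} closely:

```python
# Illustrative: the series sum (2/3)^{n+1} (z + 1/2)^n evaluated inside its disk.

def series_value(z, terms=500):
    return sum((2.0 / 3.0) ** (n + 1) * (z + 0.5) ** n for n in range(terms))

for z in [0.0, 0.9, -1.8, 0.5 + 0.7j]:       # all satisfy |z + 1/2| < 3/2
    print(z, series_value(z), 1.0 / (1.0 - z))
```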

This new power series expansion actually subsumes the old one, since every point within {1} of {0} is also within \frac{3}{2} of -\frac{1}{2}. But sometimes disks overlap only partly. Then each expansion describes the behavior of the function at values of z that the other one cannot. And of course no power series expansion can describe what happens at a singularity of the function, like the point z=1 here.

September 15, 2008 Posted by | Analysis, Calculus, Power Series | 6 Comments

Convergence of Power Series

Now that we’ve imported a few rules about convergent series of complex numbers we can talk about when the series we get from evaluating power series converge or not.

We’ll just consider our power series to be in \mathbb{C}[[X]], because if we have a real power series we can consider each coefficient as a complex number instead. Now we take a complex number z and try to evaluate the power series \sum\limits_{k=0}^\infty c_kX^k at this point. We get a series of complex numbers

\displaystyle\sum\limits_{k=0}^\infty c_kz^k=\lim\limits_{n\rightarrow\infty}\sum\limits_{k=0}^nc_kz^k

by evaluating each polynomial truncation of the power series at z and taking the limit of the sequence. For some z this series may converge and for others it may not. The amazing fact is, though, that we can always draw a circle in the complex plane — |z|=R — within which the series always converges absolutely, and outside of which it always diverges. We’ll say nothing in general about whether it converges on the circle, though.

The tool here is the root test. We take the nth root of the size of the nth term in the series to find \sqrt[n]{|c_nz^n|}=|z|\sqrt[n]{|c_n|}. Then we can pull the z-dependence completely out of the limit superior to write \rho=|z|\limsup\limits_{n\rightarrow\infty}\sqrt[n]{|c_n|}. The root test tells us that if this is less than {1} the series will converge absolutely, while if it’s greater than {1} the series will diverge.

So let’s define \lambda=\limsup\limits_{n\rightarrow\infty}\sqrt[n]{|c_n|}. The root test now says that if |z|<\frac{1}{\lambda} we have absolute convergence, while if |z|>\frac{1}{\lambda} the series diverges. Thus \frac{1}{\lambda} is the radius of convergence that we seek.
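A quick numerical sketch (illustrative only): computing \sqrt[n]{|c_n|} for a moderately large n shows the limit superior, and hence the radius, emerging for a few familiar coefficient sequences.

```python
# Illustrative: |c_n|^{1/n} approaches 1/R for several example series.
import math

examples = {
    "c_n = 1 (geometric series, R = 1)":        lambda n: 1.0,
    "c_n = (2/3)^(n+1) (R = 3/2)":              lambda n: (2.0 / 3.0) ** (n + 1),
    "c_n = 1/n! (exponential series, R = oo)":  lambda n: 1.0 / math.factorial(n),
}

n = 60
for name, c in examples.items():
    print(name, abs(c(n)) ** (1.0 / n))   # roughly 1, 2/3, and (slowly) 0
```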

Now there are examples of series with all sorts of behavior on the boundary of this disk. The series \sum\limits_{k=0}^\infty z^k has radius of convergence {1} (as we can tell from the above procedure), but it doesn’t converge anywhere on the boundary circle. On the other hand, the series \sum\limits_{k=1}^\infty\frac{1}{k^2}z^k has the same radius of convergence, but it converges everywhere on the boundary circle. And, just to be perverse, the series \sum\limits_{k=1}^\infty\frac{1}{k}z^k has the same radius of convergence but converges everywhere on the boundary but the single point z=1.

August 29, 2008 Posted by | Analysis, Calculus, Power Series | 6 Comments
