# The Unapologetic Mathematician

## Antiderivatives

One of the consequences of the mean value theorem we worked out was that two differentiable functions $f$ and $g$ on an interval $(a,b)$ differ by a constant if and only if their derivatives are the same: $f'(x)=g'(x)$ for all $x\in(a,b)$. Now let’s turn this around the other way.

We start with a function $f$ on an interval $(a,b)$ and define an “antiderivative” of $f$ to be a function $F$ on the same interval such that $F'(x)=f(x)$ for $x\in(a,b)$. What the above conclusion from the mean value theorem shows us is that there’s only one way any two solutions can differ. That is, if $F$ is some particular antiderivative of $f$, then any other antiderivative $G$ satisfies $G(x)=F(x)+C$ for some real constant $C$. So the hard bit about antiderivatives is all in finding a particular one, since the general solution to an antidifferentiation problem just involves adding an arbitrary constant, corresponding to the constant we lose when we differentiate.
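For instance (a toy pair of my own choosing, not from the theorem itself), both of these functions are antiderivatives of $f(x)=2x$ on any interval, and they differ by a constant:

```latex
F(x) = x^2, \qquad G(x) = x^2 + 5, \qquad F'(x) = G'(x) = 2x, \qquad G(x) = F(x) + 5
```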

Some antiderivatives we can pull out right away. We know that if $F(x)=x^n$ then $F'(x)=nx^{n-1}$. Thus, turning this around, we find that an antiderivative of $f(x)=x^n$ is $F(x)=\frac{x^{n+1}}{n+1}$, except when $n=-1$, because then we’d have to divide by zero. We’ll figure out what to do with this exception later.
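As a quick numerical sanity check on the power rule (a sketch of my own, not from the post; the function names and the central-difference approximation are just illustrative):

```python
def F(x, n):
    # candidate antiderivative of x**n, valid only for n != -1
    return x**(n + 1) / (n + 1)

def numerical_derivative(f, x, h=1e-6):
    # central-difference approximation to f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

# F'(x) should agree with x**n at a few sample points
for n in [0, 1, 2, 3, 5]:
    for x in [0.5, 1.0, 2.0]:
        approx = numerical_derivative(lambda t: F(t, n), x)
        assert abs(approx - x**n) < 1e-4
```

Of course this only spot-checks the identity numerically; the real justification is differentiating $\frac{x^{n+1}}{n+1}$ symbolically.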

We can also turn around some differentiation rules. For instance, since $\frac{d}{dx}\left[f(x)+g(x)\right]=f'(x)+g'(x)$, if $F$ is an antiderivative of a function $f$ and $G$ is an antiderivative of a function $g$, then $F+G$ is an antiderivative of $f+g$. Similarly, the differentiation rule for a constant multiple tells us that $cF$ is an antiderivative of $cf$ for any real constant $c$.

Between these we can handle antidifferentiation of any polynomial $P(x)$. Each term of the polynomial is some constant times a power of $x$, so the constant multiple rule and the rule for powers of $x$ give us an antiderivative for each term. Then we can just add these antiderivatives all together. We also only need one arbitrary constant, since we can add together the constants for each term to get one overall constant for the whole polynomial.
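Putting the three rules together, polynomial antidifferentiation is purely mechanical. Here is a minimal sketch in Python (the coefficient-list representation and function names are my own, not anything from the post):

```python
def antiderivative(coeffs):
    """Given [a0, a1, a2, ...] representing a0 + a1*x + a2*x**2 + ...,
    return coefficients of the antiderivative with constant term C = 0:
    each term a_k * x**k becomes (a_k / (k + 1)) * x**(k + 1)."""
    return [0.0] + [a / (k + 1) for k, a in enumerate(coeffs)]

def evaluate(coeffs, x):
    # Horner's method for evaluating the polynomial at x
    result = 0.0
    for a in reversed(coeffs):
        result = result * x + a
    return result

# P(x) = 3 + 2x + x^2 has antiderivative 3x + x^2 + x^3/3 (plus any constant)
print(antiderivative([3, 2, 1]))
```

Any other antiderivative of the same polynomial just shifts the leading (constant) coefficient by the arbitrary constant $C$.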

January 28, 2008 Posted by | Analysis, Calculus | 5 Comments

## Higher-Dimensional Linking Integrals

There’s a new paper out on the arXiv discussing higher-dimensional linking integrals, by two graduate students at the University of Pennsylvania. I don’t have time to really go through it right now, but at a first scan I’m really not sure what they’ve done here. It seems they’re just taking the regular Gauss integral and doing the exact same thing for higher-dimensional spheres, although in a way that’s so loaded down with notation that it obscures the fact that it’s the exact same idea.

Some people like results that are more computationally focused, and some (like me) prefer to lay bare the structure of the concepts, and derive a computational framework later. It may be that these authors are just more the former than the latter. Anyhow, I’m not certain how original it is, but my own work is shot through with “wait, you mean nobody’s written that up yet?” If they’ve found one of these obvious niches that nobody has gotten around to mining, more power to them.

January 28, 2008 Posted by | Knot theory | 13 Comments