Antidifferentiation

Recall that an antiderivative of a function $f$ is just some function $F$ whose derivative equals $f$. As a consequence of the fundamental theorem of calculus, we will be deeply invested in learning how to find such antiderivatives, and a good spot to start is by looking back at what we know from differential calculus, and seeing which rules or techniques are easy to “undo”.

Differential calculus is built by learning the derivatives of a small number of specific functions (powers, exponentials, logs, and trigonometric functions), and then learning ways to differentiate combinations of these (sums, differences, constant multiples, products, quotients, and compositions).

Elementary Antiderivatives

One means of getting our hands on some antidifferentiation techniques is simply to “read a differentiation rule in reverse”! For a first example: since we know the derivative of $x^2$ is $2x$, this immediately implies that an antiderivative of $2x$ is $x^2$ (recall, of course, that there are others; the indefinite integral of $2x$ is $x^2+C$). Dividing both functions involved here by two, it’s natural to suggest that an antiderivative of $x$ is $\frac{1}{2}x^2$: and indeed, this can be directly checked by differentiation: $$\left(\frac{1}{2}x^2\right)^\prime =x\hspace{0.5cm}\implies\hspace{0.5cm} \int x dx=\frac{1}{2}x^2+C$$

We can carry out this reasoning in greater generality: since the derivative of $x^n$ is $nx^{n-1}$, we see that an antiderivative of $x^{n-1}$ is $\frac{1}{n}x^n$, which is more commonly written as below, with all the indices shifted up by one:

$$\int x^n dx =\frac{1}{n+1}x^{n+1}+C$$
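As a quick sanity check (an aside, not part of the main development), we can hand this formula to a computer algebra system: differentiating the claimed antiderivative symbolically should recover $x^n$. Here is a sketch using the `sympy` library:

```python
import sympy as sp

x, n = sp.symbols("x n")

# Claimed antiderivative of x**n (valid only when n != -1)
F = x**(n + 1) / (n + 1)

# Differentiating should recover x**n exactly
assert sp.simplify(sp.diff(F, x) - x**n) == 0
```

Note that the check is exactly the definition of an antiderivative: we differentiate $F$ and compare against the integrand, rather than trusting the integral formula directly.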

This formula makes sense for all values of $n$ except $n=-1$, where the right hand side suggests we should do the unthinkable, and divide by zero! We should take this as a sign that we probably need to treat this case, $x^{-1}=1/x$, separately, and step back to think about it from first principles. An antiderivative of $1/x$ is just some function $F$ where $F'(x)=1/x$, and we know from differential calculus that the natural log (of the absolute value of $x$) satisfies this exactly! Thus,

$$\int\frac{1}{x}dx=\ln|x|+C$$
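The absolute value can look mysterious at first, so it is worth verifying that $\ln|x|$ really has derivative $1/x$ on *both* halves of the domain. A sketch using `sympy` (an aside, not a required tool), splitting into the cases $x>0$ where $|x|=x$ and $x<0$ where $|x|=-x$:

```python
import sympy as sp

xp = sp.symbols("x_p", positive=True)   # plays the role of x > 0
xn = sp.symbols("x_n", negative=True)   # plays the role of x < 0

# For x > 0: ln|x| = ln(x), with derivative 1/x
assert sp.simplify(sp.diff(sp.log(xp), xp) - 1/xp) == 0

# For x < 0: ln|x| = ln(-x); the chain rule gives (-1)/(-x) = 1/x
assert sp.simplify(sp.diff(sp.log(-xn), xn) - 1/xn) == 0
```

So the single formula $\ln|x|+C$ covers both branches at once, which is exactly why it is the standard way to state this antiderivative.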

Moving on, we may use similar reasoning to uncover the antiderivatives of familiar exponential and trigonometric functions. First off, since $e^x$ is its own derivative, it is also its own antiderivative: $$\int e^x dx= e^x+C$$

And since the differentiation rule for a general exponential with base $a$ leads to multiplication by $a$’s natural log, $(a^x)^\prime = a^x\ln(a)$, the antidifferentiation rule for $a^x$ involves division by $\ln(a)$: $$\int a^xdx = \frac{1}{\ln a}a^x+C$$
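Again the claim is easy to check by differentiating: the factor of $\ln(a)$ produced by the chain rule cancels the $\frac{1}{\ln a}$ out front. A quick symbolic confirmation with `sympy` (an aside):

```python
import sympy as sp

x = sp.symbols("x")
a = sp.symbols("a", positive=True)   # base of the exponential, a > 0

F = a**x / sp.log(a)                 # claimed antiderivative of a**x

# Differentiating gives (a**x * ln a) / ln a = a**x
assert sp.simplify(sp.diff(F, x) - a**x) == 0
```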

Finally, since the derivative of sine is cosine, we have $$\int \cos x dx = \sin x+C$$ and $(\cos x)^\prime = -\sin x$ implies, as one can easily check, that $$\int\sin xdx = -\cos x +C$$
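Watch the signs here: it is the antiderivative of $\sin$, not of $\cos$, that picks up a minus. Differentiating both claimed antiderivatives makes this concrete (another optional `sympy` aside):

```python
import sympy as sp

x = sp.symbols("x")

# (sin x)' = cos x, so sin x antidifferentiates cos x
assert sp.diff(sp.sin(x), x) == sp.cos(x)

# (-cos x)' = -(-sin x) = sin x, so -cos x antidifferentiates sin x
assert sp.diff(-sp.cos(x), x) == sp.sin(x)
```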

All other trigonometric integration formulas are direct consequences of these, so there’s no need to memorize them now: we will be able to derive them all from basic principles in short order! This is, in some sense, the end of our list of elementary antiderivatives: the basic building blocks from which all others will be built.

Antiderivatives of Sums and Constant Multiples

In differential calculus, there is a very simple rule for how to differentiate a constant multiple of a function: if $f(x)$ is an arbitrary function and $k$ is any real number, then $$(kf(x))^\prime = k f^\prime(x)$$ That is, we can ‘pull the constant out’ of the derivative, and work only on differentiating $f$, throwing $k$ back in at the end. (This is easy to justify to yourself if you write down the limit definition of the derivative: you can literally pull the $k$ right out of the numerator of the limit!)

This suggests an analogous rule for antidifferentiation: if $f$ is any function and $k$ is a constant, then $kF$ is an antiderivative of $kf$ whenever $F$ is an antiderivative of $f$. This rule is particularly easy to remember when written as an equality of indefinite integrals: $$\int kf dx= k\int fdx$$

Similarly, there is a very nice rule for differentiating sums of functions: you can just differentiate each term of the sum separately! In symbols, $$(f+g)^\prime = f^\prime + g^\prime$$ This also comes with a twin antidifferentiation rule: $$\int (f+g) dx = \int f dx+\int g dx$$
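Together these two rules say we can integrate any sum of constant multiples term by term. For instance, $\int (3x^2 + 5\cos x)\,dx = 3\int x^2 dx + 5\int \cos x dx = x^3 + 5\sin x + C$, which we can double-check symbolically (an optional `sympy` aside):

```python
import sympy as sp

x = sp.symbols("x")

# Integrate term by term: 3∫x² dx + 5∫cos x dx = x³ + 5 sin x (+ C)
f = 3*x**2 + 5*sp.cos(x)
F = sp.integrate(f, x)

# sympy omits the +C; compare against our hand computation
assert sp.simplify(F - (x**3 + 5*sp.sin(x))) == 0
```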

The other two famous rules for differentiation are the chain rule and the product/quotient rule, for differentiating compositions and products of functions. These too have antidifferentiation analogs (called $u$-substitution and integration by parts), but their application is more involved, so each will be treated in its own lesson in due time.

Examples:

Putting all these basic rules together makes it possible already to antidifferentiate some complicated-looking expressions! Here are a couple of examples to try:

$$f(x)=x^\pi+\pi^x$$

$$f(x)=2x^3-3x^2-\frac{\cos(x)}{6}$$

$$f(x)=\ln(x^4)$$
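Once you have a candidate antiderivative for any of these, remember that checking it never requires integration: just differentiate and compare. Here is that check carried out (in `sympy`, as an aside) for one possible answer to the second example; the other two can be checked the same way.

```python
import sympy as sp

x = sp.symbols("x")

# Second example: f(x) = 2x³ - 3x² - cos(x)/6
f = 2*x**3 - 3*x**2 - sp.cos(x)/6

# One candidate antiderivative, found term by term with the rules above
F = sp.Rational(1, 2)*x**4 - x**3 - sp.sin(x)/6

# The acid test: F' must equal f exactly
assert sp.simplify(sp.diff(F, x) - f) == 0
```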