Note: Exponential Differentiation

Steve Trettel
Here’s the calculation showing that if you define an exponential as a merely continuous solution to $E(x+y)=E(x)E(y)$, then it is in fact differentiable. Even better, we can explicitly compute the derivative, and doing so inductively proves that exponentials are smooth.
An exponential function is a continuous nonconstant function $E\colon\mathbb{R}\to\mathbb{R}$ satisfying the law of exponents
$$E(x+y)=E(x)E(y)$$for all $x,y\in\mathbb{R}$.
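The familiar exponentials are examples: for any base $b>0$ with $b\neq 1$ (so the function is nonconstant), $E(x)=b^x$ is continuous and obeys
$$b^{x+y}=b^x\,b^y$$for all real $x,y$.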
This definition is a functional characterization of exponentials: we don’t specify a formula for computing the function, but instead specify how the function ought to behave. There are several facts about exponentials we will need, all of which follow directly from this definition:
- $E(0)=1$
- Exponentials are nonzero.
- Exponentials are monotone.
- Exponentials are convex.
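To give a taste of how such facts fall out of the functional equation alone, here is a sketch of the first two: setting $x=y=0$ gives $E(0)=E(0)^2$, so $E(0)$ is $0$ or $1$; if $E(0)=0$ then $E(x)=E(x+0)=E(x)E(0)=0$ for every $x$, contradicting nonconstancy, so $E(0)=1$. The same trick rules out a zero anywhere: if $E(a)=0$ then $E(x)=E((x-a)+a)=E(x-a)E(a)=0$ for all $x$.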
These are proven in a previous note introducing the functional characterization. We also need two general theorems from analysis about such functions:
- The difference quotients of a convex function are monotone
- Left- and right-hand limits of monotone functions exist
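Here is the precise form in which these two facts get used below: if $f$ is convex and $a$ is fixed, the difference quotient $h\mapsto\frac{f(a+h)-f(a)}{h}$ is nondecreasing on $h\neq 0$; and if $g$ is a nondecreasing function, then
$$\lim_{x\to a^-}g(x)=\sup_{x<a}g(x),\qquad \lim_{x\to a^+}g(x)=\inf_{x>a}g(x),$$with each one-sided limit a genuine real number whenever the corresponding sup or inf is finite.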
We do not prove these, but they can be found in many undergraduate analysis texts. Alright, let’s get on with it!
Let $E(x)$ be an exponential function. Then $E$ is differentiable on the entire real line, and
$$E^\prime(x) = E^\prime(0)E(x)$$First we show that this formula holds so long as $E$ is actually differentiable at zero. Thus, differentiability at a single point is enough to ensure differentiability everywhere and fully determine the formula!
Let $x\in\mathbb{R}$, and let $h_n\to 0$ be any sequence of nonzero numbers. Then we compute $E^\prime(x)$ by the following limit:
$$E^\prime(x)=\lim \frac{E(x+h_n)-E(x)}{h_n}$$Using the property of exponentials and the limit laws, we can factor an $E(x)$ out of the entire numerator:
$$=\lim \frac{E(x)E(h_n)-E(x)}{h_n}=E(x)\lim \frac{E(h_n)-1}{h_n}$$But $E(0)=1$, so the limit here is exactly the *derivative of $E$ at zero*, which exists by assumption!
$$E^\prime(x)=E(x)E^\prime(0)$$Next, we tackle the slightly more subtle problem of showing that $E$ is in fact differentiable at zero. This is tricky because all we have assumed is that $E$ is continuous and satisfies the law of exponents: how are we going to pull differentiability out of this? The trick has two parts: (1) show the right- and left-hand limits defining the derivative exist, and (2) show they are equal.
STEP 1: Show that the left- and right-hand limits defining the derivative at zero exist: $E$ is convex, so the difference quotient $\frac{E(h)-1}{h}$ is monotone increasing in $h$, and therefore the limit $\lim_{h\to 0^-}$ exists (as a sup) and $\lim_{h\to 0^+}$ exists (as an inf).
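Spelled out (a small step worth recording): write $\phi(h)=\frac{E(h)-1}{h}$ for $h\neq 0$. Monotonicity gives $\phi(h)\le\phi(1)$ for all $h<0$ and $\phi(h)\ge\phi(-1)$ for all $h>0$, so
$$\lim_{h\to 0^-}\phi(h)=\sup_{h<0}\phi(h)\le\phi(1)<\infty,\qquad \lim_{h\to 0^+}\phi(h)=\inf_{h>0}\phi(h)\ge\phi(-1)>-\infty,$$and both one-sided limits are finite real numbers.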
STEP 2: Now that we know each of these limits exists, let’s show they are equal using the definition:
To compute the left-hand limit, we can choose any sequence approaching $0$ from below: if $h_n$ is a positive sequence with $h_n\to 0$, then $-h_n$ will do:
$$\lim_{h\to 0^-}\frac{E(h)-1}{h}=\lim \frac{E(-h_n)-1}{-h_n}$$
We can calculate $E(-h_n)$ by the law of exponents: we know $E(0)=1$ so $E(h_n-h_n)=1$, but this implies $E(h_n)E(-h_n)=1$ so $E(-h_n)=1/E(h_n)$. Thus
$$ \begin{align*} \lim \frac{E(-h_n)-1}{-h_n}&=\lim \frac{\frac{1}{E(h_n)}-1}{-h_n}\\ &=\lim\frac{1-E(h_n)}{-h_n}\frac{1}{E(h_n)}\\ &=\lim \frac{E(h_n)-1}{h_n}\frac{1}{E(h_n)} \end{align*} $$But, since $E$ is continuous (by definition) and $E(0)=1$, the limit theorems imply
$$\lim \frac{1}{E(h_n)}=\frac{1}{\lim E(h_n)}=\frac{1}{E(\lim h_n)}=\frac{1}{E(0)}=1$$Thus,
$$ \begin{align*} &\lim \left(\frac{E(h_n)-1}{h_n}\frac{1}{E(h_n)}\right)\\&= \left(\lim \frac{E(h_n)-1}{h_n}\right)\left(\lim\frac{1}{E(h_n)}\right)\\ &=\lim \frac{E(h_n)-1}{h_n}\\ \end{align*} $$But this last limit is exactly the right-hand limit $\lim_{h\to 0^+}\frac{E(h)-1}{h}$, since $h_n>0$ and $h_n\to 0$. Stringing all of this together, we finally see
$$\lim_{h\to 0^-}\frac{E(h)-1}{h}=\lim_{h\to 0^+}\frac{E(h)-1}{h}$$So both one sided limits exist and are equal, which implies the entire limit exists: $E$ is differentiable at $0$.
Consequences
This theorem tells us that the exponential functions have a remarkable property: they are their own derivatives, up to a constant multiple. This has several useful consequences:
First, because we know how to differentiate constant multiples, this lets us calculate arbitrary derivatives of exponentials:
$$E^{\prime\prime}(x)=\left[E^\prime(x)\right]^\prime=\left[kE(x)\right]^\prime = k[E(x)]^\prime = k^2 E(x)$$Continuing this inductively, we get derivatives of every order, $E^{(n)}(x)=k^n E(x)$, where we have written $k$ for the value of $E^\prime(0)$. In particular, exponentials are smooth.
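As a concrete sanity check (borrowing the familiar calculus fact $\frac{d}{dx}b^x=(\ln b)\,b^x$ purely for illustration, not as an ingredient of the argument): the exponential $E(x)=b^x$ has $k=E^\prime(0)=\ln b$, and indeed
$$E^{(n)}(x)=(\ln b)^n\,b^x=k^n E(x).$$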
Next we see that the functional equation alone does not constrain the possible values of $k$: starting from any one exponential, we can build one whose constant is any nonzero $k$ we like.
Let $E$ be an exponential and $\ell(x)=cx$ a linear function with $c\neq 0$. Since $\ell$ is continuous and $\ell(x+y)=c(x+y)=cx+cy=\ell(x)+\ell(y)$, the composition $E\circ \ell$ is a continuous nonconstant solution to the Law of Exponents:
$$(E\circ\ell)(x+y)=E(\ell(x+y))=E(\ell(x)+\ell(y))=E(\ell(x))E(\ell(y))=(E\circ \ell)(x)(E\circ\ell)(y)$$Now let $E$ be an exponential function and $k$ nonzero. Note that $E^\prime(0)\neq 0$: otherwise $E^\prime(x)=E^\prime(0)E(x)$ would vanish identically and $E$ would be constant. So we may define $\ell(x)=\frac{k}{E^\prime(0)}x$, and the composition $\mathcal{E}=E\circ \ell$ is an exponential by our previous work. Differentiating with the chain rule yields
$$ \begin{align*} \mathcal{E}^\prime(x)&=(E\circ \ell)^\prime(x)\\ &=E^\prime(\ell(x))\ell^\prime(x)\\ &= E^\prime(0)E(\ell(x)) \frac{k}{E^\prime(0)}\\ &= k E(\ell(x))\\ &= k\mathcal{E}(x) \end{align*} $$So $\mathcal{E}$ is an exponential whose derivative is $k$ times itself, as required.
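To make this concrete (again borrowing $\frac{d}{dx}2^x=(\ln 2)\,2^x$ from standard calculus, only as an illustration): starting from $E(x)=2^x$ and aiming for a given $k\neq 0$, take $\ell(x)=\frac{k}{\ln 2}x$, so that
$$\mathcal{E}(x)=2^{kx/\ln 2},\qquad \mathcal{E}^\prime(x)=\frac{k}{\ln 2}(\ln 2)\,2^{kx/\ln 2}=k\,\mathcal{E}(x).$$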
While the functional equation alone did not provide us any means of distinguishing between different exponential functions, differentiation selects a single best, or simplest, exponential out of the lot: the one where that constant multiple is just $1$! So long as there is any exponential at all, this one must also exist, by the above. So we give it a name: write $\exp$ for the exponential with $\exp^\prime(0)=1$, so that $\exp^\prime=\exp$ and $\exp(0)=1$.
Application: Uniqueness of Solutions to $y^\prime =y$
Above we’ve seen that all exponential functions are differentiable and satisfy $E^\prime(x)=kE(x)$ for some nonzero $k$. But we can go beyond this, and actually show that every solution to this differential equation with $E(0)=1$ is an exponential. Of course, this would follow trivially if we had the existence and uniqueness theorem for first order ODEs handy, but one doesn’t usually have that by this point in an analysis course, so I’ll just record the independent argument here, specialized to the case of $\exp(x)$.
If any exponential exists, we know that $\exp$ exists by the above. Let $f$ be any solution to our differential equation, so $f^\prime(x)=f(x)$ and $f(0)=1$. The trick is to consider the function $g(x)=f(x)/\exp(x)$ (which makes sense, since exponentials are never zero) and show it is constant. Taking the derivative,
$$g^\prime(x)=\left(\frac{f(x)}{\exp(x)}\right)^\prime =\frac{f^\prime(x)\exp(x)-f(x)\exp^\prime(x)}{\exp(x)^2}$$Using the fact that $\exp^\prime =\exp$ from above, and $f^\prime = f$ by assumption, the numerator is zero:
$$f^\prime(x)\exp(x)-f(x)\exp^\prime(x)=f(x)\exp(x)-f(x)\exp(x)=0$$Thus $g^\prime(x)=0$ for all $x$, and $g$ is constant. To find the value of $g$ we need only evaluate it at a point:
$$g(0)=\frac{f(0)}{\exp(0)}=\frac{1}{1}=1$$Thus $f(x)/\exp(x)=1$ for all $x$, so
$$f(x)=\exp(x)$$