Derivatives of Logarithmic and Exponential Functions

In this note, we study derivatives of logarithmic and exponential functions.

Derivatives of Logarithmic Functions

First recall that \begin{equation}\label{eq:euler}\lim_{t\to 0}(1+t)^{\frac{1}{t}}=e\end{equation} Using the definition of the derivative and the substitution t=\frac{h}{x} (note that t\to 0 as h\to 0), we compute \begin{align*}\frac{d}{dx}\ln x&=\lim_{h\to 0}\frac{\ln(x+h)-\ln x}{h}\\&=\lim_{h\to 0}\frac{1}{h}\ln\left(\frac{x+h}{x}\right)\\&=\frac{1}{x}\lim_{h\to 0}\ln\left(1+\frac{h}{x}\right)^{\frac{x}{h}}\\&=\frac{1}{x}\lim_{t\to 0}\ln(1+t)^{\frac{1}{t}}\\&=\frac{1}{x}\end{align*} where the last equality follows from \eqref{eq:euler} and the continuity of \ln x. Hence \begin{equation}\label{eq:dln}\frac{d}{dx}\ln x=\frac{1}{x}\end{equation} Using the change of base formula \log_ax=\frac{\ln x}{\ln a}, we obtain \begin{equation}\label{eq:dlog}\frac{d}{dx}\log_ax=\frac{1}{x\ln a}\end{equation}
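The formulas \eqref{eq:dln} and \eqref{eq:dlog} can be sanity-checked numerically with a central difference quotient. The following Python sketch (the helper names are ours, not from the text) compares the quotient against \frac{1}{x} and \frac{1}{x\ln a} at a sample point.

```python
import math

# Central difference quotient: (f(x+h) - f(x-h)) / (2h) approximates f'(x).
def dln_approx(x, h=1e-6):
    return (math.log(x + h) - math.log(x - h)) / (2 * h)

def dlog10_approx(x, h=1e-6):
    return (math.log10(x + h) - math.log10(x - h)) / (2 * h)

x = 3.7
assert abs(dln_approx(x) - 1 / x) < 1e-8                       # d/dx ln x = 1/x
assert abs(dlog10_approx(x) - 1 / (x * math.log(10))) < 1e-8   # base a = 10
```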

Derivatives of Exponential Functions

We can find the derivative of the natural exponential function y=e^x using the relationship x=\ln y and implicit differentiation. Differentiating x=\ln y with respect to x, we obtain 1=\frac{1}{y}\frac{dy}{dx}, i.e., \frac{dy}{dx}=y=e^x. Hence \begin{equation}\label{eq:dnatexp}\frac{d}{dx}e^x=e^x\end{equation} Note that a^x=e^{x\ln a}, so by the chain rule we find \frac{d}{dx}a^x=\frac{d}{dx}e^{x\ln a}=e^{x\ln a}\ln a=a^x\ln a Hence \begin{equation}\label{eq:dexp}\frac{d}{dx}a^x=a^x\ln a\end{equation}

The Power Rule (General Form)

Let us consider x^n for any x>0 and any real number n. As we have seen above, x^n=e^{n\ln x}, so by the chain rule \frac{d}{dx}x^n=\frac{d}{dx}e^{n\ln x}=e^{n\ln x}\frac{n}{x}=nx^{n-1} This completes the proof of the general power rule.

Logarithmic Differentiation

The derivatives of functions involving products, quotients, and powers may be found more easily by first taking the natural logarithm of such functions before differentiating. This allows us to break a complicated function into simpler pieces using properties of the natural logarithm. This process is called logarithmic differentiation.

Example. Use logarithmic differentiation to find the derivative of y=\frac{x\sqrt{x^2+1}}{(x+1)^{\frac{2}{3}}}.

Solution. \begin{align*}\ln y&=\ln \frac{x\sqrt{x^2+1}}{(x+1)^{\frac{2}{3}}}\\&=\ln x+\frac{1}{2}\ln(x^2+1)-\frac{2}{3}\ln(x+1)\end{align*} Differentiating with respect to x, \frac{1}{y}\frac{dy}{dx}=\frac{1}{x}+\frac{x}{x^2+1}-\frac{2}{3(x+1)} Therefore, \frac{dy}{dx}=\left[\frac{1}{x}+\frac{x}{x^2+1}-\frac{2}{3(x+1)}\right]\frac{x\sqrt{x^2+1}}{(x+1)^{\frac{2}{3}}}
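As a quick numerical sanity check of this example (a sketch; the helper names are ours), the closed-form derivative can be compared against a central difference quotient at a sample point:

```python
import math

# y(x) from the example and the derivative found above.
def y(x):
    return x * math.sqrt(x**2 + 1) / (x + 1) ** (2 / 3)

def dy(x):
    return (1 / x + x / (x**2 + 1) - 2 / (3 * (x + 1))) * y(x)

x, h = 2.0, 1e-6
fd = (y(x + h) - y(x - h)) / (2 * h)   # central difference approximation
assert abs(fd - dy(x)) < 1e-6
```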

Example. Let y=x^x, x>0. Find \frac{dy}{dx}.

Solution 1. y=x^x=e^{x\ln x} and by the chain rule we obtain \frac{dy}{dx}=x^x(1+\ln x)

Solution 2. Use logarithmic differentiation. \ln y=x\ln x and differentiating this with respect to x, we have \frac{1}{y}\frac{dy}{dx}=1+\ln x Hence, \frac{dy}{dx}=x^x(1+\ln x)
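The same numerical check applies to y=x^x (a sketch, using a central difference quotient):

```python
import math

# Check d/dx x^x = x^x (1 + ln x) at x = 1.5 with a central difference.
f = lambda t: t ** t
x, h = 1.5, 1e-6
fd = (f(x + h) - f(x - h)) / (2 * h)
exact = x ** x * (1 + math.log(x))
assert abs(fd - exact) < 1e-6
```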

Alternative Approach

In the earlier approach we started out with e^x and regarded \ln x as its inverse function. It can also be done the other way around, namely we first define \ln x and regard e^x as its inverse function. The natural logarithmic function \ln x can be defined by \begin{equation}\label{eq:natlog}\ln x=\int_1^x\frac{1}{t}dt,\ x>0\end{equation} The number x that satisfies the equation \ln x=1 is denoted by e. All properties of the natural logarithm can be derived from definition \eqref{eq:natlog}. Also from definition \eqref{eq:natlog}, we obtain \eqref{eq:dln} by the Fundamental Theorem of Calculus. Using \eqref{eq:dln}, one can show that \lim_{x\to 0}(1+x)^{\frac{1}{x}}=e

Proof. Let f(x)=\ln x. Then f'(x)=\frac{1}{x} and so f'(1)=1. On the other hand, \begin{align*}f'(1)&=\lim_{x\to 0}\frac{f(1+x)-f(1)}{x}\\&=\lim_{x\to 0}\frac{\ln(1+x)}{x}\\&=\lim_{x\to 0}\ln(1+x)^{\frac{1}{x}}\\&=\ln\left[\lim_{x\to 0}(1+x)^{\frac{1}{x}}\right]\end{align*} where the last equality uses the continuity of \ln x. Since f'(1)=1, we conclude \lim_{x\to 0}(1+x)^{\frac{1}{x}}=e

Remark. By substituting y=\frac{1}{x}, e=\lim_{y\to\infty}\left(1+\frac{1}{y}\right)^y

Remark. An alternative definition of e is the infinite series e=1+\frac{1}{1!}+\frac{1}{2!}+\frac{1}{3!}+\cdots
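Both characterizations of e in the remarks above can be illustrated numerically (a sketch; the variable names are ours):

```python
import math

# e as a limit: (1 + 1/y)^y for a large y.
limit_form = (1 + 1 / 10**7) ** (10**7)
assert abs(limit_form - math.e) < 1e-6

# e as a series: 1 + 1/1! + 1/2! + ...
series_form = sum(1 / math.factorial(k) for k in range(20))
assert abs(series_form - math.e) < 1e-12
```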

Derivatives of Trigonometric Functions

In this note, we study derivatives of the trigonometric functions y=\sin x, y=\cos x, y=\sec x, y=\csc x, y=\tan x, and y=\cot x. First we calculate the derivative of y=\sin x. \begin{align*}\frac{d}{dx}\sin x&=\lim_{h\to 0}\frac{\sin(x+h)-\sin x}{h}\\&=\lim_{h\to 0}\frac{\sin x\cos h+\cos x\sin h-\sin x}{h}\\&=\lim_{h\to 0}\left[\sin x\frac{\cos h-1}{h}+\cos x\frac{\sin h}{h}\right]\end{align*} Recall that \lim_{h\to 0}\frac{\cos h -1}{h}=0 and \lim_{h\to 0}\frac{\sin h}{h}=1. Hence we obtain \begin{equation}\label{eq:dsin}\frac{d}{dx}\sin x=\cos x\end{equation} In a similar manner we can also obtain \begin{equation}\label{eq:dcos}\frac{d}{dx}\cos x=-\sin x\end{equation} Using the reciprocal rule (baby quotient rule) along with \eqref{eq:dsin} and \eqref{eq:dcos}, we find the derivatives of y=\sec x and y=\csc x as \begin{align}\label{eq:dsec}\frac{d}{dx}\sec x&=\sec x\tan x\\\label{eq:dcsc}\frac{d}{dx}\csc x&=-\csc x\cot x\end{align} Finally, using the quotient rule along with \eqref{eq:dsin} and \eqref{eq:dcos}, we find the derivatives of y=\tan x and y=\cot x as \begin{align}\label{eq:dtan}\frac{d}{dx}\tan x&=\sec^2 x\\\label{eq:dcot}\frac{d}{dx}\cot x&=-\csc^2 x\end{align}
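All six formulas can be spot-checked at one point with a central difference quotient (a Python sketch; the helper names are ours):

```python
import math

# Spot-check the six derivative formulas at x = 0.7 (illustration only).
x, h = 0.7, 1e-6
d = lambda f: (f(x + h) - f(x - h)) / (2 * h)   # central difference

sec = lambda t: 1 / math.cos(t)
csc = lambda t: 1 / math.sin(t)
cot = lambda t: 1 / math.tan(t)

assert abs(d(math.sin) - math.cos(x)) < 1e-8
assert abs(d(math.cos) + math.sin(x)) < 1e-8
assert abs(d(math.tan) - sec(x) ** 2) < 1e-6
assert abs(d(sec) - sec(x) * math.tan(x)) < 1e-6
assert abs(d(csc) + csc(x) * cot(x)) < 1e-6
assert abs(d(cot) + csc(x) ** 2) < 1e-6
```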

Alternating Series, Absolute and Conditional Convergence

The Alternating Series Test

The alternating series \sum_{k=1}^\infty (-1)^{k+1}a_k converges provided:

  1. 0<a_{k+1}\leq a_k for all k=1,2,3,\ldots, i.e., \{a_k\} is a nonincreasing sequence of positive terms.
  2. \lim_{k\to\infty}a_k=0.

Example. The alternating harmonic series \sum_{k=1}^\infty\frac{(-1)^{k+1}}{k} converges.

Example. The alternating series \sum_{k=1}^\infty(-1)^{k+1}\frac{k+1}{k} diverges because \lim_{k\to\infty}\frac{k+1}{k}=1\ne 0.

Example. The alternating series \sum_{k=2}^\infty(-1)^k\frac{\ln k}{k} converges because

  1. \left\{\frac{\ln k}{k}\right\} is decreasing for all k\geq 3. (Let f(x)=\frac{\ln x}{x}. Then f'(x)=\frac{1-\ln x}{x^2}<0 for all x>e=2.7182818284590\cdots.)
  2. \lim_{k\to\infty}\frac{\ln k}{k}=0.

Remainder in Alternating Series

Let S=\sum_{k=1}^\infty(-1)^{k+1}a_k=a_1-a_2+a_3-a_4+\cdots. Then we see that the distribution of its partial sums would be like the following figure.

From the figure, we obtain the inequality \begin{equation}\label{eq:altser}|R_n|=|S-S_n|\leq|S_{n+1}-S_n|=a_{n+1}\end{equation} The inequality \eqref{eq:altser} can serve as an estimate for the error (remainder) |R_n| whose error bound is given by a_{n+1}.

Example.

  1. How many terms of the series \ln 2=1-\frac{1}{2}+\frac{1}{3}-\frac{1}{4}+\cdots=\sum_{k=1}^\infty\frac{(-1)^{k+1}}{k} are required to approximate the value of the series with a remainder less than 10^{-6}?
  2. If n=9 terms of the series \sum_{k=1}^\infty\frac{(-1)^k}{k!}=e^{-1}-1 are summed, what is the maximum error committed in approximating the value of the series?

Solution.

  1. |R_n|\leq a_{n+1}=\frac{1}{n+1}<10^{-6} so n+1>1000000 i.e. n\geq 1000000.
  2. |R_9|\leq\frac{1}{10!}\approx 2.8\times 10^{-7}.
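The bound \eqref{eq:altser} can be illustrated on the alternating harmonic series, whose sum is \ln 2 (a numerical sketch; the function name is ours):

```python
import math

# Partial sums of the alternating harmonic series, whose sum is ln 2.
def partial_sum(n):
    return sum((-1) ** (k + 1) / k for k in range(1, n + 1))

# |R_n| = |ln 2 - S_n| is bounded by a_{n+1} = 1/(n+1).
for n in (10, 100, 1000):
    assert abs(math.log(2) - partial_sum(n)) <= 1 / (n + 1)
```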

Absolute and Conditional Convergence

Assume that \sum_{k=1}^\infty a_k converges. \sum_{k=1}^\infty a_k is said to converge absolutely if \sum_{k=1}^\infty |a_k| converges. Otherwise, \sum_{k=1}^\infty a_k is said to converge conditionally.

Example. The alternating harmonic series \sum_{k=1}^\infty\frac{(-1)^{k+1}}{k} converges conditionally.

Theorem. If \sum_{k=1}^\infty |a_k| converges, then so does \sum_{k=1}^\infty a_k. That is, absolute convergence implies convergence. However, the converse need not be true as seen in the example above.

Proof. \begin{align*}\sum_{k=1}^\infty a_k&=\sum_{k=1}^\infty(a_k+|a_k|-|a_k|)\\&=\sum_{k=1}^\infty(a_k+|a_k|)-\sum_{k=1}^\infty|a_k|\end{align*} Since 0\leq a_k+|a_k|\leq 2|a_k| and \sum_{k=1}^\infty 2|a_k| converges, \sum_{k=1}^\infty(a_k+|a_k|) converges by the comparison test. Therefore, \sum_{k=1}^\infty a_k converges.

Example. Determine whether each of the following series diverges, converges absolutely, or converges conditionally.

  1. \sum_{k=1}^\infty\frac{(-1)^{k+1}}{\sqrt{k}}
  2. \sum_{k=1}^\infty\frac{(-1)^{k+1}}{\sqrt{k^3}}
  3. \sum_{k=1}^\infty\frac{\sin k}{k^2}
  4. \sum_{k=1}^\infty\frac{(-1)^kk}{k+1}

Solution.

  1. By the alternating series test, the series converges. However, \sum_{k=1}^\infty\frac{1}{\sqrt{k}} is a p-series with p=\frac{1}{2}<1, so it diverges. Hence, \sum_{k=1}^\infty\frac{(-1)^{k+1}}{\sqrt{k}} converges conditionally.
  2. \sum_{k=1}^\infty\frac{1}{\sqrt{k^3}} is a p-series with p=\frac{3}{2}>1, so it converges. Therefore, \sum_{k=1}^\infty\frac{(-1)^{k+1}}{\sqrt{k^3}} converges absolutely.
  3. |\sin k|\leq 1, so \frac{|\sin k|}{k^2}\leq\frac{1}{k^2}. Since \sum_{k=1}^\infty\frac{1}{k^2} converges, so does \sum_{k=1}^\infty\frac{|\sin k|}{k^2} by the comparison test. Therefore, \sum_{k=1}^\infty\frac{\sin k}{k^2} converges absolutely.
  4. \lim_{k\to\infty}\frac{k}{k+1}=1\ne 0 so the alternating series diverges.

The Ratio, Root and Comparison Tests

d’Alembert-Cauchy Ratio Test

The following d’Alembert-Cauchy ratio test is one of the easiest to apply and is widely used.

Theorem (d’Alembert-Cauchy Ratio Test). Suppose that \sum_{n=1}^\infty a_n is a series with positive terms.

  1. If \lim_{n\to\infty}\frac{a_{n+1}}{a_n}<1 then \sum_{n=1}^\infty a_n converges.
  2. If \lim_{n\to\infty}\frac{a_{n+1}}{a_n}>1 then \sum_{n=1}^\infty a_n diverges.
  3. If \lim_{n\to\infty}\frac{a_{n+1}}{a_n}=1 then the test is inconclusive, i.e., the ratio test provides no information regarding the convergence of the series \sum_{n=1}^\infty a_n.

Example. Test \sum_{n=1}^\infty\frac{n}{2^n} for convergence.

Solution. \begin{align*}\lim_{n\to\infty}\frac{a_{n+1}}{a_n}&=\lim_{n\to\infty}\frac{\frac{n+1}{2^{n+1}}}{\frac{n}{2^n}}\\&=\lim_{n\to\infty}\frac{n+1}{2n}\\&=\frac{1}{2}<1\end{align*} Hence by the ratio test the series converges.
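Numerically, the ratios indeed settle near \frac{1}{2}, and the partial sums settle near 2 (the value of the sum follows from \sum_{n\geq 1}nx^n=\frac{x}{(1-x)^2} at x=\frac{1}{2}, a fact not derived here); a sketch:

```python
# a_n = n / 2^n: ratios a_{n+1}/a_n approach 1/2 (illustration only).
a = lambda n: n / 2 ** n
ratios = [a(n + 1) / a(n) for n in (10, 100, 1000)]
assert all(abs(r - 0.5) < 0.06 for r in ratios)

# The partial sums settle near 2, the actual value of the sum.
s = sum(a(n) for n in range(1, 200))
assert abs(s - 2) < 1e-12
```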

Example. Test the convergence of the series \sum_{n=1}^\infty\frac{n^n}{n!}.

Solution.
\begin{align*} \lim_{n\to\infty}\frac{a_{n+1}}{a_n}&=\lim_{n\to\infty}\left(1+\frac{1}{n}\right)^n\\ &=e>1. \end{align*}
Hence, the series diverges.

Remark. There is an easier way to show the divergence of the series \sum_{n=1}^\infty\frac{n^n}{n!}.

Note that
a_n=\frac{n^n}{n!}=\frac{n\cdot n\cdot n\cdots n}{1\cdot 2\cdot 3\cdots n}\geq n.
This implies that \lim_{n\to\infty}a_n=\infty. Hence by the divergence test the series diverges.

Cauchy Root Test

Theorem (Cauchy Root Test). Suppose that \sum_{n=1}^\infty a_n is a series with positive terms.

  1. If \lim_{n\to\infty}\root n\of{a_n}=r<1 then \sum_{n=1}^\infty a_n converges.
  2. If \lim_{n\to\infty}\root n\of{a_n}=r> 1 then \sum_{n=1}^\infty a_n diverges.
  3. If \lim_{n\to\infty}\root n\of{a_n}=r=1 then the test fails, i.e., the root test is inconclusive.

Example. Test the convergence of the series \sum_{n=1}^\infty\left(\frac{2n+3}{3n+2}\right)^n.

Solution. \begin{align*}\lim_{n\to\infty}\root n\of{a_n}&=\lim_{n\to\infty}\root n\of{\left(\frac{2n+3}{3n+2}\right)^n}\\&=\lim_{n\to\infty}\frac{2n+3}{3n+2}\\&=\frac{2}{3}<1\end{align*}Hence by the root test the series converges.
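Since the nth root simplifies to \frac{2n+3}{3n+2}, its convergence to \frac{2}{3} is easy to observe numerically (a sketch; the names are ours):

```python
# For a_n = ((2n+3)/(3n+2))^n, the nth root of a_n is (2n+3)/(3n+2).
nth_root = lambda n: (2 * n + 3) / (3 * n + 2)
values = [nth_root(n) for n in (10, 100, 10000)]
assert all(v < 1 for v in values)        # stays below 1
assert abs(values[-1] - 2 / 3) < 1e-3    # tending to 2/3
```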

Comparison Test

Theorem (Comparison Test). Suppose that \sum_{n=1}^\infty a_n and \sum_{n=1}^\infty b_n are series with positive terms.

  1. If \sum_{n=1}^\infty b_n converges and a_n\leq b_n for all n, then \sum_{n=1}^\infty a_n also converges.
  2. If \sum_{n=1}^\infty b_n diverges and b_n\leq a_n for all n, then \sum_{n=1}^\infty a_n also diverges.

Remark. A convergent geometric series is a common choice for the known convergent series, while the harmonic series serves as the known divergent series. As other series are identified as either convergent or divergent, they may also be used as the known series in the comparison test.

Example. Determine whether the series \sum_{n=1}^\infty\frac{5}{2n^2+4n+3} converges.

Solution. Notice that \frac{5}{2n^2+4n+3}<\frac{5}{n^2} for all n\geq 1. Since \sum_{n=1}^\infty\frac{1}{n^2} converges (it is a p-series with p=2), so does \sum_{n=1}^\infty\frac{5}{n^2}, and hence by the comparison test the given series converges.

Example. Test the series \sum_{n=1}^\infty\frac{n^3}{2n^4-1} for convergence or divergence.

Solution. 2n^4-1<2n^4, so \frac{n^3}{2n^4-1}>\frac{n^3}{2n^4}=\frac{1}{2n}. Since the harmonic series \sum_{n=1}^\infty\frac{1}{n} diverges, so does \sum_{n=1}^\infty\frac{1}{2n}, and hence by the comparison test the series diverges.

Example. Test the series \sum_{n=2}^\infty\frac{\ln n}{n} for convergence or divergence.

Solution. For n\geq 3 we have \ln n>1 (since \ln 3>\ln e=1), so \frac{\ln n}{n}>\frac{1}{n} for all n\geq 3. Since \sum_{n=1}^\infty\frac{1}{n} diverges (the harmonic series, a p-series with p=1), by the comparison test the series diverges.

Example. Test the series \sum_{n=2}^\infty\frac{\ln n}{n^3} for convergence or divergence.

Solution. As seen in the following figure, \ln n<n for all n\geq 2.

The graphs of y=ln(x) (in red) and y=x (in blue).

So \frac{\ln n}{n^3}<\frac{n}{n^3}=\frac{1}{n^2}. Since \sum_{n=1}^\infty\frac{1}{n^2} converges (p-series with p=2>1), \sum_{n=2}^\infty\frac{\ln n}{n^3} also converges.

Example (The p-series). Let p\leq 1. Then \frac{1}{n}\leq\frac{1}{n^p} for all n, so by the comparison test \sum_{n=1}^\infty\frac{1}{n^p} diverges for all p\leq 1.

The Limit Comparison Test

The limit comparison test is a variation of the comparison test.

Theorem (The Limit Comparison Test). Suppose that \sum_{n=1}^\infty a_n (the series being tested) and \sum_{n=1}^\infty b_n (a series whose convergence or divergence is known) are series with positive terms. Let L=\lim_{n\to\infty}\frac{a_n}{b_n}. Then the following holds.

  1. If 0<L<\infty, then either both series converge or both diverge.
  2. If L=0 and \sum_{n=1}^\infty b_n converges, then \sum_{n=1}^\infty a_n converges.
  3. If L=\infty and \sum_{n=1}^\infty b_n diverges, then \sum_{n=1}^\infty a_n diverges.

The limit comparison test is inconclusive otherwise.

Remark. Just like the comparison test, the hardest part of using the limit comparison test is choosing the right series for \sum_{n=1}^\infty b_n, and unfortunately there is no systematic way of choosing one; it depends on the given series. It could be a geometric series, as you will see in an example below. For certain types of series, a good candidate for b_n is \frac{1}{n^p} from the p-series with an appropriate p-value.

Example. Test the series \sum_{n=1}^\infty\frac{1}{2^n-1} for convergence or divergence.

Solution. Considering that a_n=\frac{1}{2^n-1} and the geometric series \sum_{n=1}^\infty\frac{1}{2^n} converges, it is reasonable to try b_n=\frac{1}{2^n}. \begin{align*}\lim_{n\to\infty}\frac{a_n}{b_n}&=\lim_{n\to\infty}\frac{2^n}{2^n-1}\\&=\lim_{n\to\infty}\frac{1}{1-\frac{1}{2^n}}\\&=1\end{align*} Since \sum_{n=1}^\infty\frac{1}{2^n} converges, so does \sum_{n=1}^\infty\frac{1}{2^n-1} by the limit comparison test.
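A quick numerical look (a sketch; the names are ours) confirms both the limit L=1 and the convergence of the partial sums:

```python
# a_n = 1/(2^n - 1), b_n = 1/2^n: the ratio a_n/b_n tends to 1.
ratio = lambda n: (2 ** n) / (2 ** n - 1)   # a_n / b_n simplified
assert abs(ratio(50) - 1) < 1e-12

# The partial sums of 1/(2^n - 1) settle to a finite value.
s = sum(1 / (2 ** n - 1) for n in range(1, 60))
assert 1.6 < s < 1.7
```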

Example. Test the series \sum_{n=1}^\infty\frac{1}{\sqrt{n^2+1}} for convergence or divergence.

Solution. The dominant part of a_n=\frac{1}{\sqrt{n^2+1}} is \frac{1}{\sqrt{n^2}}=\frac{1}{n}, so we choose b_n=\frac{1}{n}. Then \begin{align*}\lim_{n\to\infty}\frac{a_n}{b_n}&=\lim_{n\to\infty}\frac{n}{\sqrt{n^2+1}}\\&=\lim_{n\to\infty}\frac{1}{\sqrt{1+\frac{1}{n^2}}}\\&=1\end{align*} Since \sum_{n=1}^\infty\frac{1}{n} diverges, so does \sum_{n=1}^\infty\frac{1}{\sqrt{n^2+1}} by the limit comparison test.

Example. Test the series \sum_{n=1}^\infty\frac{n^4-2n^2+3}{2n^6-n+5} for convergence or divergence.

Solution. The dominant part of a_n is \frac{n^4}{n^6}=\frac{1}{n^2} so we choose b_n=\frac{1}{n^2}. Then \frac{\frac{n^4-2n^2+3}{2n^6-n+5}}{\frac{1}{n^2}}=\frac{n^6-2n^4+3n^2}{2n^6-n+5}\to\frac{1}{2} as n\to\infty. Since \sum_{n=1}^\infty\frac{1}{n^2} converges, so does the given series by the limit comparison test.

Example. Test the series \sum_{n=1}^\infty\frac{\ln n}{n^2} for convergence or divergence.

Solution. In this case, we try the p-series but we don't know what p-value may work. To figure it out, let a_n=\frac{\ln n}{n^2} and b_n=\frac{1}{n^p}. Then \frac{a_n}{b_n}=\frac{\frac{\ln n}{n^2}}{\frac{1}{n^p}}=\frac{\ln n}{n^{2-p}}. If p\geq 2 then \lim_{n\to\infty}\frac{a_n}{b_n}=\infty but \sum_{n=1}^\infty b_n converges, so the test is inconclusive. This means we need p<2. Now, using L'Hôpital's rule we get \lim_{n\to\infty}\frac{a_n}{b_n}=\lim_{n\to\infty}\frac{1}{(2-p)n^{2-p}}=0 If p\leq 1 then \sum_{n=1}^\infty b_n diverges, so the test would again be inconclusive. This leaves us the condition 1<p<2 for the limit comparison test to work. This means that for any value of 1<p<2 the limit comparison test will tell us that the series \sum_{n=1}^\infty\frac{\ln n}{n^2} converges. For instance, let us choose p=\frac{3}{2}. Then \frac{a_n}{b_n}=\frac{\ln n}{\sqrt{n}}\to 0 as n\to\infty. Since \sum_{n=1}^\infty\frac{1}{n^{\frac{3}{2}}} converges, the series converges.

Remark. Doing the same analysis as in the example above, we can also see why using the dominant part of a_n worked in some earlier examples. For instance, consider the series \sum_{n=1}^\infty\frac{n^4-2n^2+3}{2n^6-n+5} that we discussed earlier. Again a_n=\frac{n^4-2n^2+3}{2n^6-n+5} and let b_n=\frac{1}{n^p}, with an appropriate p-value yet to be determined. Now \frac{a_n}{b_n}=\frac{n^{p-2}-2n^{p-4}+3n^{p-6}}{2-\frac{1}{n^5}+\frac{5}{n^6}}. First, if p\leq 1, the p-series \sum_{n=1}^\infty\frac{1}{n^p} diverges but \lim_{n\to\infty}\frac{a_n}{b_n}=0, so the test is inconclusive; hence p>1, in which case the p-series converges. If p>2 then \lim_{n\to\infty}\frac{a_n}{b_n}=\infty, which makes the test inconclusive. Therefore we see that 1<p\leq 2. The value p=2 is what we get from the dominant part \frac{n^4}{n^6} of a_n, but that is not the only choice. You can choose any 1<p\leq 2 in order for the test to work; for example, you could have chosen p=\frac{3}{2}, in which case \lim_{n\to\infty}\frac{a_n}{b_n}=0. The limit comparison test then says the series converges.

Cauchy-Maclaurin Integral Test

Theorem (Cauchy-Maclaurin Integral Test)

Let f(x) be a continuous, positive, decreasing function on [1,\infty) with f(n)=a_n for every n. Then \sum_{n=1}^\infty a_n converges if \int_1^\infty f(x)dx is finite and diverges if the integral is infinite.

Proof. Using the left-end point method as seen in Figure 1

Figure 1. Integral Test

we see that a_1+a_2+\cdots+a_{n-1}\geq \int_1^nf(x)dx This means that if \int_1^{\infty}f(x)dx is infinite, \sum_{n=1}^\infty a_n diverges. Now using the right-end point method as seen in Figure 2

Figure 2. Integral Test

we see that a_2+a_3+\cdots+a_n\leq\int_1^n f(x)dx This means that if \int_1^\infty f(x)dx is finite, then \sum_{n=1}^\infty a_n converges. This completes the proof.

Example (The p-series).
For what values of p is the series \sum_{n=1}^\infty\frac{1}{n^p} convergent?

Solution. If p<0 then \lim_{n\to\infty}\frac{1}{n^p}=\infty. If p=0 then \lim_{n\to\infty}\frac{1}{n^p}=1. In either case, \lim_{n\to\infty}\frac{1}{n^p}\ne 0, so the series diverges. If p>0 then the function f(x)=\frac{1}{x^p} is continuous, positive and decreasing on [1,\infty).
Now,
\int_1^\infty\frac{1}{x^p}dx=\left\{\begin{array}{ccc} \left.\frac{x^{-p+1}}{-p+1}\right|_1^\infty & {\rm if} & p\ne 1,\\ \\ \ln x|_1^\infty & {\rm if} & p=1. \end{array}\right.
Therefore the series converges if p>1 and diverges if p\leq 1.
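The dichotomy can be illustrated numerically: for p=2 the partial sums approach \frac{\pi^2}{6} (a known value, not derived here), while for p=1 they keep growing like \ln n (a sketch; the function name is ours):

```python
import math

# Partial sums of the p-series for p = 2 and p = 1 (illustration only).
def partial(p, n):
    return sum(1 / k ** p for k in range(1, n + 1))

# p = 2 > 1: the partial sums approach the finite limit pi^2/6.
assert abs(partial(2, 100000) - math.pi ** 2 / 6) < 1e-4

# p = 1: the partial sums keep growing (roughly like ln n).
assert partial(1, 100000) - partial(1, 1000) > 4
```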

Example. Test the series \sum_{n=1}^\infty\frac{1}{n^2+1} for convergence or divergence.

Solution. f(x)=\frac{1}{x^2+1} is continuous, positive and decreasing on [1,\infty). \begin{align*}\int_1^\infty\frac{1}{x^2+1}dx&=\left.\arctan x\right|_1^\infty\\&=\lim_{b\to\infty}\arctan b-\arctan 1\\&=\frac{\pi}{2}-\frac{\pi}{4}\\&=\frac{\pi}{4}\end{align*} Therefore, by the Integral Test the series converges.

Example. Determine whether \sum_{n=1}^\infty\frac{\ln n}{n} converges or diverges.

Solution. f(x)=\frac{\ln x}{x} is continuous, positive and decreasing on [3,\infty). (One can easily check f(x) is decreasing on (e,\infty) by its derivative f'(x).) \begin{align*}\int_3^\infty\frac{\ln x}{x}dx&=\frac{1}{2}\left.(\ln x)^2\right|_3^\infty\\&=\infty\end{align*} Therefore, \sum_{n=1}^\infty \frac{\ln n}{n} diverges.

Example. Use the integral test to show that the series \sum_{n=1}^\infty\frac{1}{a^{\ln n}} converges if a>e and diverges if 0<a\leq e.

Proof. Let f(x)=\frac{1}{a^{\ln x}}. Then f(x) is positive and continuous on [1,\infty). If 0<a<1 then the series \sum_{n=1}^\infty\frac{1}{a^{\ln n}} diverges because the sequence \left\{\frac{1}{a^{\ln n}}\right\} is increasing, so its terms do not tend to 0. If a=1 the series is \sum_{n=1}^\infty 1, which diverges. If a=e, the series becomes the harmonic series \sum_{n=1}^\infty\frac{1}{n}, which diverges. Now we assume that 1<a<e or a>e. Then f(x) is decreasing on [1,\infty). \begin{align*}\int_1^\infty\frac{dx}{a^{\ln x}}&=\int_1^\infty a^{-\ln x}dx\\&=-\int_0^{-\infty}a^ue^{-u}du\ (u=-\ln x,\ dx=-e^{-u}du)\\&=\int_{-\infty}^0a^ue^{-u}du\end{align*} Let v=a^u and dw=e^{-u}du. Then dv=a^u\ln a\,du and w=-e^{-u}. The integration by parts formula \int vdw=vw-\int wdv results in \int a^ue^{-u}du=-a^ue^{-u}+\ln a\int e^{-u}a^udu+C' Hence we find \int a^ue^{-u}du=\frac{a^ue^{-u}}{\ln a-1}+C \begin{align*}\int_1^\infty\frac{dx}{a^{\ln x}}&=\int_{-\infty}^0a^ue^{-u}du\\&=\left[\frac{a^ue^{-u}}{\ln a-1}\right]_{-\infty}^0\\&=\frac{1}{\ln a-1}\left\{1-\lim_{u\to -\infty}\frac{a^u}{e^u}\right\}\end{align*} While \frac{a^u}{e^u} is an indeterminate form of type \frac{0}{0} as u\to -\infty, L'Hôpital's Rule is not helpful for finding the limit \lim_{u\to -\infty}\frac{a^u}{e^u} since \frac{(a^u)'}{(e^u)'}=\frac{a^u\ln a}{e^u}. Instead let y=\frac{a^u}{e^u}. Then \ln y=u(\ln a-1), so \lim_{u\to -\infty}\ln y=\left\{\begin{array}{ccc}-\infty & \mbox{if} & a>e\\\infty & \mbox{if} & a<e\end{array}\right. i.e. \lim_{u\to -\infty}\frac{a^u}{e^u}=\left\{\begin{array}{ccc}0 & \mbox{if} & a>e\\\infty & \mbox{if} & a<e\end{array}\right. Therefore, \int_1^\infty\frac{dx}{a^{\ln x}}=\left\{\begin{array}{ccc}\frac{1}{\ln a-1}<\infty & \mbox{if} & a>e\\\infty & \mbox{if} & a<e\end{array}\right. This completes the proof.

Theorem (Remainder Estimate for the Integral Test)
If \sum_{n=1}^\infty a_n converges by the Integral Test and R_n=s-s_n, where s is the sum of the series and s_n its nth partial sum, then
\begin{equation}\label{eq:remest}\int_{n+1}^\infty f(x)dx\leq R_n\leq\int_n^\infty f(x)dx\end{equation}

Proof. Using the left-end point method we obtain R_n=a_{n+1}+a_{n+2}+\cdots\geq\int_{n+1}^\infty f(x)dx as seen in Figure 3.

Figure 3. Remainder Estimate

Now using the right-end point method we obtain R_n=a_{n+1}+a_{n+2}+\cdots\leq\int_n^\infty f(x)dx as seen in Figure 4.

Figure 4. Remainder Estimate

This proves \eqref{eq:remest}.

Example.

  1. Approximate the sum of the series \sum_{n=1}^\infty\frac{1}{n^3} by using the sum of the first 10 terms. Estimate the error involved in this approximation.
  2. How many terms are required to ensure that the sum is accurate to within 0.0005?

Solution. First we calculate \int_n^\infty\frac{1}{x^3}dx=\frac{1}{2n^2}

  1. s_{10}=\frac{1}{1^3}+\frac{1}{2^3}+\cdots+\frac{1}{10^3}\approx 1.197532. By the remainder estimate \eqref{eq:remest} R_{10}\leq\int_{10}^\infty\frac{1}{x^3}dx=\frac{1}{200}=0.005 So the size of the error is at most 0.005.
  2. R_n\leq\int_n^\infty\frac{1}{x^3}dx=\frac{1}{2n^2}. We want \frac{1}{2n^2}<0.0005, which gives n>\sqrt{1000}\approx 31.6. This means we need 32 terms to guarantee accuracy to within 0.0005.
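Both parts can be verified against a high-accuracy value of the sum (a numerical sketch; the names are ours):

```python
# High-accuracy stand-in for the exact sum of 1/n^3 (tail is negligible).
S = sum(1 / k ** 3 for k in range(1, 10**6))

def s(n):  # nth partial sum
    return sum(1 / k ** 3 for k in range(1, n + 1))

assert abs(S - s(10)) <= 1 / 200   # part 1: error bound 0.005
assert abs(S - s(32)) < 0.0005     # part 2: 32 terms suffice
```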

Corollary. \begin{equation}\label{eq:sumest}s_n+\int_{n+1}^\infty f(x)dx\leq s\leq s_n+\int_n^\infty f(x)dx\end{equation}

Proof. Add s_n to each side of the inequalities in \eqref{eq:remest} and use R_n=s-s_n.

Example. Use the inequality \eqref{eq:sumest} with n=10 to estimate the sum of the series \sum_{n=1}^\infty\frac{1}{n^3}.

Solution. Using \eqref{eq:sumest} for n=10 we have s_{10}+\int_{11}^\infty\frac{1}{x^3}dx\leq s\leq s_{10}+\int_{10}^\infty\frac{1}{x^3}dx i.e. s_{10}+\frac{1}{2(11)^2}\leq s\leq s_{10}+\frac{1}{2(10)^2} Hence we get 1.201664\leq s\leq 1.202532 We can approximate s by taking the midpoint of this interval (i.e. the average of the endpoints), which gives s\approx 1.2021. The error is then at most half the length of the interval, i.e. the error is smaller than 0.0005. Recall that we needed 32 terms to make the error smaller than 0.0005 in the previous example, but here we needed only 10 terms. So \eqref{eq:sumest} yields a much better estimate than using s_n alone.
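The interval and the midpoint estimate can be checked against a high-accuracy value of the sum (a sketch; the names are ours):

```python
s10 = sum(1 / k ** 3 for k in range(1, 11))    # first 10 terms
lower = s10 + 1 / (2 * 11 ** 2)                # s10 + integral from 11
upper = s10 + 1 / (2 * 10 ** 2)                # s10 + integral from 10
midpoint = (lower + upper) / 2

S = sum(1 / k ** 3 for k in range(1, 10**6))   # near-exact sum
assert lower <= S <= upper
assert abs(S - midpoint) <= (upper - lower) / 2
```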