Abstract
Lecture notes from the Integration Workshop at the University of Arizona. The workshop aims to revisit a few select topics of undergraduate mathematics from a graduate perspective. The notes draw from the text “An Introduction to Real Analysis” by John K. Hunter, and we refer the interested reader to sections 8.6, 10.1–10.3, and 12.4 for further details. These notes contain more material than I intend to cover in the lecture. I urge you to go through all of it, even the parts that I don’t get to in the lecture.
1 Taylor’s Theorem, Taylor Series & Analytic Functions
Objectives:
- Apply Taylor’s theorem to estimate the accuracy of a Taylor polynomial approximation,
- Determine whether a power series converges,
- Determine whether a function is analytic.
1.1 Approximating Polynomials & Taylor’s Theorem
Definition 1 (Taylor Polynomial). Let \(f:(a,b) \to \mathbb {R}\) and suppose that \(f\) has \(n\) derivatives on \((a,b)\). The Taylor polynomial of degree \(n\) of \(f\) centered at \(c\), where \(a < c < b\), is defined as \begin {equation} P_n(x) = \sum _{k = 0}^n a_k (x-c)^k, \quad \text {where} \quad a_k = \frac {1}{k!} f^{(k)}(c) \end {equation}
The difference between \(f\) and \(P_n\) is often called the remainder, or \(R_n(x) := f(x) - P_n(x)\). There are several variations of Taylor’s Theorem, one of which is as follows:
Theorem 2 (Taylor’s Theorem). Suppose that \(f:(a,b) \to \mathbb {R}\) has \(n+1\) derivatives on \((a,b)\) and let \(a < c < b\). For every \(a < x < b\), there exists \(\xi \) between \(c\) and \(x\) such that \begin {equation} f(x) = \sum _{k = 0}^n \frac {1}{k!}f^{(k)}(c)\, (x-c)^k + R_n(x) \end {equation} where \begin {equation} R_n(x) = \frac {1}{(n+1)!} f^{(n+1)} (\xi ) (x-c)^{n+1} \end {equation}
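As a quick numerical sanity check of Theorem 2 (a sketch of my own, not part of the notes): for \(f(x) = \cos x\) at \(c = 0\), every derivative is bounded by \(1\), so the Lagrange remainder satisfies \(|R_n(x)| \leq |x|^{n+1}/(n+1)!\). The function names below are mine.

```python
import math

def taylor_cos(x, n):
    """Degree-n Taylor polynomial of cos at c = 0: sum of (-1)^j x^(2j)/(2j)!."""
    return sum((-1) ** j * x ** (2 * j) / math.factorial(2 * j)
               for j in range(n // 2 + 1))

def remainder_bound(x, n):
    """Lagrange bound |R_n(x)| <= |x|^(n+1)/(n+1)!, valid since |cos^(k)| <= 1."""
    return abs(x) ** (n + 1) / math.factorial(n + 1)

# The actual error never exceeds the bound predicted by Taylor's Theorem.
x = 1.0
for n in [2, 4, 6, 8]:
    err = abs(math.cos(x) - taylor_cos(x, n))
    assert err <= remainder_bound(x, n)
```

The assertions pass because the bound is worst-case: the true \(\xi\) in the theorem typically gives a smaller remainder than the uniform derivative bound.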
Example 3. Use Taylor’s theorem to prove that \begin {equation} \lim _{x \to 0} \left ( \frac {1- \cos x}{x^2}\right ) = \frac {1}{2} \end {equation} and determine the rate of convergence.
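A sketch of one solution (my reading of the intended approach): apply Theorem 2 to \(f(x) = \cos x\) with \(c = 0\) and \(n = 3\). The degree-3 Taylor polynomial of \(\cos x\) is \(1 - x^2/2\), so

```latex
\cos x = 1 - \frac{x^2}{2} + \frac{\cos \xi}{4!}\, x^4
\quad\Longrightarrow\quad
\frac{1 - \cos x}{x^2} - \frac{1}{2} = -\frac{\cos \xi}{24}\, x^2
```

for some \(\xi\) between \(0\) and \(x\). The right-hand side is bounded in absolute value by \(x^2/24\), so the limit is \(1/2\) and the convergence rate is \(O(x^2)\).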
1.2 Representation of a Function by its Taylor Series
One has to wonder if there is a meaningful way to interpret Taylor’s Theorem in the limit \(n \to \infty \). It is one thing to approximate a function by a polynomial (necessarily of finite degree), and quite another to truly represent the function by a power series. The potential challenges I would like to discuss are (i) the smoothness of the function, (ii) the convergence of the power series, and (iii) whether the power series does actually converge to the value of the function.
1.2.1 Infinitely Differentiable Functions
First, and most obviously, it is impossible to compute the terms in the power series \begin {equation} \label {eq:power-series} \sum _{k = 0}^\infty \frac {f^{(k)}(c)}{k!} \,(x-c)^k, \end {equation} if \(f\) is not infinitely differentiable. A function must be infinitely differentiable on some domain to be represented by a Taylor Series.
Example 4. Consider the polynomial \(f(x) = 1 + 2x + 10x^2\). We immediately see that \(f(x)\) can be differentiated infinitely many times (its derivatives are \(f'(x) = 2 + 20x\), \(f''(x) = 20\) and \(f^{(k)}(x) = 0\) for \(k = 3,4,\dots \)), and that evaluating \(f\) and each of its derivatives for any \(c \in \mathbb {R}\) gives a finite number. Therefore, for any \(x \in \mathbb {R}\), we can be confident that each term in the power series does actually exist.
Example 5. Determine whether each function is infinitely differentiable on the stated intervals.
- (a) \(f(x) = \frac {1}{1-x}\) on (i) \(-1 < x < 1\), and (ii) \(x \in \mathbb {R}\).
- (b) \(f(x) = x\, |x| \) on (i) \(1< x < \infty \), and (ii) \(x \in \mathbb {R}\).
- (c) \(f(x) = x^{4/3}\) on (i) \(1< x < \infty \), and (ii) \(x \in \mathbb {R}\).
1.2.2 Convergence of a Power Series
Second, we require that the power series converge. In Example 4, we were able to sum the Taylor series for each value of \(x\) and obtain a finite value. The summability of the Taylor series of any polynomial is straightforward because only a finite number of terms are non-zero. Let’s consider a more interesting example.
Example 6. The \(n\)th degree Taylor polynomial for \(f(x) = 1/(1-x)\) centered at \(x = 0\) on the interval \(a < x < b\), with \(a\) and \(b\) to be determined, is given by \begin {equation} P_n(x) = \sum _{k = 0}^n \frac {f^{(k)}(0)}{k!} (x-0)^k = 1 + x + x^2 + \cdots + x^n. \end {equation} Since \(f^{(n+1)}(x) = (n+1)!\, (1-x)^{-(n+2)}\), Taylor’s Theorem states that there exists \(\xi \) between \(0\) and \(x\) such that the remainder \(R_n(x) := f(x) - P_n(x)\) is \begin {equation} R_n(x) = \frac {f^{(n+1)}(\xi )}{(n+1)!} x^{n+1} = \frac {1}{1-\xi }\left(\frac {x}{1-\xi }\right)^{n+1} \end {equation} If \(-1 < x \leq 0\), then \(1 - \xi \geq 1\), so \(\left|\frac {x}{1-\xi }\right| \leq |x| < 1\) and \(R_n(x) \to 0\) as \(n \to \infty \). For \(0 < x < 1\) the Lagrange form is less convenient (we only know \(0 < \xi < x\)), but summing the geometric progression directly gives \(R_n(x) = \frac {x^{n+1}}{1-x}\), which again tends to \(0\). However, if \(|x| > 1\), the terms \(x^n\) do not even tend to zero, and the series diverges.
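The closed form of the remainder, \(R_n(x) = x^{n+1}/(1-x)\), can be checked numerically (a sketch of my own; the function name is mine):

```python
def geom_partial(x, n):
    """Partial sum 1 + x + ... + x^n of the Taylor series of 1/(1-x) at 0."""
    return sum(x ** k for k in range(n + 1))

x = 0.9
f = 1 / (1 - x)
for n in [10, 50, 100]:
    remainder = f - geom_partial(x, n)
    # Closed form of the remainder for the geometric series: x^(n+1) / (1 - x)
    assert abs(remainder - x ** (n + 1) / (1 - x)) < 1e-9
```

Note that \(x = 0.9\) lies in the regime where the crude Lagrange-form bound is inconclusive, yet the remainder still tends to zero.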
Definition 7. The radius of convergence of a power series is the value \(R\) such that the power series converges for \(0 \leq |x-c| < R\) and diverges for \(|x-c| > R\). The radius of convergence satisfies \(0 \leq R \leq \infty \).
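Definition 7 characterizes \(R\) but does not say how to compute it. For reference (standard facts not derived in these notes: the Cauchy–Hadamard formula and the ratio test), for a power series \(\sum_k a_k (x-c)^k\),

```latex
\frac{1}{R} = \limsup_{k \to \infty} |a_k|^{1/k},
\qquad \text{or, when the limit exists,} \qquad
R = \lim_{k \to \infty} \left| \frac{a_k}{a_{k+1}} \right|
```

For the geometric series of Example 6, \(a_k = 1\) for all \(k\), so \(R = 1\), consistent with the remainder analysis above.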
1.2.3 Convergence to the Function
Given that the power series converges for a particular value of \(x\), we must still establish what value it converges to, and in particular, whether it converges to \(f(x)\). To verify that the power series truly is an alternative representation of the original function within the radius of convergence, we must investigate the limit \begin {equation} \lim _{n \to \infty }\left |f(x) - \sum _{k = 0}^n \frac {f^{(k)}(c)}{k!} (x-c)^k \right | \end {equation} In Example 4, we could say that the power series was an alternative representation of the original function because, for each value of \(x\), the power series returns \(f(x)\). Yes, I know, that is something of an overly trivial example, but there are more interesting ones ahead.
Example 9. Consider the function \begin {equation} f(x) = \begin {cases} e^{-1/x} & x > 0\\ 0 &x \leq 0 \end {cases} \end {equation} Determine (i) whether it is infinitely differentiable, (ii) the radius of convergence of its Taylor series centered at \(0\), and (iii) the interval on which its Taylor series converges to \(f\).
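A numerical sketch of what makes this example work (assuming the standard resolution: \(e^{-1/x}\) vanishes faster than any power of \(x\) as \(x \to 0^+\), which forces every Taylor coefficient at \(0\) to be zero):

```python
import math

def f(x):
    """The function of Example 9 (identically zero for x <= 0)."""
    return math.exp(-1.0 / x) if x > 0 else 0.0

# Every Taylor coefficient of f at 0 vanishes, because e^{-1/x} = o(x^n)
# for every n as x -> 0+.  Check the quotient f(x)/x^n for small x:
for n in [1, 5, 10]:
    assert f(0.01) / 0.01 ** n < 1e-20

# Yet f itself is positive for x > 0, so the (identically zero) Taylor
# series converges for every x but equals f only on x <= 0.
assert f(0.5) > 0
```

So the Taylor series has infinite radius of convergence, but it converges to \(f\) only on \(x \leq 0\).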
1.3 Analytic Functions
There are a number of theorems in mathematics that have been proven for analytic functions, so it is worth making the notion precise.
Definition 10. A function \(f(x)\) is said to be (real) analytic at a point \(x_0\) if
- \(f(x)\) is infinitely differentiable at \(x_0\) (see §1.2.1),
- the radius of convergence of its Taylor series centered at \(x_0\) is non-zero (see §1.2.2), and
- the Taylor series converges to the function in a neighborhood of \(x_0\) (see §1.2.3).
A function is said to be real analytic on a domain \(\mathcal {A}\) if it is (real) analytic at each \(x_0 \in \mathcal {A}\), and is said to be entire if it is (real) analytic at each \(x_0 \in \mathbb {R}\).
Example 11. Identify the domains on which the following functions are (real) analytic.
- (a) \(f(x) = 1 - 3x + 4x^3\)
- (b) \(f(x) = e^x\)
- (c) \(f(x) = 1/ (1-x) \)
- (d) \(f(x) = |x|\)
2 Improper Integrals
Objectives:

1. Understand convergent integrals as the result of a limit process,
2. Investigate the “boundary” between convergent and divergent integrals,
3. Apply the comparison test to determine whether an integral converges.
2.1 Improper Integrals
The first time you learned about integration, you may have been asked to find the distance travelled by a particle travelling with velocity \(v(t)\) on the interval \([0,T]\). I’ll wager that \(|v(t)|< \infty \) and that \(T< \infty \); in other words, you were asked to integrate a bounded function on a bounded interval. It can be of both theoretical and practical interest to extend the definition of integration, when appropriate, to unbounded functions and to unbounded intervals.
2.1.1 Integrals of Unbounded Functions
Definition 12. An improper integral of the form \begin {equation} \int _a^b f(x) \, dx = \lim _{\epsilon \to 0^+} \int _{a + \epsilon }^b f(x) \, dx \end {equation} is said to converge if the limit exists (this form is used when \(f\) is unbounded near \(x = a\)). Similarly, an improper integral of the form \begin {equation} \int _a^b f(x) \, dx = \lim _{\epsilon \to 0^+} \int _{a}^{b-\epsilon } f(x) \, dx \end {equation} is said to converge if the limit exists (used when \(f\) is unbounded near \(x = b\)).
Example 13. Show that \(\int _0^1 \log (x) \, dx \) converges. Using the antiderivative \(x \log (x) - x\), \begin {equation} \int _0^1 \log (x) \, dx = \lim _{\epsilon \to 0^+}\int _\epsilon ^1 \log (x) \, dx = \lim _{\epsilon \to 0^+}\Big [ x\, \log (x) - x\Big ]_\epsilon ^1 = -1 - \lim _{\epsilon \to 0^+}\big ( \epsilon \log (\epsilon ) - \epsilon \big ) = -1 \end {equation}
Example 14. Show that \(\int _0^1 \frac {1}{x} \, dx \) diverges. \begin {equation} \int _0^1 \frac {1}{x} \, dx = \lim _{\epsilon \to 0^+}\int _\epsilon ^1 \frac {1}{x} \, dx = \lim _{\epsilon \to 0^+}\Big [ \log (x)\Big ]_\epsilon ^1 = \lim _{\epsilon \to 0^+}\log (1/\epsilon ) = \infty \end {equation}
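The limit process in Definition 12 can be watched numerically. A sketch (the function names are mine; the closed forms use the antiderivatives \(x\log x - x\) and \(\log x\)):

```python
import math

def int_log(eps):
    """Integral of log(x) from eps to 1, via the antiderivative x*log(x) - x."""
    return (1 * math.log(1) - 1) - (eps * math.log(eps) - eps)

def int_recip(eps):
    """Integral of 1/x from eps to 1, via the antiderivative log(x)."""
    return -math.log(eps)

# As eps -> 0+, the first integral settles near a finite value (-1),
# while the second grows without bound.
assert abs(int_log(1e-8) - (-1.0)) < 1e-6
assert int_recip(1e-2) < int_recip(1e-4) < int_recip(1e-8)
```

Both integrands blow up at \(x = 0\); the difference is how fast, which is exactly the "boundary" question taken up in §2.2.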
2.1.2 Integrals on Unbounded Domains
Definition 15. An improper integral of the form \begin {equation} \int _a^\infty f(x) \, dx = \lim _{M \to \infty } \int _a^M f(x) \, dx \end {equation} is said to converge if the limit exists. Similarly, an improper integral of the form \begin {equation} \int _{-\infty }^b f(x) \, dx = \lim _{M \to \infty } \int _{-M}^b f(x) \, dx \end {equation} is said to converge if the limit exists.
Example 16. Show that \(\int _0^\infty e^{-kx}\, dx\) converges for \(k > 0\). \begin {equation} \int _0^\infty e^{-kx}\, dx = \lim _{M \to \infty } \int _0^M e^{-kx}\, dx = \lim _{M \to \infty } \Big [-\frac {1}{k} e^{-kx} \Big ]_0^M = \lim _{M \to \infty } -\frac {1}{k} e^{-kM} + \frac {1}{k} = \frac {1}{k} \end {equation}
2.2 “Boundary” between Convergent and Divergent Integrals
We can look for the “boundary” between convergent and divergent integrals:
Example 18. Determine for what values of \(p\) the integral \( \int _0^1 \frac {1}{x^p}\, dx\) converges. For \(p \neq 1\), \begin {equation} \int _0^1 \frac {1}{x^p}\, dx = \lim _{\epsilon \to 0^+} \int _\epsilon ^1 \frac {1}{x^p}\, dx = \lim _{\epsilon \to 0^+} \Big [ \frac {x^{1-p}}{1-p} \Big ]_\epsilon ^1 = \frac {1}{1-p} -\lim _{\epsilon \to 0^+} \frac {\epsilon ^{1-p}}{1-p} \end {equation} The integral converges to \(\frac {1}{1-p}\) for \(p<1\) and diverges otherwise (the case \(p = 1\) is Example 14).
Example 19. Determine for what values of \(p\) the integral \(\int _1^\infty \frac {1}{x^p}\, dx\) converges. For \(p \neq 1\), \begin {equation} \int _1^\infty \frac {1}{x^p}\, dx = \lim _{M \to \infty } \int _1^M \frac {1}{x^p}\, dx = \lim _{M \to \infty } \Big [ \frac {x^{1-p}}{1-p} \Big ]_1^M = \lim _{M \to \infty } \frac {M^{1-p}}{1-p} - \frac {1}{1-p} \end {equation} The integral converges to \(\frac {1}{p-1}\) for \(p>1\) and diverges otherwise (for \(p = 1\), \(\int _1^M \frac {1}{x}\, dx = \log (M) \to \infty \)).
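The \(p = 1\) boundary of Example 19 can be seen numerically (a sketch using the closed form from the example; the function name is mine):

```python
def int_power(p, M):
    """Integral of x^(-p) from 1 to M, via the closed form (valid for p != 1)."""
    return (M ** (1 - p) - 1) / (1 - p)

# For p > 1 the value approaches 1/(p - 1) as M grows;
# for p < 1 it keeps growing without bound.
assert abs(int_power(2.0, 1e8) - 1.0) < 1e-6   # limit is 1/(p-1) = 1
assert int_power(0.5, 1e8) > int_power(0.5, 1e4)  # still growing
```

The closer \(p\) is to \(1\) from above, the larger \(M\) must be before the integral visibly settles, which hints that the boundary can be probed more finely.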
Examples 18 and 19 illustrate that there is a ‘boundary’ between convergent and divergent integrals. One asks whether this boundary can be further refined.
Example 20. Determine whether \(\displaystyle \int _1^\infty \frac {1}{x\, [\ln (x)]^\alpha }\, dx\) converges for various values of \(\alpha \).
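One route (a sketch; note that the stated lower limit makes the integrand singular at \(x = 1\) as well, since \(\ln 1 = 0\)): substitute \(u = \ln x\), \(du = dx/x\), so that

```latex
\int_1^\infty \frac{dx}{x\,[\ln x]^\alpha} = \int_0^\infty \frac{du}{u^\alpha}
```

By Examples 18 and 19 with \(p = \alpha\), the right-hand side converges near \(u = 0\) only when \(\alpha < 1\) and near \(u = \infty\) only when \(\alpha > 1\), so the integral as stated diverges for every \(\alpha\). Starting instead at \(x = e\) (so that \(u\) runs from \(1\)) removes the singularity, and \(\int_e^\infty \frac{dx}{x\,[\ln x]^\alpha}\) converges exactly when \(\alpha > 1\): a genuinely finer boundary than the \(p = 1\) threshold of Example 19.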
2.3 Comparison Test
Theorem 22. If \(0 \leq f(x) \leq g(x)\) on \(\mathcal {A} \subseteq \mathbb {R}\) and \(\int _\mathcal {A} g(x) \, dx\) converges, then \(\int _\mathcal {A} f(x) \, dx\) converges.
Theorem 23. If \(f(x) \geq g(x) \geq 0\) on \(\mathcal {A} \subseteq \mathbb {R}\) and \(\int _\mathcal {A} g(x) \, dx\) diverges, then \(\int _\mathcal {A} f(x) \, dx\) diverges.
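A classic application (a supplementary example, not from the notes): for \(x \geq 1\) we have \(x^2 \geq x\), hence \(0 \leq e^{-x^2} \leq e^{-x}\), and \(\int_1^\infty e^{-x}\,dx\) converges by Example 16. Theorem 22 then gives

```latex
\int_1^\infty e^{-x^2}\, dx \;\leq\; \int_1^\infty e^{-x}\, dx \;=\; e^{-1} \;<\; \infty
```

even though \(e^{-x^2}\) has no elementary antiderivative, so the limit in Definition 15 cannot be evaluated in closed form.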