Let $f$ be in $C^n(B_r(c))$ for $c$ a fixed complex number, $r>0$, and $n$ a positive integer. Define the Taylor polynomial of degree $n$ for $f$ at $c$ to be the polynomial $T^n \equiv T^n_{(f,c)}$ given by the formula
$$T^n_{(f,c)}(z) = \sum_{j=0}^{n} a_j (z-c)^j,$$
where $a_j = f^{(j)}(c)/j!$.
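As an illustration (not part of the text), the coefficients $a_j = f^{(j)}(c)/j!$ can be computed symbolically; the following sketch uses Python with sympy, and the function name `taylor_poly` is our own:

```python
import sympy as sp

def taylor_poly(f, z, c, n):
    """Return T^n_{(f,c)}(z) = sum_{j=0}^n f^{(j)}(c)/j! * (z - c)^j."""
    return sum(sp.diff(f, z, j).subs(z, c) / sp.factorial(j) * (z - c)**j
               for j in range(n + 1))

z = sp.symbols('z')
# Degree-3 Taylor polynomial of e^z at c = 0: 1 + z + z^2/2 + z^3/6
T3 = sp.expand(taylor_poly(sp.exp(z), z, 0, 3))
print(T3)
```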
REMARK If $f$ is expandable in a Taylor series on ${B}_{r}\left(c\right),$ then the Taylor polynomial for $f$ of degree $n$ is nothing but the $n$ th partial sum of the Taylor series for $f$ on ${B}_{r}\left(c\right).$ However, any function that is $n$ times differentiable at a point $c$ has a Taylor polynomial of order $n.$ Functions that are infinitely differentiable have Taylor polynomials of all orders, and we might suspect that these polynomials are some kind of good approximation to the function itself.
Prove that $f$ is expandable in a Taylor series around a point $c$ (with radius of convergence $r>0$) if and only if the sequence $\{T^n_{(f,c)}\}$ of Taylor polynomials converges pointwise to $f$; i.e.,
$$f(z) = \lim_{n\to\infty} T^n_{(f,c)}(z)$$
for all $z$ in $B_r(c)$.
Let $f \in C^n(B_r(c))$. Prove that $f' \in C^{n-1}(B_r(c))$. Prove also that $(T^n_{(f,c)})' = T^{n-1}_{(f',c)}$.
The next theorem is, in many ways, the fundamental theorem of numerical analysis. It clearly has to do with approximating a general function by polynomials. It is a generalization of the Mean Value Theorem, and, as in that case, this theorem holds only for real-valued functions of a real variable.
Let $f$ be a real-valued function on an interval $(c-r,c+r)$, and assume that $f \in C^n((c-r,c+r))$, and that $f^{(n)}$ is differentiable on $(c-r,c+r)$. Then, for each $x$ in $(c-r,c+r)$ there exists a $y$ between $c$ and $x$ such that
$$f(x) = T^n_{(f,c)}(x) + \frac{f^{(n+1)}(y)}{(n+1)!}(x-c)^{n+1}.$$
REMARK If we write $f(x) = T^n_{(f,c)}(x) + R_{n+1}(x)$, where $R_{n+1}(x)$ is the error or remainder term, then this theorem gives a formula, and hence an estimate, for that remainder term. This is the evident connection with numerical analysis.
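To make the estimate concrete (an illustrative sketch, not part of the text): for $f = \sin$ at $c = 0$, every derivative is bounded by $1$, so the theorem gives $|R_{n+1}(x)| \le |x|^{n+1}/(n+1)!$. The following Python sketch, with a hand-coded Taylor polynomial for $\sin$, checks this bound numerically:

```python
import math

def sin_taylor(x, n):
    """Taylor polynomial of degree n for sin at c = 0 (only odd powers appear)."""
    return sum((-1)**k * x**(2*k + 1) / math.factorial(2*k + 1)
               for k in range(n // 2 + 1) if 2*k + 1 <= n)

x, n = 1.0, 7
remainder = abs(math.sin(x) - sin_taylor(x, n))
bound = abs(x)**(n + 1) / math.factorial(n + 1)  # since |sin^(n+1)(y)| <= 1
print(remainder, bound)
assert remainder <= bound
```

The actual error (about $2.7 \times 10^{-6}$ here) sits comfortably under the theoretical bound $1/8!$.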
We prove this theorem by induction on $n$. For $n=0$, this is precisely the Mean Value Theorem. Thus,
$$f(x) = f(c) + f'(y)(x-c) = T^0_{(f,c)}(x) + \frac{f^{(1)}(y)}{1!}(x-c)^1$$
for some $y$ between $c$ and $x$.
Now, assuming the theorem is true for all functions in $C^{n-1}((c-r,c+r))$, let us show it is true for the given function $f \in C^n((c-r,c+r))$. Set $g(x) = f(x) - (T^n_{(f,c)})(x)$ and let $h(x) = (x-c)^{n+1}$. Observe that both $g(c) = 0$ and $h(c) = 0$. Also, if $x \ne c$, then $h(x) \ne 0$. So, by the Cauchy Mean Value Theorem, we have that
$$\frac{g(x)}{h(x)} = \frac{g(x) - g(c)}{h(x) - h(c)} = \frac{g'(w)}{h'(w)}$$
for some $w$ between $c$ and $x$. Now
$$g'(w) = f'(w) - (T^n_{(f,c)})'(w) = f'(w) - T^{n-1}_{(f',c)}(w)$$
(see the preceding exercise), and $h'(w) = (n+1)(w-c)^n$. Therefore,
$$\frac{g(x)}{h(x)} = \frac{f'(w) - T^{n-1}_{(f',c)}(w)}{(n+1)(w-c)^n}.$$
We apply the inductive hypothesis to the function $f'$ (which is in $C^{n-1}((c-r,c+r))$) and obtain
$$f'(w) - T^{n-1}_{(f',c)}(w) = \frac{(f')^{(n)}(y)}{n!}(w-c)^n = \frac{f^{(n+1)}(y)}{n!}(w-c)^n$$
for some $y$ between $c$ and $w$. But this implies that
$$\frac{g(x)}{h(x)} = \frac{f(x) - T^n_{(f,c)}(x)}{(x-c)^{n+1}} = \frac{f^{(n+1)}(y)}{(n+1)!},$$
i.e.,
$$f(x) = T^n_{(f,c)}(x) + \frac{f^{(n+1)}(y)}{(n+1)!}(x-c)^{n+1}$$
for some $y$ between $c$ and $x,$ which finishes the proof of the theorem.
Define $f\left(x\right)=0$ for $x\le 0$ and $f\left(x\right)={e}^{-1/x}$ for $x>0.$ Verify that $f\in {C}^{\infty}\left(R\right),$ that ${f}^{\left(n\right)}\left(0\right)=0$ for all $n,$ and yet $f$ is not expandable in a Taylor series around $0.$ Interpret Taylor's Remainder Theorem for this function. That is, describe the remainder ${R}_{n+1}\left(x\right).$
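As a numerical aside (a sketch, not part of the exercise), one can check symbolically that the one-sided derivatives of $e^{-1/x}$ all vanish at $0$, so every Taylor polynomial of $f$ at $0$ is identically zero and the remainder $R_{n+1}(x)$ is the whole function $f(x)$. Here is a Python/sympy check for the first few orders:

```python
import sympy as sp

x = sp.symbols('x', positive=True)
f = sp.exp(-1/x)

# Every one-sided derivative at 0 vanishes, so T^n_{(f,0)} = 0 for all n
# and the remainder R_{n+1}(x) is all of f(x).
for n in range(5):
    assert sp.limit(sp.diff(f, x, n), x, 0, '+') == 0

print(float(f.subs(x, 1)))  # e^{-1}, nonzero: f is not the zero function
```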
As a first application of Taylor's Remainder Theorem we give the following result, which should be familiar from calculus. It is the generalized version of what's ordinarily called the “second derivative test.”
Let $f$ be a real-valued function in $C^n((c-r,c+r))$, suppose that the $(n+1)$st derivative $f^{(n+1)}$ of $f$ exists everywhere on $(c-r,c+r)$ and is continuous at $c$, and suppose that $f^{(k)}(c) = 0$ for all $1 \le k \le n$ and that $f^{(n+1)}(c) \ne 0$. Then:
(1) If $n$ is even, then $f$ attains neither a local maximum nor a local minimum at $c$.
(2) If $n$ is odd and $f^{(n+1)}(c) < 0$, then $f$ attains a local maximum at $c$.
(3) If $n$ is odd and $f^{(n+1)}(c) > 0$, then $f$ attains a local minimum at $c$.
Since $f^{(n+1)}$ is continuous at $c$, there exists a $\delta > 0$ such that $f^{(n+1)}(y)$ has the same sign as $f^{(n+1)}(c)$ for all $y \in (c-\delta, c+\delta)$. We have by Taylor's Theorem that if $x \in (c-\delta, c+\delta)$ then there exists a $y$ between $x$ and $c$ such that
$$f(x) = T^n_{(f,c)}(x) + \frac{f^{(n+1)}(y)}{(n+1)!}(x-c)^{n+1} = f(c) + \frac{f^{(n+1)}(y)}{(n+1)!}(x-c)^{n+1},$$
since $f^{(k)}(c) = 0$ for $1 \le k \le n$,
from which it follows that
$$f(x) - f(c) = \frac{f^{(n+1)}(y)}{(n+1)!}(x-c)^{n+1}.$$
Suppose $n$ is even. It follows then that if $x<c,$ the sign of ${(x-c)}^{n+1}$ is negative, so that the sign of $f\left(x\right)-f\left(c\right)$ is the opposite of the sign of ${f}^{(n+1)}\left(c\right).$ On the other hand, if $x>c,$ then ${(x-c)}^{n+1}>0,$ so that the sign of $f\left(x\right)-f\left(c\right)$ is the same as the sign of ${f}^{(n+1)}\left(c\right).$ So, $f\left(x\right)>f\left(c\right)$ for all nearby $x$ on one side of $c,$ while $f\left(x\right)<f\left(c\right)$ for all nearby $x$ on the other side of $c.$ Therefore, $f$ attains neither a local maximum nor a local minimum at $c.$ This proves part (1).
Now, if $n$ is odd, the sign of $f\left(x\right)-f\left(c\right)$ is the same as the sign of ${f}^{(n+1)}\left(y\right),$ which is the same as the sign of ${f}^{(n+1)}\left(c\right),$ for all $x\in (c-\delta ,c+\delta ).$ Hence, if ${f}^{(n+1)}\left(c\right)<0,$ then $f\left(x\right)-f\left(c\right)<0$ for all $x\in (c-\delta ,c+\delta ),$ showing that $f$ attains a local maximum at $c.$ And, if ${f}^{(n+1)}\left(c\right)>0,$ then the sign of $f\left(x\right)-f\left(c\right)$ is positive for all $x\in (c-\delta ,c+\delta ),$ showing that $f$ attains a local minimum at $c.$ This proves parts (2) and (3).
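The classification above is mechanical enough to automate. The following Python/sympy sketch (the helper `derivative_test` is our own, not from the text) finds the first nonvanishing derivative at $c$ — of order $n+1$ in the theorem's notation — and applies parts (1)–(3):

```python
import sympy as sp

def derivative_test(f, x, c):
    """Find the first k >= 1 with f^(k)(c) != 0 (so k = n + 1 in the
    theorem, with f^(j)(c) = 0 for 1 <= j <= n) and classify c."""
    k = 1
    while sp.diff(f, x, k).subs(x, c) == 0:
        k += 1
    d = sp.diff(f, x, k).subs(x, c)  # first nonzero derivative at c
    if k % 2 == 1:        # k = n + 1 odd means n even: part (1)
        return 'neither'
    return 'local max' if d < 0 else 'local min'  # parts (2) and (3)

x = sp.symbols('x')
print(derivative_test(x**4, x, 0))   # local min  (n = 3 odd, f''''(0) = 24 > 0)
print(derivative_test(x**3, x, 0))   # neither    (n = 2 even, f'''(0) = 6)
print(derivative_test(-x**6, x, 0))  # local max  (n = 5 odd, f^(6)(0) < 0)
```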