Procedure

Your result and our geometrical intuition suggest the following simple procedure: given $f$ and a starting point $x_0$, approximate a root of $f$ using $$x_{\color{var(--emphColor)}{n+1}} = x_\color{var(--emphColor)}{n} - \frac{f(x_\color{var(--emphColor)}{n})}{f'(x_\color{var(--emphColor)}{n})}.$$

Let's use Newton's method to approximate $\sqrt{2}$.
This is a root-finding problem: $\sqrt{2}$ is the positive root of $f(x) = x^2 - 2$. In HW1 you will write out the recursive relation defining $x_n$ explicitly.
How good is the approximation $x_n$? Will the sequence $x_n$ always converge to a root of $f$?
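Before answering these questions theoretically, it helps to see the iteration run. Here is a minimal sketch in Python for $f(x) = x^2 - 2$; the helper name `newton`, the tolerance, and the starting point are illustrative choices, not prescribed above:

```python
import math

def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton's method: iterate x_{n+1} = x_n - f(x_n)/f'(x_n)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:   # stop once the update is negligible
            break
    return x

# Approximate sqrt(2) as the positive root of f(x) = x^2 - 2.
root = newton(lambda x: x**2 - 2, lambda x: 2 * x, x0=1.0)
print(root, math.sqrt(2))  # the two agree to machine precision
```

Starting from $x_0 = 1$, only a handful of iterations are needed, which already hints that the answer to "how good is $x_n$?" is "very good, very fast" when things go well.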

Let's get technical

The convergence theorem for fixed-point iteration schemes is the theoretical backbone of this analysis.

A fixed-point iteration scheme takes the form $x_{n+1} = g(x_n)$. Since $$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)},$$ we take $g(x) = x - {f(x) \over f'(x)}$.
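In this recast form we iterate $g$ itself, with no separate reference to $f$ and $f'$. A sketch, where the helper `fixed_point_iteration` and the choice $f(x) = x^2 - 2$ are illustrative assumptions:

```python
def fixed_point_iteration(g, x0, n):
    """Iterate x_{k+1} = g(x_k), n times, starting from x0."""
    x = x0
    for _ in range(n):
        x = g(x)
    return x

# Newton's method for f(x) = x^2 - 2, recast with
# g(x) = x - f(x)/f'(x) = x - (x^2 - 2)/(2x):
g = lambda x: x - (x**2 - 2) / (2 * x)
approx = fixed_point_iteration(g, 1.0, 6)
print(approx)  # ≈ sqrt(2)
```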

This is one of the harder convergence proofs we'll discuss in detail!

Root of $f \leftrightarrow$ fixed point of $g$

If $p$ is a simple root of $f$, then $f(p) = 0$ while $f'(p) \neq 0$, and therefore $$g(\color{var(--emphColor)}{p}) = p - {f(p) \over f'(p)} = \color{var(--emphColor)}{p}.$$

So we find a root of $f$ by approximating a fixed point of $g$.

To guarantee convergence of Newton's method as a fixed point iteration scheme, we need to:

  1. Find an interval $I$ containing $p$ such that $g$ is continuous on $I$.
  2. Show $g$ is differentiable on $I$ and $|g'(x)| \leq k < 1$ for some $k$ and every $x \in I$.
  3. Ensure that $g : I \to I$. That is, the image $g(I)$ of $I$ under $g$ is contained in $I$.
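These conditions can be spot-checked numerically for a concrete example. Below, the interval $[1.2, 1.6]$ around $p = \sqrt{2}$ and the grid of sample points are illustrative assumptions; a grid check suggests the conditions hold but is of course not a proof:

```python
import numpy as np

# For f(x) = x^2 - 2 we get g(x) = x - f(x)/f'(x) = x/2 + 1/x,
# with fixed point p = sqrt(2).
g = lambda x: x / 2 + 1 / x
gprime = lambda x: 0.5 - 1 / x**2   # g'(x), computed by hand

# Sample interval I = [1.2, 1.6] around p (an illustrative choice).
a, b = 1.2, 1.6
xs = np.linspace(a, b, 1001)

k = np.max(np.abs(gprime(xs)))                     # condition 2: want k < 1
image = g(xs)
maps_into = a <= image.min() and image.max() <= b  # condition 3: g(I) ⊆ I
print(k, maps_into)  # ≈ 0.194 True
```

Here $k \approx 0.19 < 1$ and the sampled image of $I$ stays inside $I$, so all three conditions appear to hold on this interval.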

Continuity of $g$

When $f$ is continuously differentiable and $p$ is a simple root, so $f(p) = 0$ but $f'(p) \neq 0$, it is clear that $$g(x) = x - {f(x) \over f'(x)}$$ is continuous for $x$ near $p$.

Technically, we say $g$ is continuous in a neighborhood of $p$.

Continuity of $g$

In particular, since $f'$ is continuous and $f'(p) \neq 0$, there exists $\delta_1 > 0$ such that $f'(x) \neq 0$ for all $x$ in the interval $I' = (p - \delta_1, p + \delta_1)$. So $g$ is continuous on this interval around $p$.

To guarantee convergence of Newton's method as a fixed point iteration scheme, we need to:

  1. Find an interval $I$ containing $p$ such that $g$ is continuous on $I$. $\color{var(--emphColor)}{\checkmark}$
  2. Show $g$ is differentiable on $I$ and $|g'(x)| \leq k < 1$ for some $k$ and every $x \in I$.
  3. Ensure that $g : I \to I$. That is, the image $g(I)$ of $I$ under $g$ is contained in $I$.

We've used $I'$ for our interval so far because we'll need a smaller interval $I \subset I'$ to satisfy all three conditions.

Bounded growth

Continue assuming $f'(p) \neq 0$ and suppose further that $f$ is twice continuously differentiable. Then

\begin{align} g'(x) &= 1 - {\color{var(--emphColor)}{f'(x)f'(x)} - f(x)f''(x) \over \color{var(--emphColor)}{[f'(x)]^2}} \\ &= {f(x)f''(x) \over [f'(x)]^2} \end{align} is also continuous on the interval $I' = (p - \delta_1, p + \delta_1)$.

$ g'(x) = {f(x)f''(x) \over [f'(x)]^2}$
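As a sanity check, we can compare this closed form for $g'$ against a central finite difference of $g$ for a concrete $f$. The choice $f(x) = x^2 - 2$, the sample point, and the step size are all illustrative:

```python
# Check g'(x) = f(x) f''(x) / [f'(x)]^2 against a central
# difference of g, using f(x) = x^2 - 2 (so f'(x) = 2x, f''(x) = 2).
f = lambda x: x**2 - 2
fp = lambda x: 2 * x
fpp = lambda x: 2.0
g = lambda x: x - f(x) / fp(x)

x, h = 1.3, 1e-6
formula = f(x) * fpp(x) / fp(x) ** 2
numeric = (g(x + h) - g(x - h)) / (2 * h)
print(abs(formula - numeric))  # tiny (dominated by floating-point error)
```

Notice also what the formula says at the root itself: since $f(p) = 0$, we get $g'(p) = 0$, so near $p$ the contraction constant $k$ is not just less than $1$ but close to $0$.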