Your result and our geometrical intuition suggest the following simple procedure: given $f$ and a starting point $x_0$, approximate a root of $f$ using $$x_{\color{var(--emphColor)}{n+1}} = x_\color{var(--emphColor)}{n} - \frac{f(x_\color{var(--emphColor)}{n})}{f'(x_\color{var(--emphColor)}{n})}.$$
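The iteration above translates directly into a few lines of code. This is a minimal sketch, assuming the example function $f(x) = x^2 - 2$ (so the root is $\sqrt{2}$); the names `newton`, `f`, and `fprime` are illustrative, not from the text.

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Approximate a root of f starting from x0 via x_{n+1} = x_n - f(x_n)/f'(x_n)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)   # Newton step f(x_n)/f'(x_n)
        x -= step
        if abs(step) < tol:       # stop once successive iterates agree
            break
    return x

root = newton(lambda x: x**2 - 2, lambda x: 2 * x, x0=1.0)
```

Starting from $x_0 = 1$, the iterates $1,\ 1.5,\ 1.41\overline{6},\ \ldots$ rapidly approach $\sqrt{2}$.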
The convergence theorem for fixed point iteration schemes is the theoretical backbone of this analysis.
This is one of the harder convergence proofs we'll discuss in detail!
If $p$ is a simple root of $f$, then $f(p) = 0$ and therefore $$g(\color{var(--emphColor)}{p}) = p - {f(p) \over f'(p)} = \color{var(--emphColor)}{p}.$$
So we find a root of $f$ by approximating a fixed point of $g$.
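To see this equivalence in action, here is a short sketch that iterates $g$ as a plain fixed point scheme, again with the assumed example $f(x) = x^2 - 2$:

```python
def g(x):
    """g(x) = x - f(x)/f'(x) for the illustrative choice f(x) = x**2 - 2."""
    f_val, fp_val = x**2 - 2, 2 * x
    return x - f_val / fp_val

x = 1.0
for _ in range(20):
    x = g(x)   # fixed point iteration: x_{n+1} = g(x_n)
```

The limit satisfies $g(x) = x$, which (for a simple root, where $f'(x) \neq 0$) forces $f(x) = 0$: the fixed point of $g$ is exactly a root of $f$.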
To guarantee convergence of Newton's method as a fixed point iteration scheme, we need to (1) show that $g$ is continuous on an interval around $p$, and (2) show that $|g'(x)| \leq k < 1$ for $x$ near $p$.
When $f$ is differentiable and $p$ is a simple root, so $f(p) = 0$ but $f'(p) \neq 0$, it is clear that $$g(x) = x - {f(x) \over f'(x)}$$ is continuous for $x$ near $p$.
Continuing to assume $f'(p) \neq 0$, suppose further that $f$ is twice continuously differentiable. Then
\begin{align} g'(x) &= 1 - {\color{var(--emphColor)}{f'(x)f'(x)} - f(x)f''(x) \over \color{var(--emphColor)}{[f'(x)]^2}} \\ &= {f(x)f''(x) \over [f'(x)]^2} \end{align} is also continuous on the interval $I' = (p - \delta_1, p + \delta_1)$.
In particular, since $f(p) = 0$, $$g'(p) = {f(p)f''(p) \over [f'(p)]^2} = 0,$$ so by continuity of $g'$ there is an interval around $p$ on which $|g'(x)| \leq k < 1$, and the fixed point convergence theorem guarantees that Newton's method converges for starting points in that interval.
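We can check this numerically for the assumed example $f(x) = x^2 - 2$, where $f''(x) = 2$ and the root is $p = \sqrt{2}$: $g'$ vanishes at the root and stays well below $1$ in magnitude nearby.

```python
def gprime(x):
    """g'(x) = f(x) f''(x) / [f'(x)]**2 for the example f(x) = x**2 - 2."""
    f_val, fp_val, fpp_val = x**2 - 2, 2 * x, 2.0
    return f_val * fpp_val / fp_val**2

p = 2**0.5
# |g'| at the root and at nearby points; all should be well below 1
vals = [abs(gprime(p + h)) for h in (-0.1, -0.01, 0.0, 0.01, 0.1)]
```

Since $|g'(x)| \leq k < 1$ holds on a neighborhood of $p$, the iteration contracts toward the root there.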