Numerical differentiation

GUIDING QUESTION: How can I compute (numerically) the derivative of a function?

Motivation

We'll focus on the following two questions.
  1. Given a smooth function $f$ defined by a data table, how can I approximate the value of $f'$ at a point?
  2. Given a function $f$, or even just an oracle evaluating $f$, how can I approximate values of $f'$ by linear combinations of values of $f$?
    This situation is common in applications with complicated functions, such as the force fields in molecular dynamics simulations or deep neural networks.
    That said, ML frameworks like TensorFlow can also evaluate gradients exactly via automatic differentiation.

Approaches

There are two main approaches.
  1. Given a data table, construct the interpolating polynomial (or spline) and evaluate the polynomial's derivative analytically.
    The exact derivative of the polynomial interpolant serves to approximate the derivative of the function defined by the data.
  2. Use a linear system!
    Use Taylor's theorem to find which linear combinations of function values approximate the function derivative at a given point.
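The second approach can be sketched in a few lines: matching Taylor coefficients at the expansion point turns the weight-finding problem into a small linear system. Here is a minimal Python sketch (the function name `fd_weights` and the use of NumPy are my own choices, not from the notes): the $k$-th row of the system demands $\sum_j c_j (x_j - \bar x)^k / k! = \delta_{k1}$.

```python
import math

import numpy as np

# Sketch: find weights c_j such that sum_j c_j f(x_j) ≈ f'(xbar),
# by matching Taylor coefficients about xbar:
#   sum_j c_j (x_j - xbar)**k / k!  must equal  1 if k == 1, else 0,
# for k = 0, ..., n (one condition per node).
def fd_weights(nodes, xbar):
    nodes = np.asarray(nodes, dtype=float)
    n = len(nodes)
    A = np.array([[(x - xbar) ** k / math.factorial(k) for x in nodes]
                  for k in range(n)])
    b = np.zeros(n)
    b[1] = 1.0  # target the first-derivative Taylor coefficient
    return np.linalg.solve(A, b)

# On the nodes {-h, 0, h} this recovers the central-difference weights
# [-1/(2h), 0, 1/(2h)].
h = 0.1
print(fd_weights([-h, 0.0, h], 0.0))
```

The same system with a different right-hand side (a $1$ in row $k$) yields weights for the $k$-th derivative instead.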

Via interpolant

Recall that the Lagrange form of the polynomial interpolating the data points $(x_0, f_0), \ldots,(x_n,f_n)$ is given by $$p(x) =\sum^n_{j=0} f_jL_{j}(x).$$
So we approximate $$f'(x_i) \color{var(--emphColor)}{\approx} p'(x_i) = \sum^n_{j=0}f_jL'_{j}(x_i).$$

Setting $d_{ij} = L'_{j}(x_i)$, we have \begin{align} \begin{bmatrix} f'(x_0)\\ \vdots\\ f'(x_n) \end{bmatrix} \color{var(--emphColor)}{\approx} \begin{bmatrix} p'(x_0)\\ \vdots\\ p'(x_n) \end{bmatrix} = \begin{bmatrix} \sum_j d_{0,j}f_j\\ \vdots\\ \sum_j d_{n,j}f_j \end{bmatrix} =D\mathbf{f}, \end{align} with $\mathbf{f} = [f_0\cdots f_n]^T$.

A direct calculation with the barycentric weights $w_j = 1/\prod_{k\not=j}(x_j - x_k)$ shows that \begin{align} d_{ij} = L'_{j}(x_i)= \begin{cases} \dfrac{w_j/w_i}{x_i - x_j}, & i\not=j,\\ -\sum_{k\not=i} d_{ik}, & i =j. \end{cases} \end{align}
Try to show this!
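A quick way to convince yourself numerically: build $D$ from this formula and apply it to samples of a polynomial of degree at most $n$, for which the interpolant — and hence $D\mathbf f$ — is exact. A minimal Python sketch (the function name `diff_matrix` is my own; the weights $w_j$ are as defined above):

```python
import numpy as np

# Sketch: assemble the differentiation matrix D with d_ij = L_j'(x_i),
# using the barycentric weights w_j = 1 / prod_{k != j} (x_j - x_k).
def diff_matrix(x):
    x = np.asarray(x, dtype=float)
    n = len(x)
    w = np.array([1.0 / np.prod([x[j] - x[k] for k in range(n) if k != j])
                  for j in range(n)])
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                D[i, j] = (w[j] / w[i]) / (x[i] - x[j])
        # Since sum_j L_j(x) = 1, each row of D must sum to zero,
        # which pins down the diagonal entry.
        D[i, i] = -np.sum(D[i])
    return D

# D is exact on polynomials of degree <= n: differentiate f(x) = x**2.
x = np.array([0.0, 0.5, 1.3, 2.0])
print(diff_matrix(x) @ x**2)  # ≈ 2x
```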
Let $n = 2$, consider the three equally spaced nodes $x_0,x_1$, and $x_2$, and set $h = x_{j+1} - x_j$.
The interpolating polynomial is given by $$p(x) = \frac{(x-x_1)(x-x_2)}{(x_0-x_1)(x_0-x_2)} \color{var(--emphColor)}{f_0} + \frac {(x-x_0)(x-x_2)} {(x_1-x_0)(x_1-x_2)} \color{var(--emphColor)}{f_1} + \frac{(x-x_0)(x-x_1)}{(x_2-x_0)(x_2-x_1)}\color{var(--emphColor)}{f_2}, $$
and \begin{align} p'(x)&= \frac{1}{2h^2}\big[(2x-x_1-x_2)\color{var(--emphColor)}{f_0} - 2(2x - x_0 -x_2) \color{var(--emphColor)}{f_1} +(2x- x_0- x_1)\color{var(--emphColor)}{f_2}\big]. \end{align}
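Evaluating $p'$ at the three nodes recovers the standard three-point difference formulas: \begin{align} p'(x_0) = \frac{-3f_0 + 4f_1 - f_2}{2h}, \qquad p'(x_1) = \frac{f_2 - f_0}{2h}, \qquad p'(x_2) = \frac{f_0 - 4f_1 + 3f_2}{2h}. \end{align} The middle formula is the familiar central difference; the outer two are one-sided, which makes them useful at the endpoints of a data table.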