4.2. Root finding methods#

4.2.1. Fixed point iteration#

If we want to numerically solve \(f(x) = 0\), rearrange the equation into the form \(x = g(x)\). Then, choose an initial guess \(x_0\) and iterate:

(4.4)#\[\begin{equation} x_{n+1} = g(x_n) \end{equation}\]

until \(x_{n+1} = x_n\) to within the desired precision (error tolerance).

Example: Using fixed point iteration

To solve

(4.5)#\[\begin{equation} e^{-x^2} - x = 0 \end{equation}\]

First rearrange

(4.6)#\[\begin{align} x = e^{-x^2} = g(x) \end{align}\]

Then, guess \(x_0 = 0\) and set up a table:

| \(n\) | \(x_n\) | \(g(x_n)\) |
|-------|---------|------------|
| 0     | 0       | 1.000      |
| 1     | 1.000   | 0.368      |
| 2     | 0.368   | 0.873      |
| ⋮     | ⋮       | ⋮          |
| 51    | 0.653   | 0.6528     |

At each iteration, \(g(x_n)\) is used as the next \(x_n\), until at \(n = 51\) the iterates agree: \(x_n = 0.6530 \approx 0.6528 = g(x_n)\).
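This iteration is straightforward to code. Below is a minimal Python sketch (the function name `fixed_point`, the tolerance, and the iteration cap are illustrative choices, not from the text):

```python
import math

def fixed_point(g, x0, tol=1e-4, max_iter=200):
    """Iterate x_{n+1} = g(x_n) until successive iterates agree within tol."""
    x = x0
    for _ in range(max_iter):
        x_new = g(x)
        if abs(x_new - x) < tol:  # converged to the desired precision
            return x_new
        x = x_new
    raise RuntimeError("fixed point iteration did not converge")

# Solve e^{-x^2} - x = 0 via the rearrangement g(x) = e^{-x^2}
root = fixed_point(lambda x: math.exp(-x**2), x0=0.0)
print(root)  # close to 0.6529
```

Note that the iterates oscillate around the root, so convergence to this tolerance takes dozens of iterations, consistent with the table above.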

The choice of rearrangement of \(f(x)\) is not unique, and some choices work better than others: convergence is guaranteed only when \(|g'(x)| \le k < 1\) for all \(x\) in an interval around the solution. Experience will guide your choice of rearrangement.
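For the example above we can check this condition numerically: with \(g(x) = e^{-x^2}\), the derivative is \(g'(x) = -2x\,e^{-x^2}\), and near the root \(x \approx 0.653\) its magnitude is safely below 1. A quick Python check:

```python
import math

# g(x) = exp(-x^2)  =>  g'(x) = -2x * exp(-x^2)
def g_prime(x):
    return -2 * x * math.exp(-x**2)

# Near the root x ~ 0.653, |g'(x)| ~ 0.85 < 1, so the iteration converges,
# though slowly, since 0.85 is close to 1
print(abs(g_prime(0.653)))
```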

Example: Rearranging functions

To solve

(4.7)#\[\begin{equation} e^x - x^3 = 0 \end{equation}\]

One possible rearrangement is

(4.8)#\[\begin{align} e^x &= x^3 \\ x &= \ln(x^3) = 3 \ln x = g(x) \end{align}\]

Note that this method may fail spectacularly! In that case, choosing a better initial guess or rearrangement may help. Additionally, you can try to improve the stability of the method by slowly “mixing” solutions:

(4.9)#\[\begin{align} x_{n+1}^* &= g(x_n) \\ x_{n+1} &= \alpha x_{n+1}^* + (1-\alpha) x_n \end{align}\]

where \(\alpha\) is a mixing parameter. Smaller values of \(\alpha\) mix more slowly, which can be more stable but also requires more iterations.
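The damped update in eq. (4.9) can be sketched as follows (assuming Python; the helper name and tolerance are illustrative):

```python
import math

def mixed_fixed_point(g, x0, alpha=0.5, tol=1e-6, max_iter=1000):
    """Fixed point iteration with mixing: blend the new iterate with the old one."""
    x = x0
    for _ in range(max_iter):
        x_star = g(x)                              # raw fixed point update
        x_new = alpha * x_star + (1 - alpha) * x   # mix with the previous iterate
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    raise RuntimeError("mixed iteration did not converge")

# With alpha = 0.5, the oscillation of g(x) = exp(-x^2) is damped
# and the iteration converges in far fewer steps than plain iteration
root = mixed_fixed_point(lambda x: math.exp(-x**2), x0=0.0, alpha=0.5)
```

Here the mixing turns the slowly oscillating iteration into a rapidly converging one, because the effective slope near the root, \(\alpha g'(x) + (1-\alpha)\), is much smaller in magnitude than \(g'(x)\) itself.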

4.2.3. Newton-Raphson method#

Although we cannot immediately solve the nonlinear equation \(f(x) = 0\), we can first linearize it, then solve the linear problem.

(4.11)#\[\begin{align} f(x) &\approx f(x_0) + f'(x_0)(x - x_0) = 0 \\ x &\approx x_0 - \frac{f(x_0)}{f'(x_0)} \end{align}\]

Iterating this process gives the Newton-Raphson method of root finding:

(4.12)#\[x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}\]

The algorithm for the Newton-Raphson method is:

  1. Guess \(x_0\) and set \(n = 0\).

  2. Compute \(f(x_n)\) and \(f'(x_n)\). If \(f(x_n)\) is “close” to zero, stop.

  3. Update \(x_{n+1}\) using eq. (4.12) then return to step 2.

*Figure: Newton-Raphson method.*

Note that this method can converge much more rapidly than the fixed-point or bisection methods. However, it will fail if \(f'(x_n) = 0\).
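The three steps above translate directly into code. A minimal Python sketch (names and tolerances are illustrative), including a guard for the \(f'(x_n) = 0\) failure mode:

```python
def newton_raphson(f, fprime, x0, tol=1e-10, max_iter=50):
    """Newton-Raphson iteration: x_{n+1} = x_n - f(x_n)/f'(x_n)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:          # stop when f(x_n) is "close" to zero
            return x
        dfx = fprime(x)
        if dfx == 0:               # the method fails if f'(x_n) = 0
            raise ZeroDivisionError("f'(x_n) = 0: Newton-Raphson step undefined")
        x = x - fx / dfx
    raise RuntimeError("did not converge")

# Solve x^2 - 2 = 0 starting from x0 = 1
root = newton_raphson(lambda x: x**2 - 2, lambda x: 2 * x, x0=1.0)
print(root)  # close to sqrt(2) = 1.41421356...
```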

Example: Newton-Raphson method

Solve \(x^2 = 2\).


We can rewrite this as \(f(x) = x^2-2 = 0\). We will also evaluate the derivative \(f'(x) = 2x\). Let the initial guess be \(x_0 = 1\).

| \(n\) | \(x_n\)   | \(f(x_n)\)                | \(f'(x_n)\) | \(x_{n+1}\) |
|-------|-----------|---------------------------|-------------|-------------|
| \(0\) | \(1.0\)   | \(-1.0\)                  | \(2.0\)     | \(1.5\)     |
| \(1\) | \(1.5\)   | \(0.25\)                  | \(3.0\)     | \(1.417\)   |
| \(2\) | \(1.417\) | \(6.94 \times 10^{-3}\)   | \(2.833\)   | \(1.414\)   |

This is close to the known value of \(\sqrt{2}\)!

Example: Newton-Raphson method 2

Solve \(e^{-x^2} - x = 0\).


We define \(f(x) = e^{-x^2} - x\) and calculate \(f'(x) = -2x e^{-x^2} - 1\). We choose an initial guess \(x_0 = 0\).

| \(n\) | \(x_n\)    | \(f(x_n)\)   | \(f'(x_n)\) | \(x_{n+1}\) |
|-------|------------|--------------|-------------|-------------|
| \(0\) | \(0\)      | \(1.0\)      | \(-1.0\)    | \(1.0\)     |
| \(1\) | \(1.0\)    | \(-0.6321\)  | \(-1.736\)  | \(0.6358\)  |
| \(2\) | \(0.6358\) | \(0.03164\)  | \(-1.849\)  | \(0.6529\)  |

Note the rapid convergence compared to the methods above!