# 2.290 - Root Finding - keiji's notes
## Order of Convergence
$
e_{n} = x_{n} - x^{e}
$
where $x^{e}$ is the exact root.
A method converges with order $p$ if there exists a constant $C \neq 0$ such that
$
\lim_{ n \to \infty } \frac{\lvert e_{n+1} \rvert}{\lvert e_{n} \rvert ^{p}}=C
$
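Since $\lvert e_{n+1} \rvert \approx C \lvert e_{n} \rvert^{p}$ for large $n$, the order can be estimated from three successive errors via $p \approx \log(e_{n+1}/e_{n}) / \log(e_{n}/e_{n-1})$. A minimal sketch; the test function $x^2 - 2$, the Newton update, and the starting point are illustrative choices, not from the notes:

```python
import math

root = 2 ** 0.5                              # exact root of f(x) = x^2 - 2
xs = [1.5]                                   # illustrative starting guess
for _ in range(4):
    x = xs[-1]
    xs.append(x - (x * x - 2) / (2 * x))     # Newton step for f(x) = x^2 - 2

errs = [abs(x - root) for x in xs]           # e_n = |x_n - x^e|
# |e_{n+1}| ~ C |e_n|^p  =>  p ~ log(e_{n+1}/e_n) / log(e_n/e_{n-1})
p = math.log(errs[3] / errs[2]) / math.log(errs[2] / errs[1])
```

For Newton's method the estimate comes out close to 2, matching quadratic convergence.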
## Bracketing Methods
| Name | $x_{n+1}$ | Convergence | Error | Convergence Condition |
| --------- | ------------------------------------------------- | ------------- | ----------- | ----------------------------------- |
| Bisection | $\frac{a_n + b_n}{2}$ | ${} p=1 {}$ | $O(2^{-n})$ | Opposite signs on $f(a)$ and $f(b)$ |
| False Pos | $b_n - \frac{f(b_n)(a_n - b_n)}{f(a_n) - f(b_n)}$ | ${} 1<p<2 {}$ | $O(\log n)$ | Opposite signs on $f(a)$ and $f(b)$ |
### Bisection
- Start with an interval $[a,b]$ where $f(a)$ and $f(b)$ have opposite signs
- Keep cutting interval in half until desired accuracy reached
- Reliable but slow
| Iteration Formula | Convergence Rate | Error | Convergence Condition |
| ------------------------------- | ---------------- | ----------- | ----------------------------------- |
| $x_{n+1} = \frac{a_n + b_n}{2}$ | ${} p=1 {}$ | $O(2^{-n})$ | Opposite signs on $f(a)$ and $f(b)$ |
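A minimal sketch of the steps above; the function, the bracket $[1, 2]$, and the tolerance are illustrative choices:

```python
def bisect(f, a, b, tol=1e-10, max_iter=200):
    """Halve [a, b] until the bracket is narrower than tol."""
    if f(a) * f(b) >= 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    while b - a > tol and max_iter > 0:
        m = (a + b) / 2            # cut the interval in half
        if f(a) * f(m) <= 0:       # root lies in [a, m]
            b = m
        else:                      # root lies in [m, b]
            a = m
        max_iter -= 1
    return (a + b) / 2

# Example: the root of x^2 - 2 on [1, 2] is sqrt(2)
root = bisect(lambda x: x * x - 2, 1.0, 2.0)
```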
### False position
Same process as bisection, but instead of halving the interval, use linear interpolation between $a$ and $b$ (the secant line through $(a, f(a))$ and $(b, f(b))$) to estimate the new guess
1. Guaranteed to converge
2. Can get stuck for highly asymmetrical functions
| Iteration Formula | Convergence Rate | Error | Convergence Condition |
| ----------------------------------------------------------- | ---------------- | ----------- | ----------------------------------- |
| $x_{n+1} = b_n - \frac{f(b_n)(a_n - b_n)}{f(a_n) - f(b_n)}$ | ${} 1<p<2 {}$ | $O(\log n)$ | Opposite signs on $f(a)$ and $f(b)$ |
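A minimal sketch using the iteration formula above, under the same bracketing assumptions; the function, bracket, and tolerance are illustrative choices:

```python
def false_position(f, a, b, tol=1e-12, max_iter=200):
    if f(a) * f(b) >= 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    c = a
    for _ in range(max_iter):
        # intersection of the secant line through (a, f(a)), (b, f(b)) with y = 0
        c = b - f(b) * (a - b) / (f(a) - f(b))
        if abs(f(c)) < tol:
            break
        if f(a) * f(c) < 0:        # root lies in [a, c]
            b = c
        else:                      # root lies in [c, b]
            a = c
    return c

r = false_position(lambda x: x * x - 2, 1.0, 2.0)
```

Note that for a convex function like this, one endpoint of the bracket never moves, which is why false position can stall on highly asymmetric functions.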
## Open Methods
| Name | $x_{n+1}$ | Convergence | Error | Convergence Condition |
| -------------- | ------------------------------------------------------------------- | --------------- | ---------------- | ----------------------------- |
| Fixed-Point    | $g(x_{n})$                                                    | ${} p=1 {}$     | ${} O(n) {}$     | $\|g'(x)\|<1$ for x near root |
| Newton-Raphson | $x_{n} - \frac{f(x_{n})}{f'(x_{n})}$                          | ${} p=2 {}$     | $O(\log \log n)$ | Nonzero slope near zeros      |
| Secant         | $x_{n} - \frac{f(x_{n})(x_{n-1}-x_{n})}{f(x_{n-1})-f(x_{n})}$ | $p \approx 1.6$ | $O(\log \log n)$ | Nonzero slope near zeros      |
### Fixed-Point Iteration
Rearrange $f(x)=0$ into the form $x=g(x)$ and iterate $x_{n+1}=g(x_{n})$
| Iteration Formula | Convergence Rate | Error | Convergence Condition |
| ------------------------ | ---------------- | ------------ | ------------------------------------- |
| $x_{n+1} = g(x_{n})$     | ${} p=1 {}$      | ${} O(n) {}$ | $\|g'(x)\|<1$ for all x near the root |
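A minimal sketch; the rearrangement $g$ and the starting point are illustrative choices, not from the notes:

```python
def fixed_point(g, x0, tol=1e-12, max_iter=200):
    x = x0
    for _ in range(max_iter):
        x_new = g(x)
        if abs(x_new - x) < tol:   # successive iterates have stopped moving
            return x_new
        x = x_new
    return x

# f(x) = x^2 - 2 rearranged as x = (x + 2/x)/2; |g'(x)| < 1 near sqrt(2)
root = fixed_point(lambda x: (x + 2 / x) / 2, 1.0)
```

The rearrangement matters: $g(x) = 2/x$ has the same fixed point $\sqrt{2}$, but $\lvert g'(\sqrt{2}) \rvert = 1$, so that iteration oscillates between two values instead of converging.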
### Newton-Raphson
Find the root of the linearization of $f$ at the current iterate $x_{n}$
The derivative (the Jacobian, in higher dimensions) $f'(x)$ can be difficult to calculate
| Iteration Formula | Convergence Rate | Error | Convergence Condition |
| ---------------------------------------------- | ---------------- | ---------------- | -------------------------------- |
| $x_{n+1} = x_{n} - \frac{f(x_{n})}{f'(x_{n})}$ | ${} p=2 {}$      | $O(\log \log n)$ | No zero slope regions near zeros |
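A minimal sketch of the update above; the function, its derivative, and the starting point are illustrative choices:

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)        # root of the linearization at x
        x = x - step
        if abs(step) < tol:
            break
    return x

root = newton(lambda x: x * x - 2, lambda x: 2 * x, 1.0)
```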
### Secant
Approximate the Jacobian $f'(x)$ with a backward finite difference, at the cost of storing the previous iterate
$
f'(x_{n}) =\frac{ f(x_{n}) -f(x_{n-1})}{x_{n} - x_{n-1}}
$
| Iteration Formula | Convergence Rate | Error | Convergence Condition |
| ----------------------------------------------------------------------- | ---------------- | ---------------- | -------------------------------- |
| $x_{i+1} = x_{i} - \frac{f(x_{i})(x_{i-1}-x_{i})}{f(x_{i-1})-f(x_{i})}$ | $p \approx 1.6$  | $O(\log \log n)$ | No zero slope regions near zeros |
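A minimal sketch: the finite-difference slope replaces $f'(x)$ in the Newton update. The function and the two starting points are illustrative choices:

```python
def secant(f, x0, x1, tol=1e-12, max_iter=100):
    f0, f1 = f(x0), f(x1)
    for _ in range(max_iter):
        # Newton step with the backward finite-difference slope
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x2 - x1) < tol:
            return x2
        x0, f0 = x1, f1            # keep only the previous iterate (the storage cost)
        x1, f1 = x2, f(x2)
    return x1

r = secant(lambda x: x * x - 2, 1.0, 2.0)
```

Unlike Newton-Raphson, no derivative function is needed; the trade-off is storing one extra iterate and a slightly lower order of convergence ($p \approx 1.6$, the golden ratio, versus $p = 2$).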