Motivation: The Quake Fast Inverse Square Root¶
Before diving into theory, consider this legendary piece of code from Quake III Arena (1999):
```c
float Q_rsqrt(float x) {
    long i;
    float x2, y;
    x2 = x * 0.5F;
    y = x;
    i = *(long*)&y;                // interpret float bits as integer
    i = 0x5f3759df - (i >> 1);     // magic!
    y = *(float*)&i;
    y = y * (1.5F - (x2 * y * y)); // Newton refinement
    return y;
}
```

This computes $1/\sqrt{x}$, a function that cannot be evaluated directly using just basic arithmetic. How do we turn this into something computable?
Translating to a Root-Finding Problem¶
Why translate to root finding? Because you already know an algorithm for finding roots: Newton’s method from calculus.
Newton’s Method Refresher¶
Recall: to solve $f(x) = 0$, approximate $f$ by its tangent line at the current iterate $x_n$ and find where it crosses zero:

$$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}$$
Applying Newton's method to $f(y) = \dfrac{1}{y^2} - x$, whose root is $y = 1/\sqrt{x}$, the update simplifies to

$$y_{n+1} = y_n \left( \frac{3}{2} - \frac{x}{2}\, y_n^2 \right),$$

which is exactly the refinement line `y = y * (1.5F - (x2 * y * y))` in the code. The floating-point bit trick provides a clever initial guess, and one Newton iteration refines it to sufficient accuracy.
The Root-Finding Problem¶
Given a nonlinear function $f : \mathbb{R} \to \mathbb{R}$, find $x^*$ such that $f(x^*) = 0$.
Examples:
has root
has roots
has a root near
For most nonlinear equations, we cannot find closed-form solutions. We need a different approach.
The Big Picture: Iterative Methods¶
The Quake example illustrates a fundamental pattern in scientific computing: start from an approximate guess, then repeatedly apply an update rule that improves it until the answer is good enough.
The key questions for any iterative method are:
Does it converge? Under what conditions?
How fast? How many iterations do we need?
How sensitive is the answer? (This is where condition numbers come in.)
A Unifying Principle¶
Newton’s method, bisection, and fixed-point iteration may seem like three unrelated techniques. But they share a common structure: each generates a sequence that (hopefully) converges to the root.
The key insight is that convergence happens when the iteration shrinks errors: if $x^*$ is the root and

$$|x_{k+1} - x^*| \le L\,|x_k - x^*| \quad \text{for some } L < 1,$$

then the sequence converges to $x^*$.
This principle—formalized as the Banach Fixed Point Theorem in Fixed Point Iteration—is one of the most important ideas in numerical analysis. It explains:
Why Newton’s method converges (when it does)
Why bisection always converges (it halves the interval each step)
Why iterative linear solvers like Jacobi and Gauss-Seidel work
The same idea recurs throughout this course: in ODE integrators, optimization algorithms, and beyond.
Learning Outcomes¶
After completing this chapter, you should be able to:
L3.1: Derive Newton’s method from Taylor series.
L3.2: Implement Newton’s method in code.
L3.3: Explain quadratic convergence.
L3.4: Identify when Newton’s method fails.
L3.5: State the Intermediate Value Theorem.
L3.6: Derive the bisection algorithm.
L3.7: Calculate required bisection iterations.
L3.8: Define fixed points and restate root-finding.
L3.9: State the Banach Fixed Point Theorem.
L3.10: Apply convergence conditions.
L3.11: Compute condition numbers for roots.