
The Trapezoidal Rule

The simplest approach: approximate f(x) by a straight line and integrate that.

Single Interval

Approximate f(x) on [a, b] by the straight line through (a, f(a)) and (b, f(b)); integrating that line gives the trapezoidal rule:

\int_a^b f(x)\,dx \approx \frac{b - a}{2}\left(f(a) + f(b)\right)

Local Error Analysis

What is the error of this approximation? Let h = b - a and use Taylor's theorem.

Proof 1

Expand f(x) around the midpoint c = (a+b)/2 using Taylor's theorem:

f(x) = f(c) + f'(c)(x - c) + \frac{f''(\eta(x))}{2}(x - c)^2

Integrating from a to b:

\int_a^b f(x)\,dx = hf(c) + \frac{f''(\eta)}{2}\int_a^b (x-c)^2\,dx = hf(c) + \frac{h^3}{24}f''(\eta)

where we used that \int_a^b (x-c)\,dx = 0 by symmetry, that \int_a^b (x-c)^2\,dx = h^3/12, and the mean value theorem for integrals (valid because the weight (x-c)^2 is nonnegative) to replace f''(\eta(x)) by a single value f''(\eta).

For the trapezoidal approximation, expand f(a) and f(b) around c:

f(a) = f(c) - \frac{h}{2}f'(c) + \frac{h^2}{8}f''(\xi_1)
f(b) = f(c) + \frac{h}{2}f'(c) + \frac{h^2}{8}f''(\xi_2)

Adding and multiplying by h/2:

\frac{h}{2}(f(a) + f(b)) = hf(c) + \frac{h^3}{8}\cdot\frac{f''(\xi_1) + f''(\xi_2)}{2}

Taking the difference, the hf(c) terms cancel, the coefficients combine as 1/24 - 1/8 = -1/12, and the intermediate value theorem combines the f'' terms:

\int_a^b f(x)\,dx - \frac{h}{2}(f(a) + f(b)) = -\frac{h^3}{12}f''(\xi)

for some \xi \in (a, b).

Key observation: The local error is O(h^3), cubic in the interval width.
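
To see the cubic rate in practice, here is a quick numerical check, a minimal sketch with NumPy (the integrand sin, the interval [1, 1+h], and the helper name trap1 are our choices, not part of the text above). Since f'' = -sin here, the error formula predicts error/h^3 \to \sin(1)/12 \approx 0.070:

import numpy as np

def trap1(f, a, b):
    """Single-interval trapezoidal rule: ((b - a)/2) * (f(a) + f(b))."""
    return (b - a) / 2 * (f(a) + f(b))

# Integrate sin over [1, 1 + h]; the exact value is cos(1) - cos(1 + h).
for h in [0.2, 0.1, 0.05]:
    exact = np.cos(1.0) - np.cos(1.0 + h)
    err = exact - trap1(np.sin, 1.0, 1.0 + h)
    print(f"h = {h:<4}  error = {err:.3e}  error/h^3 = {err / h**3:.4f}")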

Composite Trapezoidal Rule

For better accuracy, divide [a, b] into n subintervals of equal width h = (b-a)/n, with nodes x_k = a + kh for k = 0, 1, \ldots, n. Applying the trapezoidal rule on each subinterval gives the composite rule:

T_n(f) = h\left(\tfrac{1}{2}f(x_0) + f(x_1) + \cdots + f(x_{n-1}) + \tfrac{1}{2}f(x_n)\right)

From Local to Global Error

The global error is the total error when approximating the integral over [a, b].

Proof 2

On each subinterval [x_{k-1}, x_k], the local error is:

\int_{x_{k-1}}^{x_k} f(x)\,dx - \frac{h}{2}(f(x_{k-1}) + f(x_k)) = -\frac{h^3}{12}f''(\xi_k)

for some \xi_k \in (x_{k-1}, x_k).

Summing over all n subintervals:

\int_a^b f(x)\,dx - T_n(f) = -\frac{h^3}{12}\sum_{k=1}^{n} f''(\xi_k)

Since f'' \in C[a, b], the average \frac{1}{n}\sum_{k=1}^{n} f''(\xi_k) lies between \min f'' and \max f''. By the intermediate value theorem, there exists \xi \in (a, b) such that:

\frac{1}{n}\sum_{k=1}^{n} f''(\xi_k) = f''(\xi)

Therefore:

\int_a^b f(x)\,dx - T_n(f) = -\frac{h^3}{12} \cdot n \cdot f''(\xi) = -\frac{h^3 n}{12}f''(\xi)

Since n = (b-a)/h:

\int_a^b f(x)\,dx - T_n(f) = -\frac{(b-a)h^2}{12}f''(\xi)
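
Because the error formula involves f''(\xi), it becomes exact whenever f'' is constant. A quick sanity check, a sketch where the test problem is our choice: for f(x) = x^2 on [0, 1], f'' \equiv 2 forces the error to equal -h^2/6 exactly.

import numpy as np

# f(x) = x^2 on [0, 1]: exact integral 1/3, f'' = 2 everywhere,
# so integral - T_n = -(b - a) h^2 / 12 * 2 = -h^2 / 6 exactly.
for n in [4, 8, 16]:
    h = 1.0 / n
    x = np.linspace(0.0, 1.0, n + 1)
    y = x**2
    T_n = h * (0.5 * y[0] + y[1:-1].sum() + 0.5 * y[-1])
    print(f"n = {n:<3} error = {1/3 - T_n:.6e}  predicted = {-h**2 / 6:.6e}")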

Understanding Local vs Global Error

| Error Type | Definition | Trapezoidal Rule |
|---|---|---|
| Local | Error on one subinterval of width h | O(h^3) |
| Global | Total error over [a, b] | O(h^2) |

Why does the order drop from 3 to 2?

The global error accumulates local errors from n \sim 1/h subintervals:

\text{Global error} \sim n \times \text{Local error} \sim \frac{1}{h} \times h^3 = h^2

This is the typical pattern: global order = local order − 1.

Remark 1 (Python Implementation)
import numpy as np

def trapezoidal(f, a, b, n):
    """Composite trapezoidal rule with n subintervals."""
    h = (b - a) / n
    x = np.linspace(a, b, n + 1)  # n + 1 equally spaced nodes
    y = f(x)
    # Endpoints get weight 1/2, interior nodes weight 1.
    return h * (0.5 * y[0] + np.sum(y[1:-1]) + 0.5 * y[-1])
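
As a usage example (a sketch reusing trapezoidal and the numpy import above; the test problem \int_0^1 \sin x\,dx = 1 - \cos 1 is our choice), doubling n should cut the error by a factor of about 4, confirming the O(h^2) global order:

exact = 1.0 - np.cos(1.0)  # integral of sin over [0, 1]
prev = None
for n in [10, 20, 40, 80]:
    err = abs(exact - trapezoidal(np.sin, 0.0, 1.0, n))
    ratio = "" if prev is None else f"  ratio = {prev / err:.2f}"
    print(f"n = {n:<3} error = {err:.3e}{ratio}")
    prev = err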

Higher-Order Methods

By using higher-degree polynomial approximations, we can achieve better accuracy.

Simpson's rule uses a quadratic polynomial through the three points (a, f(a)), (m, f(m)), (b, f(b)) where m = (a+b)/2:

\int_a^b f(x)\,dx \approx \frac{h}{6}\left(f(a) + 4f(m) + f(b)\right)

where h = b - a. This has local error O(h^5) and global error O(h^4), two orders better than the trapezoidal rule.
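
A composite version applies this quadratic fit on successive pairs of subintervals. Here is a minimal sketch (the name composite_simpson is ours; note that with h = (b-a)/n each quadratic now spans two subintervals of width h, which turns the 1/6 factor above into 1/3):

import numpy as np

def composite_simpson(f, a, b, n):
    """Composite Simpson's rule with n subintervals; n must be even."""
    if n % 2 != 0:
        raise ValueError("n must be even")
    h = (b - a) / n
    x = np.linspace(a, b, n + 1)
    y = f(x)
    # Weight pattern (times h/3): 1, 4, 2, 4, ..., 2, 4, 1
    return h / 3 * (y[0] + 4 * y[1:-1:2].sum() + 2 * y[2:-1:2].sum() + y[-1])

Doubling n should now cut the error by a factor of about 16, matching the O(h^4) global order.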

Even higher-order Newton-Cotes formulas exist (using more equally spaced points), though they become unstable at high order. Gaussian quadrature goes further by choosing both the nodes and the weights optimally, and achieves remarkable efficiency. We will explore this in the chapter on interpolation.

Why Integration is Easier Than Differentiation

Remark 2 (Smoothing vs. Roughening)
| Operation | Error behavior |
|---|---|
| Differentiation | Errors amplify (dividing by small h) |
| Integration | Errors average out (summing many terms) |

This is why numerical integration is generally more stable than numerical differentiation. Integration “smooths,” differentiation “roughens.”

From the perspective of conditioning: integration is a well-conditioned operation, since a perturbation of size \epsilon in f changes \int_a^b f\,dx by at most (b-a)\epsilon, while differentiation is ill-conditioned, since the same perturbation can change f' by an arbitrarily large amount. This is why we can often integrate noisy data reliably, but differentiating noisy data is notoriously difficult.
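
A small experiment makes the contrast concrete, a sketch in which the noise level 10^{-3}, the sample count, and the test function sin are our choices: integrating noisy samples stays accurate to roughly the noise level, while finite-difference differentiation of the same samples produces order-one errors.

import numpy as np

rng = np.random.default_rng(0)
n = 1000
x = np.linspace(0.0, 1.0, n + 1)
h = x[1] - x[0]
y = np.sin(x) + 1e-3 * rng.standard_normal(n + 1)  # noisy samples of sin

# Integration: the noise enters through a weighted sum and largely averages out.
T = h * (0.5 * y[0] + y[1:-1].sum() + 0.5 * y[-1])
print("integration error:   ", abs(T - (1.0 - np.cos(1.0))))

# Differentiation: forward differences divide the noise by h, amplifying it.
d = np.diff(y) / h
print("max derivative error:", np.max(np.abs(d - np.cos(x[:-1]))))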

Summary

| Rule | Local Error | Global Error |
|---|---|---|
| Trapezoidal | O(h^3) | O(h^2) |
| Simpson's | O(h^5) | O(h^4) |

Key principle: Global order = local order − 1, because we sum O(1/h) local errors.