Richardson Extrapolation Applied Twice to Accelerate the Convergence of an Estimate
This Demonstration shows Richardson extrapolation applied twice to accelerate the convergence of an estimate.
• top graph: select a function and a desired discretization to estimate the function's integral with the trapezoidal rule, using a total of 4n steps. The error of the estimate is calculated against the output of Mathematica's built-in function NIntegrate.
• middle row of graphs: Richardson extrapolation is applied twice to combine three separate approximations with n, 2n, and 4n steps, respectively, and the error is calculated again. Notice that the total computational effort remains the same, since the coarser grids are subsets of the finest one.
• bottom graph: when the ratio of the two absolute relative errors moves below 1, Richardson extrapolation applied twice gives the better approximation. For any linear function (for example, f(x) = x), both approximations have zero error; in this case, the plotted ratio is defined to be 1.
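The quantities shown in the three graphs can be sketched in a few lines. The following is an illustrative Python version (the Demonstration itself is written in the Wolfram Language); f(x) = e^x on [0, 1] and n = 4 are assumed example choices, and the exact value e − 1 stands in for NIntegrate's output:

```python
import math

def trapezoid(f, a, b, n):
    # Composite trapezoidal rule with n steps (n + 1 nodes).
    h = (b - a) / n
    return h * (0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n)))

def richardson_twice(f, a, b, n):
    # Combine the n-, 2n-, and 4n-step estimates; the derivation in the
    # text gives the weights 1, -20, 64 and the divisor 45.
    A1, A2, A4 = (trapezoid(f, a, b, k * n) for k in (1, 2, 4))
    return (A1 - 20 * A2 + 64 * A4) / 45

# Example choices (the Demonstration lets you pick these interactively):
n = 4
exact = math.e - 1                            # stand-in for NIntegrate's output
plain = trapezoid(math.exp, 0, 1, 4 * n)      # top graph: one 4n-step estimate
extrap = richardson_twice(math.exp, 0, 1, n)  # middle row: same 4n-step effort
# Bottom-graph quantity; the common denominator of the two relative errors
# cancels, so the ratio of absolute errors is the same number.
ratio = abs(extrap - exact) / abs(plain - exact)
```

For this smooth integrand the ratio comes out far below 1: the twice-extrapolated estimate is several orders of magnitude more accurate at the same 16-step cost.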
Let A(h) be an approximation, depending on a positive step size h, of the exact value I of the integral of f, with an error formula of the form

I = A(h) + c_1 h^2 + c_2 h^4 + c_3 h^6 + ⋯,

where the c_i are known constants. For step sizes h/2 and h/4,

I = A(h/2) + c_1 (h/2)^2 + c_2 (h/2)^4 + c_3 (h/2)^6 + ⋯,
I = A(h/4) + c_1 (h/4)^2 + c_2 (h/4)^4 + c_3 (h/4)^6 + ⋯.
To apply Richardson extrapolation twice, multiply the last two equations by -20 and 64, respectively; adding all three equations then makes the two error terms of lowest order disappear:

45 I = A(h) - 20 A(h/2) + 64 A(h/4) + O(h^6),

that is,

I = (A(h) - 20 A(h/2) + 64 A(h/4)) / 45 + O(h^6).
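The multipliers can be checked directly: with the error series above, the h^2 and h^4 coefficients of the combined equation must vanish, while the h^6 coefficient survives. A quick check in plain Python (all divisors are powers of 2, so the arithmetic is exact in floating point):

```python
# Multipliers applied to the h/2 and h/4 equations (the h equation keeps weight 1).
m2, m4 = -20, 64

# Combined coefficients of c_1, c_2, c_3 after adding the three equations:
h2_coeff = 1 + m2 / 4 + m4 / 16      # c_1 term: 1 - 5 + 4 = 0
h4_coeff = 1 + m2 / 16 + m4 / 256    # c_2 term: 1 - 1.25 + 0.25 = 0
h6_coeff = 1 + m2 / 64 + m4 / 4096   # c_3 term survives (0.703125)
weight_sum = 1 + m2 + m4             # the divisor 45 in the final formula
```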
Notice that the approximations A(h/4) and (A(h) - 20 A(h/2) + 64 A(h/4)) / 45 require the same computational effort, since the trapezoidal grids with step sizes h and h/2 are subsets of the grid with step size h/4, yet the errors are O(h^2) and O(h^6), respectively.
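The two error orders can be observed numerically: halving h should shrink the plain trapezoidal error by roughly 2^2 = 4 and the twice-extrapolated error by roughly 2^6 = 64. A sketch of that experiment (again in Python rather than the Wolfram Language, with e^x on [0, 1] as an assumed test integrand):

```python
import math

def trapezoid(f, a, b, n):
    # Composite trapezoidal rule with n steps (n + 1 nodes).
    h = (b - a) / n
    return h * (0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n)))

def richardson_twice(f, a, b, n):
    # Combine n-, 2n-, and 4n-step estimates with weights 1, -20, 64 (divisor 45).
    A1, A2, A4 = (trapezoid(f, a, b, k * n) for k in (1, 2, 4))
    return (A1 - 20 * A2 + 64 * A4) / 45

exact = math.e - 1  # integral of e^x over [0, 1]
# Errors at step size h and h/2 for each method:
e_trap = [abs(trapezoid(math.exp, 0, 1, n) - exact) for n in (8, 16)]
e_rich = [abs(richardson_twice(math.exp, 0, 1, n) - exact) for n in (2, 4)]
trap_factor = e_trap[0] / e_trap[1]   # close to 4  (O(h^2))
rich_factor = e_rich[0] / e_rich[1]   # close to 64 (O(h^6))
```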