
7 Approximation

from pylab import *
from scipy.interpolate import barycentric_interpolate

7.1 Weierstrass Theorem

The Weierstrass theorem says that we can approximate continuous functions by polynomials.

Hence, given a continuous function $f$ on an interval $[a,b]$ and any error tolerance $\epsilon > 0$, there exists a polynomial $p$ such that

$$\max_{x\in[a,b]} |f(x) - p(x)| \le \epsilon$$

i.e., polynomials can provide uniform approximation. This is only an existence result: it does not construct a polynomial achieving the desired error bound, nor does it tell us the degree required.


The proof uses Bernstein polynomials. Take a bounded function $f : [0,1] \to \mathbb{R}$ and define

$$p_n(x) = \sum_{k=0}^n \binom{n}{k} x^k (1-x)^{n-k} f(k/n), \qquad x \in [0,1]$$

If $f$ is continuous at $x$ then

$$\lim_{n \to \infty} p_n(x) = f(x)$$

If $f$ is continuous at every $x \in [0,1]$ then $p_n \to f$ uniformly on $[0,1]$, i.e.,

$$\lim_{n \to \infty} \max_{0 \le x \le 1} |f(x) - p_n(x)| = 0$$

We have written $p_n$ in terms of the Bernstein polynomials

$$B_r^n(x) = \binom{n}{r} x^r (1-x)^{n-r}, \qquad \textrm{where} \qquad \binom{n}{r} = \frac{n!}{r! \, (n-r)!}$$
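To see this convergence in practice, here is a minimal sketch, assuming the pylab environment imported above; the helper `bernstein_approx` and the test function are illustrative choices, not part of the text. It evaluates $p_n$ for the continuous but non-smooth function $f(x) = |x - \frac{1}{2}|$ and prints the maximum error for increasing $n$:

from scipy.special import comb

def bernstein_approx(f, n, x):
    # p_n(x) = sum_k C(n,k) x^k (1-x)^(n-k) f(k/n), evaluated on an array x
    k = arange(n + 1)
    return sum(comb(n, k) * x[:, None]**k * (1 - x[:, None])**(n - k) * f(k / n), axis=1)

g  = lambda x: abs(x - 0.5)   # continuous but not differentiable at x = 1/2
xs = linspace(0, 1, 201)
for n in [5, 20, 80]:
    print(n, abs(g(xs) - bernstein_approx(g, n, xs)).max())

The maximum error decreases with $n$, but only slowly, which is why Bernstein polynomials are mainly of theoretical interest.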

7.2 Taylor expansion

The Taylor expansion provides another way to approximate a function in a neighbourhood of the point about which the expansion is written. A Taylor expansion recovers a polynomial function exactly, so we consider a somewhat less trivial but still simple function, e.g.,

$$f(x) = \exp(x), \qquad x \in [-1,1]$$

The third degree Taylor expansion around $x=0$ is

$$p_3(x) = 1 + x + \frac{1}{2} x^2 + \frac{1}{6} x^3$$

We can also estimate the error from the remainder term

$$\exp(x) - p_3(x) = \frac{1}{24} x^4 \exp(\xi), \qquad \textrm{$\xi$ between $0$ and $x$}$$

and we get the following estimates

$$\frac{1}{24} x^4 \le \exp(x) - p_3(x) \le \frac{\mathrm{e}}{24} x^4, \qquad 0 \le x \le 1$$
$$\frac{1}{24 \mathrm{e}} x^4 \le \exp(x) - p_3(x) \le \frac{1}{24} x^4, \qquad -1 \le x \le 0$$
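As a quick numerical sanity check of these bounds, here is a sketch reusing the pylab setup; the grid resolution and the name `q3` for the Taylor cubic are arbitrary choices:

# Verify x^4/24 <= exp(x) - p3(x) <= e*x^4/24 on [0,1]
# and x^4/(24e) <= exp(x) - p3(x) <= x^4/24 on [-1,0]
q3 = lambda t: 1 + t + t**2/2 + t**3/6       # cubic Taylor polynomial about 0
xp, xm = linspace(0, 1, 101), linspace(-1, 0, 101)
ep, em = exp(xp) - q3(xp), exp(xm) - q3(xm)  # errors on each interval
E = exp(1)                                   # Euler's number
print(all(xp**4/24 <= ep) and all(ep <= E*xp**4/24))
print(all(xm**4/(24*E) <= em) and all(em <= xm**4/24))

Both checks print True on these grids.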

The maximum norm of the error is

$$\max_{-1 \le x \le 1} |\exp(x) - p_3(x)| \approx 0.0516$$

The remainder term does not give a sharp estimate: the bounds above only guarantee a maximum error of $\mathrm{e}/24 \approx 0.113$, about twice the actual value. Plotting the error,

x = linspace(-1,1,100)
f = lambda x: exp(x)                      # function to approximate
p3 = lambda x: 1 + x + x**2/2 + x**3/6    # cubic Taylor polynomial about x = 0
e  = f(x) - p3(x)                         # pointwise error
print("Max error = ", abs(e).max())
plot(x,e)
title('Error in cubic Taylor expansion')
grid(True), xlabel('x'), ylabel('$f(x)-p_3(x)$');
Max error =  0.05161516179237857

we see that it is not uniformly distributed in the interval $[-1,1]$. We have very good approximation near $x=0$ but poor approximation near the end-points of the interval.

7.3 Cubic interpolation

We can get a much better cubic polynomial approximation than the Taylor polynomial. Let us interpolate a cubic polynomial at the nodes $\{-1, -\frac{1}{3}, +\frac{1}{3}, +1\}$, which yields

$$p_3(x) = 0.99519577 + 0.99904923\, x + 0.54788486\, x^2 + 0.17615196\, x^3$$
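These coefficients can be reproduced, for instance, with polyfit, since a degree-3 fit through four distinct points is exact interpolation; this is one possible sketch, not necessarily how the numbers above were obtained:

xi = array([-1, -1/3, 1/3, 1])
c = polyfit(xi, exp(xi), 3)   # degree-3 fit through 4 points: exact interpolation
print(c[::-1])                # polyfit returns highest power first; print in increasing powers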

The error in this approximation is shown below.

xi = linspace(-1,1,4)                  # four equispaced interpolation nodes
yi = f(xi)
y  = barycentric_interpolate(xi,yi,x)  # evaluate the cubic interpolant on x
e  = f(x) - y                          # pointwise error
print("Max error = ",abs(e).max())
plot(x,e, xi, 0*xi, 'o')
title('Error in cubic interpolation')
grid(True), xlabel('x'), ylabel('$f(x)-p_3(x)$');
Max error =  0.009983970795226949

The maximum error is approximately 0.00998, which is about five times smaller than that of the Taylor polynomial. The error is also more uniformly distributed over the interval and oscillates in sign.
