from pylab import *
Suppose we know the values of a function $f(x)$ at a set of discrete points, say in a tabular form,
e.g., a table of logarithms, sines, or cosines, or perhaps the results of some experimental measurement. We may want to know the value of the function at some point $x$ that is not present in the table. This is an interpolation problem.
Using the available information, we will first construct an approximation $p(x)$ to the unknown function $f(x)$. Then
We can use $p(x)$ to evaluate an approximation to $f(x)$ at any value of $x$
We can use $p(x)$ to compute approximations to derivatives or integrals of $f(x)$
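As a minimal sketch of this idea (the tabulated function $\sin x$, the node set, and the evaluation point are just illustrative assumptions; NumPy's polyfit with full degree reproduces the interpolating polynomial):

import numpy as np

# Illustrative table of values; in practice only the (x, f) pairs are known.
x_tab = np.linspace(0.0, np.pi / 2, 5)
f_tab = np.sin(x_tab)

# Interpolating polynomial p of degree 4 through the 5 table entries.
p = np.poly1d(np.polyfit(x_tab, f_tab, deg=len(x_tab) - 1))

# Evaluate p at a point not present in the table.
x_new = 0.3
print("p(0.3)  =", p(x_new), "   sin(0.3) =", np.sin(x_new))

# The same polynomial gives approximations to derivatives and integrals of f.
dp = p.deriv()                       # p'(x) approximates cos(x)
P = p.integ()                        # an antiderivative of p
print("p'(0.3) =", dp(x_new), "   cos(0.3) =", np.cos(x_new))
print("int p on [0, pi/2] =", P(np.pi / 2) - P(0.0), "   exact =", 1.0)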
To simplify the formulae, let us write $f_i = f(x_i)$.
In fact, for any degree the interpolation problem leads to a unique polynomial, which does not depend on the choice of basis functions for the space of polynomials.
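A small numerical check of this basis independence (a sketch with illustrative data: the monomial coefficients come from solving the Vandermonde system, and the Lagrange form is evaluated directly from its definition):

import numpy as np

# Illustrative interpolation data: nodes xi and values fi.
xi = np.array([0.0, 0.5, 1.0, 2.0])
fi = np.array([1.0, 1.3, 0.8, 0.2])
n = len(xi)

# Monomial basis: solve the Vandermonde system V a = f for the coefficients
# of p(x) = a_0 + a_1 x + ... + a_{n-1} x^{n-1}.
V = np.vander(xi, n, increasing=True)
a = np.linalg.solve(V, fi)

def p_monomial(x):
    return sum(a[j] * x**j for j in range(n))

# Lagrange basis: p(x) = sum_i f_i * ell_i(x) with the cardinal polynomials ell_i.
def p_lagrange(x):
    total = 0.0
    for i in range(n):
        ell = 1.0
        for j in range(n):
            if j != i:
                ell = ell * (x - xi[j]) / (xi[i] - xi[j])
        total = total + fi[i] * ell
    return total

# Both constructions define the same polynomial, up to rounding error.
xs = np.linspace(0.0, 2.0, 7)
print(np.max(np.abs(p_monomial(xs) - p_lagrange(xs))))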
Some questions
What basis functions to choose?
polynomials: Easy to evaluate. But there are several choices of basis: monomials, Lagrange, Newton, Chebyshev, Legendre, etc.
rational functions: ratios of polynomials
trigonometric functions: sines and cosines
radial basis functions
The basis functions may have compact support or be globally supported. Using globally supported functions gives high accuracy for smooth functions and leads to what are called spectral methods. Compactly supported basis functions are useful when functions are less regular, and are used in finite element methods.
How to sample the data, uniformly or otherwise?
How to interpolate/approximate arbitrarily scattered data, especially in high dimensions?
Interpolation/approximation can be performed using only function values, or using derivative information when available, leading to Hermite methods.
Should we exactly fit the data, as in interpolation, or only approximate it, e.g., using a least squares fit? This is also a question of which norm to use to measure the approximation error. Usually the data contains some noise, in which case it may not be a good idea to interpolate it exactly.
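To illustrate the last point, here is a small sketch with synthetic data (the underlying function $\cos 2x$, the noise level, and the polynomial degrees are assumptions made only for the example): a full-degree polyfit reproduces interpolation of the noisy values, while a low-degree fit is a least squares approximation.

import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of a smooth underlying function, here f(x) = cos(2x).
x = np.linspace(0.0, np.pi, 12)
y = np.cos(2 * x) + 0.1 * rng.standard_normal(x.size)

# Interpolation: a degree-11 polynomial passes through every noisy value
# and typically oscillates strongly between the nodes.
p_interp = np.poly1d(np.polyfit(x, y, deg=x.size - 1))

# Least squares: a degree-4 polynomial only approximates the data and
# averages out much of the noise.
p_lsq = np.poly1d(np.polyfit(x, y, deg=4))

# Compare both with the noise-free function on a fine grid.
xx = np.linspace(0.0, np.pi, 200)
print("max error, interpolation :", np.max(np.abs(p_interp(xx) - np.cos(2 * xx))))
print("max error, least squares :", np.max(np.abs(p_lsq(xx) - np.cos(2 * xx))))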