
Stability of polynomial interpolation

With barycentric interpolation available in the form of Function 9.2.1, we can explore polynomial interpolation using a numerically stable algorithm. Any remaining sensitivity to error is due to the conditioning of the interpolation process itself.

9.3.1 Runge phenomenon

The disappointing loss of convergence in Example 9.3.1 is a sign of ill conditioning due to the use of equally spaced nodes. We will examine this effect using the error formula (9.1.6) as a guide:

$$
f(x) - p(x) = \frac{f^{(n+1)}(\xi)}{(n+1)!} \Phi(x), \qquad \Phi(x) = \prod_{i=0}^n (x-t_i).
$$

While the dependence on $f$ is messy here, the error indicator $\Phi(x)$ can be studied as a function of the nodes only.
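
As a quick numerical illustration, here is a minimal sketch (assuming NumPy; the probe points and node counts are arbitrary choices, not tied to any example in the text) that evaluates $\log_{10}|\Phi(x)|$ for equally spaced nodes on $[-1,1]$, working in log space so that very small products do not underflow:

```python
import numpy as np

def log10_abs_Phi(x, t):
    """Return log10|Phi(x)| = sum_i log10|x - t_i| for points x and nodes t."""
    return np.sum(np.log10(np.abs(np.subtract.outer(x, t))), axis=-1)

# Probe one point near the middle of [-1, 1] and one near an end
# (both chosen so that they never coincide with a node).
x_probe = np.array([0.01, 0.94])
for n in (10, 20, 30, 40):
    t = np.linspace(-1, 1, n + 1)   # equally spaced nodes
    print(n, log10_abs_Phi(x_probe, t))
```

Both columns should shrink as $n$ grows, with the value near the endpoint staying much larger than the one near the middle, which is exactly the behavior described next.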

Two observations from the result of Example 9.3.2 are important. First, $|\Phi|$ decreases exponentially at each fixed location in the interval (note that the spacing between curves is constant for constant increments of $n$). Second, $|\Phi|$ is larger at the ends of the interval than in the middle, by an exponentially growing factor. This gap is what can ruin the convergence of polynomial interpolation.

The observation of instability in Example 9.3.3 is known as the Runge phenomenon. The Runge phenomenon is an instability manifested when the nodes of the interpolant are equally spaced and the degree of the polynomial increases. We reiterate that the phenomenon is rooted in the interpolation convergence theory and not a consequence of the algorithm chosen to implement polynomial interpolation.

Significantly, the convergence observed in Example 9.3.3 is stable within a middle portion of the interval. By redistributing the interpolation nodes, we will next sacrifice a little of the convergence in the middle portion in order to improve it near the ends and rescue the process globally.
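
Before moving on, here is a minimal sketch of the equispaced-node experiment (assuming NumPy and SciPy are available; the target function, grid, and node counts are illustrative choices, not necessarily those of Example 9.3.3). It uses SciPy's stable barycentric evaluator, so any divergence is due to conditioning rather than the algorithm:

```python
import numpy as np
from scipy.interpolate import BarycentricInterpolator

f = lambda x: 1 / (1 + 25 * x**2)        # smooth function with the classic Runge behavior
x = np.linspace(-1, 1, 2001)             # fine grid for measuring the error

for n in (10, 20, 30, 40):
    t = np.linspace(-1, 1, n + 1)        # equally spaced nodes
    p = BarycentricInterpolator(t, f(t)) # numerically stable barycentric form
    err = np.abs(f(x) - p(x))
    middle = np.abs(x) <= 0.5            # middle half of the interval
    print(f"n={n:2d}:  max error in |x|<=0.5: {err[middle].max():.2e},"
          f"  overall: {err.max():.2e}")
```

For a function like this one, the error over the middle half keeps shrinking as $n$ grows, while the overall maximum, dominated by the region near the endpoints, eventually grows.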

9.3.2 Chebyshev nodes

The observations above hint that we might find success by having more nodes near the ends of the interval than in the middle. Though we will not give the details, it turns out that there is a precise asymptotic sense in which this must be done to make polynomial interpolation work over the entire interval. One especially important node family that gives stable convergence for polynomial interpolation is the Chebyshev points of the second kind (or Chebyshev extreme points) defined by

$$
t_k = -\cos\left(\frac{k \pi}{n}\right), \qquad k=0,\ldots,n.
$$

These are the projections onto the $x$-axis of $n+1$ points spaced equally along the upper half of the unit circle. They are densely clustered near the ends of $[-1,1]$, and this feature turns out to overcome the Runge phenomenon.

As a bonus, for Chebyshev nodes the barycentric weights are simple:

$$
w_k = (-1)^k d_k, \qquad d_k = \begin{cases} 1/2 & \text{if $k=0$ or $k=n$},\\ 1 & \text{otherwise}. \end{cases}
$$
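
Putting the node formula and these weights together, here is a minimal sketch (assuming NumPy; the function name and interface are illustrative) of a Chebyshev interpolant evaluated directly by the barycentric formula:

```python
import numpy as np

def cheb_interp(f, n):
    """Interpolate f at the n+1 Chebyshev points of the second kind on [-1, 1].
    Returns the nodes and a vectorized evaluator using the barycentric formula."""
    k = np.arange(n + 1)
    t = -np.cos(k * np.pi / n)        # Chebyshev points of the second kind
    y = f(t)
    w = (-1.0) ** k                   # weights w_k = (-1)^k d_k
    w[0] *= 0.5
    w[-1] *= 0.5                      # d_k = 1/2 at the two endpoints

    def p(x):
        x = np.atleast_1d(np.asarray(x, dtype=float))
        d = x[:, None] - t[None, :]   # all differences x - t_k
        hit = d == 0.0                # evaluation points landing exactly on a node
        d[hit] = 1.0                  # placeholder to avoid division by zero
        c = w / d
        vals = (c @ y) / c.sum(axis=1)
        rows, cols = np.nonzero(hit)
        vals[rows] = y[cols]          # at a node, return the data value itself
        return vals

    return t, p
```

For example, `t, p = cheb_interp(lambda x: np.exp(np.sin(2*x)), 40)` followed by `p(np.linspace(-1, 1, 7))` evaluates the degree-40 interpolant; because the weights are known in closed form, the usual $O(n^2)$ weight computation is not needed.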

9.3.3 Spectral convergence

If we take $n\rightarrow \infty$ and use polynomial interpolation on Chebyshev nodes, the convergence rate is exponential in $n$. The following is typical of the results that can be proved.

The condition “$f$ is analytic” means that the Taylor series of $f$ converges to $f(x)$ in an open interval containing $[-1,1]$.[1] A necessary condition of analyticity is that $f$ is infinitely differentiable.

In other contexts we refer to (9.3.4) as linear convergence, but here it is usual to say that the rate is exponential, or that one has spectral convergence: the error is reduced by a constant factor for each constant increment of $n$. By contrast, algebraic convergence of the form $O(n^{-p})$ for some $p>0$ requires multiplying $n$ by a constant factor in order to reduce the error by a constant factor. Graphically, spectral convergence appears as a straight line on a log-linear scale, while algebraic convergence is a straight line on a log-log scale.
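
As a concrete check, the following minimal sketch (assuming NumPy and SciPy; the analytic test function is an arbitrary choice) tabulates the maximum interpolation error on Chebyshev points. On a log-linear plot these errors fall on a roughly straight line until they reach machine precision:

```python
import numpy as np
from scipy.interpolate import BarycentricInterpolator

f = lambda x: np.exp(np.sin(2 * x))      # analytic on an interval containing [-1, 1]
x = np.linspace(-1, 1, 2001)             # fine grid for measuring the error

for n in range(4, 33, 4):
    t = -np.cos(np.arange(n + 1) * np.pi / n)   # Chebyshev points of the second kind
    p = BarycentricInterpolator(t, f(t))
    print(f"n={n:2d}:  max error = {np.max(np.abs(f(x) - p(x))):.2e}")
```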

9.3.4 Exercises

Footnotes
  1. Alternatively, analyticity means that the function is extensible to one that is differentiable in the complex plane.