Lecture 6: Evaluation and interpolation

May 2022 - François HU

Master of Science - EPITA

This lecture is available here: https://curiousml.github.io/


Table of contents

  1. Polynomial evaluation
    • Naive method
    • Horner's method
  2. Polynomial interpolation
    • Lagrange method
    • (Newton method)
  3. Exercises

1. Polynomial evaluation

Consider a polynomial of degree $k-1$ defined by its $k$ coefficients $a_i$:

$$ P(X) = \sum\limits_{i=0}^{k-1}a_iX^i $$

Our objective is to evaluate this polynomial at a given point $x$. We present two ways of doing so:

Naive method

Pseudo-code:

eval_polynomial(a, x)
    Input : list a = (a[0], ..., a[k-1]) and x the point value
    Output : the value of the polynomial P at x: V

    V = a[0]
    C = x
    for i = 1 to k-1 do:
        V = V + a[i] * C
        C = C * x
    return V

This algorithm requires $2(k-1)$ multiplications and $k-1$ additions.
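As an illustration, here is a direct Python translation of this pseudo-code (a minimal sketch; the function name and coefficient-list convention simply mirror the pseudo-code above):

    def eval_polynomial(a, x):
        # Naive evaluation of P(x) = a[0] + a[1]*x + ... + a[k-1]*x**(k-1)
        V = a[0]              # running value of the polynomial
        C = x                 # current power of x, i.e. x**i
        for i in range(1, len(a)):
            V = V + a[i] * C  # add the term a[i] * x**i (1 multiplication)
            C = C * x         # update the power (1 multiplication)
        return V

    # Example: P(X) = 1 + 2X + 3X^2 at x = 2 gives 1 + 4 + 12 = 17
    eval_polynomial([1, 2, 3], 2)   # 17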

Horner method (first-order)

The idea is to decompose the polynomial so that we have:

$$ P(X) = a_0 + X(a_1 + X(a_2 + X(a_3+\dots+X(a_{k-1})))) $$

Pseudo-code:

horner(a, x)
    Input : list a = (a[0], ..., a[k-1]) and x the point value
    Output : the value of the polynomial P at x: V

    V = a[k-1]
    for i = k-2 down to 0 do:
        V = a[i] + x * V
    return V

This algorithm requires $k-1$ multiplications and $k-1$ additions.
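Again as a sketch, the same pseudo-code in Python:

    def horner(a, x):
        # Horner evaluation: P(x) = a[0] + x*(a[1] + x*(a[2] + ...))
        V = a[-1]                            # start from the leading coefficient a[k-1]
        for i in range(len(a) - 2, -1, -1):  # i = k-2 down to 0
            V = a[i] + x * V                 # 1 multiplication and 1 addition per step
        return V

    horner([1, 2, 3], 2)   # 17, same value as the naive method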

(optional) Horner method (second-order)

The idea is to decompose the polynomial so that we have:

$$ P(X) = \sum\limits_{i=0}^{k-1}a_iX^i = \sum\limits_{i=0}^{k/2-1}a_{2i}X^{2i} + X\cdot\sum\limits_{i=0}^{k/2-1}a_{2i+1}X^{2i} = P_0(X^2) + X\cdot P_1(X^2) \implies \text{ Parallelizable approach } $$

(here $k$ is assumed even for simplicity; $P_0$ and $P_1$ collect the even- and odd-indexed coefficients, and the two evaluations are independent of each other)
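A possible Python sketch of this decomposition, reusing the `horner` function above (the splitting via list slicing is an implementation choice; the two inner evaluations could be run in parallel):

    def horner_second_order(a, x):
        # P(x) = P0(x^2) + x * P1(x^2), with P0 holding the even-indexed
        # coefficients of P and P1 the odd-indexed ones
        x2 = x * x
        P0 = horner(a[0::2], x2)                        # even coefficients at x^2
        P1 = horner(a[1::2], x2) if len(a) > 1 else 0   # odd coefficients at x^2
        return P0 + x * P1

    horner_second_order([1, 2, 3], 2)   # 17 again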

2. Polynomial interpolation

Polynomial interpolation: given a set of $n$ data points $(x_{0},y_{0}),\ldots,(x_{n-1},y_{n-1})$ with pairwise distinct $x_i$, a polynomial $P(x)$ is said to interpolate the data if $P(x_{i})=y_{i}$ for each $i\in \{0,1,\dotsc,n-1\}$.

Two common explicit formulas for this polynomial are the Lagrange form and the Newton form; here we focus on the former.

Lagrange polynomials

The polynomial interpolation in the Lagrange form is a linear combination:

$$ Q(x) = \sum\limits_{j=0}^{n-1} y_j\, Q_j(x) $$

with

$$ Q_{j}(x) := \prod_{\substack{0\leq m< n\\ m\neq j}}\frac{x-x_{m}}{x_{j}-x_{m}} = \frac{x-x_{0}}{x_{j}-x_{0}}\cdots\frac{x-x_{j-1}}{x_{j}-x_{j-1}}\cdot\frac{x-x_{j+1}}{x_{j}-x_{j+1}}\cdots\frac{x-x_{n-1}}{x_{j}-x_{n-1}}. $$

Each basis polynomial satisfies $Q_j(x_j) = 1$ and $Q_j(x_m) = 0$ for $m \neq j$, which guarantees the interpolation conditions $Q(x_i) = y_i$.
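As a sketch, a direct Python implementation of this formula (quadratic in $n$, which is fine for the small examples of this lecture):

    def lagrange(xs, ys, x):
        # Evaluate the Lagrange interpolating polynomial Q at the point x,
        # given the n data points (xs[j], ys[j])
        n = len(xs)
        Q = 0.0
        for j in range(n):
            # basis polynomial Q_j: equals 1 at xs[j] and 0 at the other nodes
            Qj = 1.0
            for m in range(n):
                if m != j:
                    Qj *= (x - xs[m]) / (xs[j] - xs[m])
            Q += ys[j] * Qj
        return Q

    # Interpolating (0, 1), (1, 3), (2, 7): evaluating at a node returns its y-value
    lagrange([0, 1, 2], [1, 3, 7], 1)   # 3.0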

(graphs illustrating Lagrange interpolation omitted; source: Wikipedia)

3. Exercises