=== Newton Interpolation ===
{{See also|Newton polynomial|Divided differences}}

==== Theorem ====
Let <math>p_n</math> be the polynomial of degree less than or equal to <math>n</math> that interpolates <math>f</math> at the nodes <math>x_i</math>, <math>i = 0,1,2,\ldots,n</math>, and let <math>p_{n+1}</math> be the polynomial of degree less than or equal to <math>n+1</math> that interpolates <math>f</math> at the nodes <math>x_i</math>, <math>i = 0,1,2,\ldots,n,n+1</math>. Then <math>p_{n+1}</math> is given by:<math display="block">p_{n+1}(x) = p_n(x) + a_{n+1}w_n(x),</math>where <math display="inline">w_n(x) := \prod_{i=0}^n (x-x_i)</math> is the Newton basis polynomial and <math display="inline">a_{n+1} :={f(x_{n+1})-p_n(x_{n+1}) \over w_n(x_{n+1})}</math>.

'''Proof:''' For the nodes with <math>i = 0,1,2,\ldots,n</math>:<math display="block">p_{n+1}(x_i) = p_n(x_i) +a_{n+1}\prod_{j=0}^n (x_i-x_j) = p_n(x_i) = f(x_i),</math>and for <math>i = n+1</math>:<math display="block">p_{n+1}(x_{n+1}) = p_n(x_{n+1}) +{f(x_{n+1})-p_n(x_{n+1}) \over w_n(x_{n+1})} w_n(x_{n+1}) = f(x_{n+1}).</math>By the uniqueness of the interpolating polynomial of degree less than or equal to <math>n+1</math>, <math display="inline">p_{n+1}(x) = p_n(x) +a_{n+1}w_n(x)</math> is the required polynomial interpolation. The interpolating polynomial can thus be expressed in the Newton form: <math display="inline">p_{n}(x) = a_0+a_1(x-x_0)+a_2(x-x_0)(x-x_1)+\cdots + a_n(x-x_0)\cdots(x-x_{n-1}).</math>

==== Polynomial coefficients ====
To find the coefficients <math>a_i</math>, we solve the [[lower triangular matrix|lower triangular]] system formed by arranging the conditions <math display="inline">p_{n}(x_i)=f(x_i)=y_i</math> from the above equation in matrix form:
: <math>\begin{bmatrix} 1 & & \ldots & & 0 \\ 1 & x_1-x_0 & & & \\ 1 & x_2-x_0 & (x_2-x_0)(x_2-x_1) & & \vdots \\ \vdots & \vdots & & \ddots & \\ 1 & x_n-x_0 & \ldots & \ldots & \prod_{j=0}^{n-1}(x_n - x_j) \end{bmatrix} \begin{bmatrix} a_0 \\ \\ \vdots \\ \\ a_{n} \end{bmatrix} = \begin{bmatrix} y_0 \\ \\ \vdots \\ \\ y_{n} \end{bmatrix}</math>
The coefficients are given by
: <math>a_j := [y_0,\ldots,y_j],</math>
where
: <math>[y_0,\ldots,y_j]</math>
is the notation for [[divided differences]]. Thus, [[Newton polynomial]]s provide a polynomial interpolation formula for the <math>n+1</math> points.<ref name="Epperson 2013"/> A numerical sketch of this computation is given after the proof below.

{| class="toccolours collapsible collapsed" width="80%" style="text-align:left"
!Proof
|-
| The first few coefficients can be calculated from the system of equations, and the form of the <math>n</math>-th coefficient is assumed for a proof by mathematical induction:
<math>\begin{align} a_0 &= y_0 = [y_0] \\ a_1 &= {y_1-y_0 \over x_1 - x_0}= [y_0,y_1] \\ \vdots\\ a_n &= [y_0,\cdots,y_n] \quad \text{(induction hypothesis)}\\ \end{align}</math>
Let <math>Q</math> be the polynomial interpolation of the points <math>(x_1, y_1), \ldots, (x_n, y_n)</math>. Adding <math>(x_0,y_0)</math> to the points interpolated by <math>Q</math> gives
: <math>Q(x)+ a'_n (x - x_1)\cdot\ldots\cdot(x - x_n) = P_{n}(x),</math>
where <math display="inline">a'_n(x_{0} - x_1)\ldots(x_{0}-x_{n}) = y_{0} - Q(x_{0})</math>. By uniqueness of the interpolating polynomial of the points <math>(x_0, y_0), \ldots, (x_n, y_n)</math>, and since <math>Q</math> has degree at most <math>n-1</math>, equating the coefficients of <math>x^{n}</math> gives <math display="inline">a'_n = a_n = [y_0, \ldots, y_{n}]</math>.
Hence the polynomial can be expressed as:<math>P_{n}(x)= Q(x)+ [y_0,\ldots,y_n](x - x_1)\cdot\ldots\cdot(x - x_n).</math> Adding <math>(x_{n+1},y_{n+1})</math> to the points interpolated by <math>Q</math>, the new coefficient has to satisfy: <math display="inline">[y_1, \ldots,y_{n+1}](x_{n+1} - x_1)\cdot\ldots\cdot(x_{n+1}-x_{n}) = y_{n+1} - Q(x_{n+1}),</math> where the formula for <math display="inline">a_n</math> and the interpolating polynomial are used. The <math display="inline">a_{n+1}</math> term for the polynomial <math display="inline">P_{n+1}</math> can then be found by calculating:<math display="block">\begin{align} & [y_0,\ldots,y_{n+1}](x_{n+1} - x_0)\cdot\ldots\cdot(x_{n+1} - x_n)\\ &= \frac{[y_1,\ldots,y_{n+1}] - [y_0,\ldots,y_{n}]}{x_{n+1} - x_0}(x_{n+1} - x_0)\cdot\ldots\cdot(x_{n+1} - x_n) \\ &= \left([y_1,\ldots,y_{n+1}] - [y_0,\ldots,y_{n}]\right) (x_{n+1} - x_1)\cdot\ldots\cdot(x_{n+1} - x_n) \\ &= [y_1,\ldots,y_{n+1}](x_{n+1} - x_1)\cdot\ldots\cdot(x_{n+1} - x_n) - [y_0,\ldots,y_n](x_{n+1} - x_1)\cdot\ldots\cdot(x_{n+1} - x_n) \\ &= (y_{n+1} - Q(x_{n+1})) - [y_0,\ldots,y_n](x_{n+1} - x_1)\cdot\ldots\cdot(x_{n+1} - x_n) \\ &= y_{n+1} - (Q(x_{n+1}) + [y_0,\ldots,y_n](x_{n+1} - x_1)\cdot\ldots\cdot(x_{n+1} - x_n))\\ &=y_{n+1}-P_n(x_{n+1}), \end{align}</math>which implies that <math>a_{n+1}={y_{n+1}-P_n(x_{n+1}) \over w_n(x_{n+1})} = [y_0,\ldots,y_{n+1}]</math>. Hence the claim follows by the principle of mathematical induction.
|}
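In practice the coefficients <math>a_j = [y_0,\ldots,y_j]</math> are computed with the divided-difference recursion rather than by solving the triangular system directly. The following is a minimal sketch of that computation and of the nested-multiplication evaluation of the Newton form; the function names <code>divided_differences</code> and <code>newton_eval</code> are illustrative and not part of any standard library.

<syntaxhighlight lang="python">
# Illustrative sketch, not taken from the article or any library.
import numpy as np

def divided_differences(x, y):
    """Return the Newton coefficients a_j = [y_0, ..., y_j] for nodes x and values y."""
    a = np.array(y, dtype=float)
    n = len(x)
    for j in range(1, n):
        # After this pass, a[i] holds the divided difference [y_{i-j}, ..., y_i].
        for i in range(n - 1, j - 1, -1):
            a[i] = (a[i] - a[i - 1]) / (x[i] - x[i - j])
    return a  # a[j] = [y_0, ..., y_j]

def newton_eval(a, x, t):
    """Evaluate a_0 + a_1(t - x_0) + ... + a_n(t - x_0)...(t - x_{n-1}) by nested multiplication."""
    result = a[-1]
    for j in range(len(a) - 2, -1, -1):
        result = result * (t - x[j]) + a[j]
    return result

# Example: interpolate f(x) = x**3 at four nodes; the cubic is reproduced exactly.
nodes = [0.0, 1.0, 2.0, 3.0]
values = [t**3 for t in nodes]
coeffs = divided_differences(nodes, values)
print(newton_eval(coeffs, nodes, 1.5))  # 3.375
</syntaxhighlight>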
==== Newton forward formula ====
The Newton polynomial can be expressed in a simplified form when the nodes <math>x_0, x_1, \dots, x_k</math> are arranged consecutively with equal spacing. If <math>x_0, x_1, \dots, x_k</math> are consecutively arranged and equally spaced with <math>{x}_{i}={x}_{0}+ih</math> for ''i'' = 0, 1, ..., ''k'', and some variable ''x'' is expressed as <math>{x}={x}_{0}+sh</math>, then the difference <math>x-x_i</math> can be written as <math>(s-i)h</math>. So the Newton polynomial becomes
: <math>\begin{align} N(x) &= [y_0] + [y_0,y_1]sh + \cdots + [y_0,\ldots,y_k] s (s-1) \cdots (s-k+1){h}^{k} \\ &= \sum_{i=0}^{k}s(s-1) \cdots (s-i+1){h}^{i}[y_0,\ldots,y_i] \\ &= \sum_{i=0}^{k}{s \choose i}i!{h}^{i}[y_0,\ldots,y_i]. \end{align}</math>
The relationship between divided differences and [[Divided differences#Forward and backward differences|forward differences]] is given as<ref>{{cite book |last1=Burden |first1=Richard L. |url=https://archive.org/details/numericalanalysi00rlbu |title=Numerical Analysis |last2=Faires |first2=J. Douglas |date=2011 |isbn=9780538733519 |edition=9th |page=[https://archive.org/details/numericalanalysi00rlbu/page/n146 129] |publisher=Cengage Learning |url-access=limited}}</ref><math display="block">[y_j, y_{j+1}, \ldots , y_{j+n}] = \frac{1}{n!h^n}\Delta^{(n)}y_j.</math>Taking <math>y_i=f(x_i)</math>, and with the representation of ''x'' instead taken to be <math>x=x_j+sh</math>, the '''Newton forward interpolation formula''' is expressed as<math display="block">f(x) \approx N(x)=N(x_j+sh) = \sum_{i=0}^{k}{s \choose i}\Delta^{(i)} f(x_j),</math>which is the interpolation of all points after <math>x_j</math>. It is expanded as:<math display="block">f(x_j+sh)=f(x_j)+\frac{s}{1!}\Delta f(x_j)+ \frac{s(s-1)}{2!}\Delta^2 f(x_j)+\frac{s(s-1)(s-2)}{3!}\Delta^3 f(x_j)+\frac{s(s-1)(s-2)(s-3)}{4!}\Delta^4 f(x_j)+\cdots</math>
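On an equally spaced grid the forward formula needs only the forward differences <math>\Delta^{(i)} f(x_j)</math>, which can be built by repeated differencing of the samples. A minimal sketch under that assumption follows; the name <code>newton_forward</code> is illustrative and the samples are assumed to start at the node <math>x_j</math>.

<syntaxhighlight lang="python">
# Illustrative sketch, not taken from the article or any library.
import numpy as np

def newton_forward(xj, h, f_values, x):
    """Newton forward interpolation at x from samples f(xj), f(xj + h), ..., f(xj + k*h)."""
    s = (x - xj) / h
    diffs = np.array(f_values, dtype=float)
    result = diffs[0]          # the i = 0 term, f(xj)
    binom = 1.0                # running value of the binomial coefficient C(s, i)
    for i in range(1, len(f_values)):
        diffs = np.diff(diffs)         # i-th forward differences; diffs[0] = Delta^i f(xj)
        binom *= (s - (i - 1)) / i     # C(s, i) = C(s, i-1) * (s - i + 1) / i
        result += binom * diffs[0]
    return result

# Example: interpolate sin on an equally spaced grid near x = 0.3.
grid = np.arange(0.0, 1.0, 0.2)        # nodes 0.0, 0.2, 0.4, 0.6, 0.8
print(newton_forward(grid[0], 0.2, np.sin(grid), 0.3))  # close to sin(0.3) = 0.29552...
</syntaxhighlight>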
==== Newton backward formula ====
If the nodes are reordered as <math>{x}_{k},{x}_{k-1},\dots,{x}_{0}</math>, the Newton polynomial becomes
: <math>N(x)=[y_k]+[{y}_{k}, {y}_{k-1}](x-{x}_{k})+\cdots+[{y}_{k},\ldots,{y}_{0}](x-{x}_{k})(x-{x}_{k-1})\cdots(x-{x}_{1}).</math>
If <math>{x}_{k},\;{x}_{k-1},\;\dots,\;{x}_{0}</math> are equally spaced with <math>{x}_{i}={x}_{k}-(k-i)h</math> for ''i'' = 0, 1, ..., ''k'' and <math>{x}={x}_{k}+sh</math>, then
: <math>\begin{align} N(x) &= [{y}_{k}]+ [{y}_{k}, {y}_{k-1}]sh+\cdots+[{y}_{k},\ldots,{y}_{0}]s(s+1)\cdots(s+k-1){h}^{k} \\ &=\sum_{i=0}^{k}{(-1)}^{i}{-s \choose i}i!{h}^{i}[{y}_{k},\ldots,{y}_{k-i}]. \end{align}</math>
The relationship between divided differences and backward differences is given as{{Citation needed|date=December 2023}}<math display="block">[{y}_{j}, y_{j-1},\ldots,{y}_{j-n}] = \frac{1}{n!h^n}\nabla^{(n)}y_j.</math>Taking <math>y_i=f(x_i)</math>, and with the representation of ''x'' instead taken to be <math>x=x_j+sh</math>, the '''Newton backward interpolation formula''' is expressed as<math display="block">f(x) \approx N(x) =N(x_j+sh)=\sum_{i=0}^{k}{(-1)}^{i}{-s \choose i}\nabla^{(i)} f(x_j),</math>which is the interpolation of all points before <math>x_j</math>. It is expanded as:<math display="block">f(x_j+sh)=f(x_j)+\frac{s}{1!}\nabla f(x_j)+ \frac{s(s+1)}{2!}\nabla^2 f(x_j)+\frac{s(s+1)(s+2)}{3!}\nabla^3 f(x_j)+\frac{s(s+1)(s+2)(s+3)}{4!}\nabla^4 f(x_j)+\cdots</math>
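A matching sketch for the backward formula, which expands about the last node <math>x_k</math>; here too the name <code>newton_backward</code> is illustrative and the samples are assumed equally spaced.

<syntaxhighlight lang="python">
# Illustrative sketch, not taken from the article or any library.
import numpy as np

def newton_backward(xk, h, f_values, x):
    """Newton backward interpolation at x from samples f(xk - k*h), ..., f(xk - h), f(xk)."""
    s = (x - xk) / h
    diffs = np.array(f_values, dtype=float)
    result = diffs[-1]         # the i = 0 term, f(xk)
    coeff = 1.0                # running value of (-1)^i * C(-s, i) = s(s+1)...(s+i-1)/i!
    for i in range(1, len(f_values)):
        diffs = np.diff(diffs)         # i-th differences; diffs[-1] = nabla^i f(xk)
        coeff *= (s + (i - 1)) / i
        result += coeff * diffs[-1]
    return result

# Example: interpolate sin near the end of the same grid, at x = 0.7.
grid = np.arange(0.0, 1.0, 0.2)        # nodes 0.0, 0.2, 0.4, 0.6, 0.8
print(newton_backward(grid[-1], 0.2, np.sin(grid), 0.7))  # close to sin(0.7) = 0.64421...
</syntaxhighlight>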