====Bernstein's proof====
Suppose ''K'' is a [[random variable]] distributed as the number of successes in ''n'' independent [[Bernoulli trial]]s with probability ''x'' of success on each trial; in other words, ''K'' has a [[binomial distribution]] with parameters ''n'' and ''x''. Then the [[expected value]] is <math>\operatorname{\mathcal E}\left[\frac{K}{n}\right] = x</math> and

:<math>p(K) = {n \choose K} x^{K} \left( 1 - x \right)^{n - K} = b_{K,n}(x).</math>

By the [[law of large numbers|weak law of large numbers]] of [[probability theory]],

:<math>\lim_{n \to \infty}{ P\left( \left| \frac{K}{n} - x \right|>\delta \right) } = 0</math>

for every ''δ'' > 0. Moreover, this relation holds uniformly in ''x'', as can be seen from its proof via [[Chebyshev's inequality]], taking into account that the variance of {{frac|1|''n''}} ''K'', equal to {{frac|1|''n''}} ''x''(1−''x''), is bounded from above by {{frac|1|(4''n'')}} irrespective of ''x''.

Because ''ƒ'', being continuous on a closed bounded interval, must be [[uniform continuity|uniformly continuous]] on that interval, one infers a statement of the form

:<math>\lim_{n \to \infty}{ P\left( \left| f\left( \frac{K}{n} \right) - f\left( x \right) \right| > \varepsilon \right) } = 0</math>

uniformly in ''x'' for each <math>\varepsilon > 0</math>. Taking into account that ''ƒ'' is bounded (on the given interval), one finds that

:<math>\lim_{n \to \infty}{ \operatorname{\mathcal E}\left( \left| f\left( \frac{K}{n} \right) - f\left( x \right) \right| \right) } = 0</math>

uniformly in ''x''. To justify this statement, we use a common method in probability theory for converting closeness in probability into closeness in expectation: one splits the expectation of <math>\left| f\left( \frac{K}{n} \right) - f\left( x \right) \right|</math> into two parts, according to whether or not <math>\left| f\left( \frac{K}{n} \right) - f\left( x \right) \right| < \varepsilon</math>.
On the event where the difference does not exceed ''ε'', that part of the expectation clearly contributes no more than ''ε''. On the complementary event, the difference still cannot exceed 2''M'', where ''M'' is an upper bound for |''ƒ''(''x'')| (a continuous function on a closed bounded interval is bounded). By the 'closeness in probability' statement, this event has probability at most ''ε'' for all sufficiently large ''n'', so this part of the expectation contributes no more than 2''M'' times ''ε''. The total expectation is then no more than <math>\varepsilon + 2M\varepsilon</math>, which can be made arbitrarily small by choosing ''ε'' small.

Finally, one observes that the absolute value of the difference between expectations never exceeds the expectation of the absolute value of the difference, a consequence of [[Jensen's inequality]] (or, equivalently here, the triangle inequality). Thus, using the above expectation, we see that (uniformly in ''x'')

:<math>\lim_{n \to \infty}{ \left| \operatorname{\mathcal E}f\left( \frac{K}{n} \right) - \operatorname{\mathcal E}f\left( x \right) \right| } \leq \lim_{n \to \infty}{ \operatorname{\mathcal E}\left( \left| f\left( \frac{K}{n} \right) - f\left( x \right) \right| \right) } = 0.</math>

Since the randomness is over ''K'' while ''x'' is held fixed, the expectation of ''ƒ''(''x'') is just ''ƒ''(''x''). We have thus shown that <math>\operatorname{\mathcal E_x}f\left( \frac{K}{n} \right)</math> converges to ''ƒ''(''x'') uniformly, so we are done if <math>\operatorname{\mathcal E_x}f\left( \frac{K}{n} \right)</math> is a polynomial in ''x'' (the subscript reminding us that ''x'' controls the distribution of ''K''). Indeed it is:

:<math>\operatorname{\mathcal E_x}\left[f\left(\frac{K}{n}\right)\right] = \sum_{K=0}^n f\left(\frac{K}{n}\right) p(K) = \sum_{K=0}^n f\left(\frac{K}{n}\right) b_{K,n}(x) = B_n(f)(x).</math>
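The closing identity says the Bernstein polynomial ''B''<sub>''n''</sub>(''f'')(''x'') is exactly the expectation of ''ƒ''(''K''/''n''), so the uniform convergence just proved can be observed directly. Here is a small numerical sketch (an illustration of the theorem, not part of the proof; names are mine) using the continuous but non-differentiable test function ''ƒ''(''t'') = |''t'' − 1/2|:

```python
from math import comb

def bernstein(f, n, x):
    # B_n(f)(x) = sum_{K=0}^n f(K/n) * C(n,K) * x^K * (1-x)^(n-K)
    return sum(f(k / n) * comb(n, k) * x**k * (1 - x)**(n - k)
               for k in range(n + 1))

f = lambda t: abs(t - 0.5)  # continuous on [0, 1], not differentiable at 1/2
grid = [i / 200 for i in range(201)]

# The sup-norm error max_x |B_n(f)(x) - f(x)| shrinks as n grows,
# reflecting the uniform convergence established in the proof.
for n in (10, 100, 1000):
    err = max(abs(bernstein(f, n, x) - f(x)) for x in grid)
    print(n, err)
```

The convergence is slow (roughly on the order of 1/√''n'' for this ''ƒ'', with the worst error near the kink at ''x'' = 1/2), which is typical: Bernstein polynomials converge uniformly but are not an efficient approximation scheme in practice.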