== Mean value theorem for vector-valued functions ==

There is no exact analog of the mean value theorem for vector-valued functions (see below). However, there is an inequality that can be applied to many of the same situations to which the mean value theorem is applicable in the one-dimensional case:{{sfn|Rudin|1976|p=113}}

{{math theorem|''For a continuous vector-valued function <math>\mathbf{f}:[a,b]\to\mathbb{R}^k</math> differentiable on <math>(a,b)</math>, there exists a number <math>c\in(a,b)</math> such that''
:<math>|\mathbf{f}(b)-\mathbf{f}(a)| \le (b-a)\left|\mathbf{f}'(c)\right|</math>.}}

{{math proof|Take <math>\varphi(t) = (\textbf{f}(b) - \textbf{f}(a)) \cdot \textbf{f}(t)</math>. Then <math>\varphi</math> is real-valued and thus, by the mean value theorem,
:<math>\varphi(b) - \varphi(a) = \varphi'(c)(b-a)</math>
for some <math>c \in (a, b)</math>. Now,
<math display="block">\varphi(b) - \varphi(a) = |\textbf{f}(b) - \textbf{f}(a)|^2</math>
and
<math display="block">\varphi'(c) = (\textbf{f}(b) - \textbf{f}(a)) \cdot \textbf{f}'(c).</math>
Hence, applying the [[Cauchy–Schwarz inequality]] to the above equation, we get
:<math>|\textbf{f}(b) - \textbf{f}(a)|^2 \le |\textbf{f}(b) - \textbf{f}(a)| |\textbf{f}'(c)|(b-a).</math>
If <math>\textbf{f}(b) = \textbf{f}(a)</math>, the theorem holds trivially. Otherwise, dividing both sides by <math>|\textbf{f}(b) - \textbf{f}(a)|</math> yields the theorem.}}

===Mean value inequality===
{{see also|Calculus on Euclidean space#Derivative of a map and chain rule}}

[[Jean Dieudonné]], in his classic treatise ''Foundations of Modern Analysis'', discards the mean value theorem and replaces it by the mean inequality, since the proof is not constructive, the mean value cannot be found, and in applications one only needs the mean inequality. [[Serge Lang]] in ''Analysis I'' uses the mean value theorem, in integral form, as an instant reflex, but this use requires the continuity of the derivative.
If one uses the [[Henstock–Kurzweil integral]], one can have the mean value theorem in integral form without the additional assumption that the derivative be continuous, since every derivative is Henstock–Kurzweil integrable.

The reason there is no analog of mean value ''equality'' is the following: if {{math|''f'' : ''U'' → '''R'''<sup>''m''</sup>}} is a differentiable function (where {{math|''U'' ⊂ '''R'''<sup>''n''</sup>}} is open) and if {{math|''x'' + ''th''}}, {{math|''x'', ''h'' ∈ '''R'''<sup>''n''</sup>, ''t'' ∈ [0, 1]}} is the line segment in question (lying inside {{mvar|U}}), then one can apply the above parametrization procedure to each of the component functions {{math|1=''f<sub>i</sub>'' (''i'' = 1, …, ''m'')}} of ''f'' (in the above notation set {{math|1=''y'' = ''x'' + ''h''}}). In doing so one finds points {{math|''x'' + ''t<sub>i</sub>h''}} on the line segment satisfying
:<math>f_i(x+h) - f_i(x) = \nabla f_i (x + t_ih) \cdot h.</math>
But generally there will not be a ''single'' point {{math|''x'' + ''t''*''h''}} on the line segment satisfying
:<math>f_i(x+h) - f_i(x) = \nabla f_i (x + t^* h) \cdot h</math>
for all {{mvar|i}} ''simultaneously''. For example, define:
:<math>\begin{cases} f : [0, 2 \pi] \to \R^2 \\ f(x) = (\cos(x), \sin(x)) \end{cases}</math>
Then <math>f(2\pi) - f(0) = \mathbf{0} \in \R^2</math>, but <math>f_1'(x) = -\sin (x)</math> and <math>f_2'(x) = \cos (x)</math> are never simultaneously zero as <math>x</math> ranges over <math>\left[0, 2 \pi\right]</math>.

The above theorem implies the following:

{{math_theorem|name=Mean value inequality<ref>{{harvnb|Hörmander|2015|loc=Theorem 1.1.1.
and remark following it.}}</ref> |math_statement=For a continuous function <math>\textbf{f} : [a, b] \to \mathbb{R}^k</math>, if <math>\textbf{f}</math> is differentiable on <math>(a, b)</math>, then
:<math>|\textbf{f}(b) - \textbf{f}(a)| \le (b-a)\sup_{(a, b)} |\textbf{f}'|</math>.}}

In fact, the above statement suffices for many applications and can be proved directly as follows. (We shall write <math>f</math> for <math>\textbf{f}</math> for readability.)

{{math proof|First assume <math>f</math> is differentiable at <math>a</math> too. If <math>f'</math> is unbounded on <math>(a, b)</math>, there is nothing to prove. Thus, assume <math>\sup_{(a, b)} |f'| < \infty</math>. Let <math>M > \sup_{(a, b)} |f'|</math> be some real number. Let
<math display="block">E = \{ 0 \le t \le 1 \mid |f(a + t(b-a)) - f(a)| \le Mt(b-a) \}.</math>
We want to show <math>1 \in E</math>. By continuity of <math>f</math>, the set <math>E</math> is closed. It is also nonempty, as <math>0</math> is in it. Hence, the set <math>E</math> has a largest element <math>s</math>. If <math>s = 1</math>, then <math>1 \in E</math> and we are done. Thus suppose otherwise. For <math>1 > t > s</math>,
:<math>\begin{align} &|f(a + t(b-a)) - f(a)| \\ &\le |f(a + t(b-a)) - f(a+s(b - a)) - f'(a + s(b-a))(t-s)(b-a)| + |f'(a+s(b-a))|(t-s)(b-a) \\ &+|f(a + s(b-a)) - f(a)|. \end{align} </math>
Let <math>\epsilon > 0</math> be such that <math>M - \epsilon > \sup_{(a, b)} |f'|</math>. By the differentiability of <math>f</math> at <math>a + s(b-a)</math> (note <math>s</math> may be 0), if <math>t</math> is sufficiently close to <math>s</math>, the first term is <math>\le \epsilon (t-s)(b-a)</math>. The second term is <math>\le (M - \epsilon) (t-s)(b-a)</math>. The third term is <math>\le Ms(b-a)</math>, since <math>s \in E</math>. Hence, summing the estimates, we get <math>|f(a + t(b-a)) - f(a)| \le Mt(b-a)</math>; that is, <math>t \in E</math>, contradicting the maximality of <math>s</math>.
Hence, <math>1 = s \in E</math>, which means:
:<math>|f(b) - f(a)| \le M(b-a).</math>
Since <math>M > \sup_{(a, b)} |f'|</math> is arbitrary, this implies the assertion. Finally, if <math>f</math> is not differentiable at <math>a</math>, let <math>a' \in (a, b)</math> and apply the first case to <math>f</math> restricted to <math>[a', b]</math>, giving us
:<math>|f(b) - f(a')| \le (b-a')\sup_{(a, b)} |f'|</math>
since <math>(a', b) \subset (a, b)</math>. Letting <math>a' \to a</math> finishes the proof.}}
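The circle example above can also be checked numerically. The following sketch (not part of the article; the grid size and tolerances are arbitrary choices) verifies that the mean value inequality holds for <math>f(x) = (\cos x, \sin x)</math> on <math>[0, 2\pi]</math>, while a mean value ''equality'' is impossible there, since it would force <math>|f'(c)| = 0</math> at some point even though <math>|f'| \equiv 1</math>:

```python
# Numerical sketch: f(x) = (cos x, sin x) on [a, b] = [0, 2*pi].
import math

a, b = 0.0, 2.0 * math.pi

def f(x):
    return (math.cos(x), math.sin(x))

def fprime(x):
    return (-math.sin(x), math.cos(x))

def norm(v):
    return math.hypot(v[0], v[1])

# Left-hand side: |f(b) - f(a)|, which is 0 up to rounding error,
# since f traverses the full circle and returns to its start.
lhs = norm((f(b)[0] - f(a)[0], f(b)[1] - f(a)[1]))

# Estimate sup |f'| on a grid; analytically |f'(x)| = 1 for every x.
grid = [a + (b - a) * k / 10000 for k in range(10001)]
sup_fprime = max(norm(fprime(x)) for x in grid)

# Mean value inequality: |f(b) - f(a)| <= (b - a) * sup |f'|.
assert lhs <= (b - a) * sup_fprime

# No mean value equality: f(b) - f(a) = f'(c)(b - a) would require
# |f'(c)| = 0, but |f'| stays at 1 on the whole interval.
assert all(abs(norm(fprime(x)) - 1.0) < 1e-12 for x in grid)
```

The same check works for any smooth closed curve: the displacement vanishes while the speed never does, which is exactly why only the inequality survives in the vector-valued setting.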