== Properties ==

Many properties of convex functions have the same simple formulation for functions of many variables as for functions of one variable. See the properties listed below for the case of many variables, as some of them are not repeated among the properties for functions of one variable.

=== Functions of one variable ===

* Suppose <math>f</math> is a function of one [[real number|real]] variable defined on an interval, and let <math display=block>R(x_1, x_2) = \frac{f(x_2) - f(x_1)}{x_2 - x_1}</math> (note that <math>R(x_1, x_2)</math> is the slope of the purple line in the first drawing; the function <math>R</math> is [[Symmetric function|symmetric]] in <math>(x_1, x_2),</math> meaning that <math>R</math> does not change when <math>x_1</math> and <math>x_2</math> are exchanged). Then <math>f</math> is convex if and only if <math>R(x_1, x_2)</math> is [[monotonically non-decreasing]] in <math>x_1</math> for every fixed <math>x_2</math> (or vice versa). This characterization of convexity is quite useful for proving the following results. A worked example with <math>f(x) = x^2</math> is given after this list.
* A convex function <math>f</math> of one real variable defined on some [[open interval]] <math>C</math> is [[Continuous function|continuous]] on <math>C</math>. Moreover, <math>f</math> admits [[Semi-differentiability|left and right derivatives]], and these are [[monotonically non-decreasing]]. In addition, the left derivative is left-continuous and the right derivative is right-continuous. As a consequence, <math>f</math> is [[differentiable function|differentiable]] at all but at most [[countable|countably many]] points; the set on which <math>f</math> is not differentiable can, however, still be dense. If <math>C</math> is closed, then <math>f</math> may fail to be continuous at the endpoints of <math>C</math> (an example is shown in the [[#Examples|examples section]]).
* A [[differentiable function|differentiable]] function of one variable is convex on an interval if and only if its [[derivative]] is [[monotonically non-decreasing]] on that interval. If a function is differentiable and convex then it is also [[continuously differentiable]].
* A differentiable function of one variable is convex on an interval if and only if its graph lies above all of its [[tangent]]s:<ref name="boyd">{{cite book| title=Convex Optimization| first1=Stephen P.|last1=Boyd |first2=Lieven| last2=Vandenberghe | year = 2004 |publisher=Cambridge University Press| isbn=978-0-521-83378-3| url= https://web.stanford.edu/~boyd/cvxbook/bv_cvxbook.pdf#page=83 |format=pdf | access-date=October 15, 2011}}</ref>{{rp|69}} <math display=block>f(x) \geq f(y) + f'(y) (x-y)</math> for all <math>x</math> and <math>y</math> in the interval.
* A twice differentiable function of one variable is convex on an interval if and only if its [[second derivative]] is non-negative there; this gives a practical test for convexity. Visually, a twice differentiable convex function "curves up", without any bends the other way ([[inflection point]]s). If its second derivative is positive at all points then the function is strictly convex, but the [[converse (logic)|converse]] does not hold. For example, the second derivative of <math>f(x) = x^4</math> is <math>f''(x) = 12x^{2}</math>, which is zero for <math>x = 0,</math> but <math>x^4</math> is strictly convex.
** This property is not equivalent to the property above in terms of "...its derivative is monotonically non-decreasing...": if <math>f''</math> is non-negative on an interval <math>X</math>, then <math>f'</math> is monotonically non-decreasing on <math>X</math>, but the converse is not true; for example, <math>f'</math> may be monotonically non-decreasing on <math>X</math> while <math>f''</math> is not defined at some points of <math>X</math>.
* If <math>f</math> is a convex function of one real variable and <math>f(0) \le 0</math>, then <math>f</math> is [[Superadditivity|superadditive]] on the [[positive reals]], that is, <math>f(a + b) \geq f(a) + f(b)</math> for positive real numbers <math>a</math> and <math>b</math>. {{math proof|proof= Since <math>f</math> is convex, by using one of the convex function definitions above and letting <math>x_2 = 0,</math> it follows that for all real <math>0 \leq t \leq 1,</math> <math display=block>\begin{align} f(tx_1) & = f(t x_1 + (1-t) \cdot 0) \\ & \leq t f(x_1) + (1-t) f(0) \\ & \leq t f(x_1). \end{align}</math> From <math>f(tx_1) \leq t f(x_1)</math>, it follows that <math display=block>\begin{align} f(a) + f(b) & = f \left((a+b) \frac{a}{a+b} \right) + f \left((a+b) \frac{b}{a+b} \right) \\ & \leq \frac{a}{a+b} f(a+b) + \frac{b}{a+b} f(a+b) \\ & = f(a+b). \end{align}</math> Namely, <math>f(a) + f(b) \leq f(a+b)</math>. }}
* A function <math>f</math> is midpoint convex on an interval <math>C</math> if for all <math>x_1, x_2 \in C</math> <math display=block>f\!\left(\frac{x_1 + x_2}{2}\right) \leq \frac{f(x_1) + f(x_2)}{2}.</math> This condition is only slightly weaker than convexity. For example, a real-valued [[Lebesgue measurable function]] that is midpoint convex is convex: this is a theorem of [[Wacław Sierpiński|Sierpiński]].<ref>{{cite book|last=Donoghue|first=William F.|title=Distributions and Fourier Transforms|year=1969|publisher=Academic Press|isbn=9780122206504|url=https://books.google.com/books?id=P30Y7daiGvQC&pg=PA12|access-date=August 29, 2012|page=12}}</ref> In particular, a continuous function that is midpoint convex is convex.
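As a concrete illustration of the slope, tangent-line, and superadditivity properties above, consider <math>f(x) = x^2</math>. The slope function is <math display=block>R(x_1, x_2) = \frac{x_2^2 - x_1^2}{x_2 - x_1} = x_1 + x_2,</math> which is monotonically non-decreasing in <math>x_1</math> for every fixed <math>x_2</math>, so <math>x^2</math> is convex. The tangent-line inequality <math>f(x) \geq f(y) + f'(y)(x - y)</math> becomes <math>x^2 \geq y^2 + 2y(x - y),</math> which rearranges to <math>(x - y)^2 \geq 0</math>. Finally, since <math>f(0) = 0 \leq 0,</math> superadditivity gives <math>(a + b)^2 \geq a^2 + b^2</math> for all positive <math>a</math> and <math>b</math>, which also follows directly from <math>2ab > 0</math>.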
=== Functions of several variables ===

* A function that is marginally convex in each individual variable is not necessarily (jointly) convex. For example, the function <math>f(x, y) = x y</math> is [[bilinear map|marginally linear]], and thus marginally convex, in each variable, but not (jointly) convex.
* A function <math>f : X \to [-\infty, \infty]</math> valued in the [[extended real numbers]] <math>[-\infty, \infty] = \R \cup \{\pm\infty\}</math> is convex if and only if its [[Epigraph (mathematics)|epigraph]] <math display=block>\{(x, r) \in X \times \R ~:~ r \geq f(x)\}</math> is a convex set.
* A differentiable function <math>f</math> defined on a convex domain is convex if and only if <math>f(x) \geq f(y) + \nabla f(y)^\mathsf{T} (x - y)</math> holds for all <math>x, y</math> in the domain.
* A twice differentiable function of several variables is convex on a convex set if and only if its [[Hessian matrix]] of second [[partial derivative]]s is [[Positive-definite matrix|positive semidefinite]] on the interior of the convex set (see the examples at the end of this section).
* For a convex function <math>f,</math> the [[sublevel set]]s <math>\{x : f(x) < a\}</math> and <math>\{x : f(x) \leq a\}</math> with <math>a \in \R</math> are convex sets. A function that satisfies this property is called a '''{{em|[[quasiconvex function]]}}''' and may fail to be a convex function.
* Consequently, the set of [[Arg min|global minimisers]] of a convex function <math>f</math> is a convex set: <math>\operatorname{argmin} f</math> is convex.
* Any [[local minimum]] of a convex function is also a [[global minimum]]. A {{em|strictly}} convex function has at most one global minimum.<ref>{{cite web | url=https://math.stackexchange.com/q/337090 | title=If f is strictly convex in a convex set, show it has no more than 1 minimum | publisher=Math StackExchange | date=21 Mar 2013 | access-date=14 May 2016}}</ref>
* [[Jensen's inequality]] applies to every convex function <math>f</math>. If <math>X</math> is a random variable taking values in the domain of <math>f,</math> then <math>\operatorname{E}(f(X)) \geq f(\operatorname{E}(X)),</math> where <math>\operatorname{E}</math> denotes the [[Expected value|mathematical expectation]]. Indeed, convex functions are exactly those that satisfy Jensen's inequality for every such random variable <math>X</math>.
* A first-order [[homogeneous function]] of two positive variables <math>x</math> and <math>y</math> (that is, a function satisfying <math>f(a x, a y) = a f(x, y)</math> for all <math>a, x, y > 0</math>) that is convex in one variable must be convex in the other variable.<ref>{{cite journal|last=Altenberg|first=L.|year=2012|title=Resolvent positive linear operators exhibit the reduction phenomenon|journal=Proceedings of the National Academy of Sciences|volume=109|issue=10|pages=3705–3710}}</ref>
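As concrete illustrations of the Hessian criterion and of Jensen's inequality, consider the following. For <math>f(x, y) = x^2 + x y + y^2,</math> the Hessian matrix is <math display=block>\nabla^2 f(x, y) = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix},</math> whose eigenvalues <math>1</math> and <math>3</math> are both positive, so <math>f</math> is convex on <math>\R^2</math>. By contrast, the Hessian of <math>f(x, y) = x y</math> is <math>\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix},</math> with eigenvalues <math>\pm 1</math>; it is not positive semidefinite, consistent with <math>x y</math> being convex in each variable separately but not jointly convex. Applying Jensen's inequality to <math>f(x) = x^2</math> gives <math>\operatorname{E}(X^2) \geq (\operatorname{E}(X))^2,</math> that is, the variance of a random variable is non-negative.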