==Origin==
The [[derivative]] of any constant function is zero. Once one has found one antiderivative <math>F(x)</math> for a function <math>f(x),</math> adding or subtracting any constant <math>C</math> gives another antiderivative, because <math display="inline">\frac{d}{dx}(F(x) + C) = \frac{d}{dx}F(x) + \frac{d}{dx}C = F'(x) = f(x) .</math> The constant is a way of expressing that every function with at least one antiderivative has infinitely many of them.

Let <math>F:\R\to\R</math> and <math>G:\R\to\R</math> be two everywhere differentiable functions. Suppose that <math>F\,'(x) = G\,'(x)</math> for every real number ''x''. Then there exists a real number <math>C</math> such that <math>F(x) - G(x) = C</math> for every real number ''x''. To prove this, notice that <math>[F(x) - G(x)]' = 0 ,</math> so <math>F</math> can be replaced by <math>F-G</math> and <math>G</math> by the constant function <math>0,</math> reducing the goal to proving that an everywhere differentiable function whose derivative is always zero must be constant. Choose a real number <math>a,</math> and let <math>C = F(a) .</math> For any ''x'', the [[fundamental theorem of calculus]], together with the assumption that the derivative of <math>F</math> vanishes, implies that <math display="block">\begin{align} 0 &= \int_a^x F'(t)\,dt \\ &= F(x)-F(a) \\ &= F(x)-C , \end{align}</math> so <math>F(x)=C</math> for every ''x'', showing that <math>F</math> is a constant function.

Two facts are crucial in this proof. First, the real line is [[Connected space|connected]]. If the real line were not connected, one would not always be able to integrate from the fixed ''a'' to any given ''x''. For example, for functions defined on the union of the intervals [0,1] and [2,3], with ''a'' = 0, it would not be possible to integrate from 0 to 3, because the function is not defined between 1 and 2.
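The two computations above — that adding a constant leaves the derivative unchanged, and that integrating a vanishing-derivative function from ''a'' to ''x'' recovers <math>F(x)-F(a)</math> — can be sketched numerically. The following Python snippet is an illustrative check only; the choice of <math>f(x) = \cos x</math>, the sample points, and the step sizes are assumptions made here, not part of the article:

```python
import math

def f(x):
    # f(x) = cos(x), whose antiderivative is F(x) = sin(x).
    return math.cos(x)

def F(x, C=0.0):
    # Adding any constant C gives another antiderivative.
    return math.sin(x) + C

def derivative(g, x, h=1e-6):
    # Central finite-difference approximation of g'(x).
    return (g(x + h) - g(x - h)) / (2 * h)

def integrate(g, a, b, n=10_000):
    # Midpoint-rule approximation of the definite integral of g over [a, b].
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

# d/dx (F(x) + C) equals f(x) regardless of the constant C.
for C in (0.0, 1.0, -3.5):
    assert abs(derivative(lambda t: F(t, C), 0.7) - f(0.7)) < 1e-6

# Fundamental theorem of calculus: the integral of F' from a to x is F(x) - F(a).
a, x = 0.0, 2.0
assert abs(integrate(f, a, x) - (F(x) - F(a))) < 1e-6
```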
On such a disconnected domain, there will be ''two'' constants, one for each [[Connected space|connected component]] of the [[Function domain|domain]]. In general, by replacing constants with [[locally constant function]]s, one can extend this theorem to disconnected domains. For example, there are two constants of integration for <math display="inline">\int dx/x</math>, and infinitely many for <math display="inline">\int \tan x\,dx</math>, so, for example, the general form for the integral of 1/''x'' is:<ref>"[http://golem.ph.utexas.edu/category/2012/03/reader_survey_logx_c.html Reader Survey: log|''x''| + ''C'']", Tom Leinster, ''The ''n''-category Café'', March 19, 2012</ref><ref>{{cite book|title=The calculus lifesaver : all the tools you need to excel at calculus|url=https://archive.org/details/isbn_9780691130880|url-access=registration|last1=Banner|first1=Adrian|date=2007|publisher=Princeton University Press|isbn=978-0-691-13088-0|location=Princeton [u.a.]|page=[https://archive.org/details/isbn_9780691130880/page/380 380]}}</ref> <math display="block">\int \frac{dx}{x} = \begin{cases} \ln \left|x \right| + C^- & x < 0\\ \ln \left|x \right| + C^+ & x > 0 \end{cases}</math>

Second, <math>F</math> and <math>G</math> were assumed to be everywhere differentiable. If <math>F</math> and <math>G</math> are not differentiable at even one point, then the theorem might fail. As an example, let <math>F(x)</math> be the [[Heaviside step function]], which is zero for negative values of ''x'' and one for non-negative values of ''x'', and let <math>G(x) = 0 .</math> Then the derivative of <math>F</math> is zero where it is defined, and the derivative of <math>G</math> is always zero. Yet it is clear that <math>F</math> and <math>G</math> do not differ by a constant. Even if it is assumed that <math>F</math> and <math>G</math> are everywhere continuous and [[almost everywhere]] differentiable, the theorem still fails.
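The Heaviside counterexample above can be sketched numerically in Python; the sample points and the finite-difference step are illustrative choices made here, not part of the article:

```python
def F(x):
    # Heaviside step function: 0 for x < 0, 1 for x >= 0 (not differentiable at 0).
    return 0.0 if x < 0 else 1.0

def G(x):
    return 0.0

def derivative(g, x, h=1e-6):
    # Central finite-difference approximation of g'(x).
    return (g(x + h) - g(x - h)) / (2 * h)

# Away from the problem point x = 0, both derivatives vanish...
for x in (-2.0, -1.0, 1.0, 2.0):
    assert derivative(F, x) == 0.0
    assert derivative(G, x) == 0.0

# ...yet F - G is not constant: it jumps from 0 to 1 across x = 0.
assert F(-1.0) - G(-1.0) == 0.0
assert F(1.0) - G(1.0) == 1.0
```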
As an example, take <math>F</math> to be the [[Cantor function]] and again let <math>G = 0 .</math>

It turns out that adding and subtracting constants is the only flexibility available in finding different antiderivatives of the same function. That is, all antiderivatives are the same up to a constant. To express this fact for <math>\cos(x),</math> one can write: <math display="block">\int \cos(x)\,dx = \sin(x) + C,</math> where <math>C</math> is the '''constant of integration'''. Differentiation confirms that <math>\sin(x) + C</math> is an antiderivative of <math>\cos(x)</math> for every value of <math>C</math>: <math display="block">\begin{align} \frac{d}{dx}[\sin(x) + C] &= \frac{d}{dx} \sin(x) + \frac{d}{dx}C \\ &= \cos(x) + 0 \\ &= \cos(x) \end{align}</math>
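The earlier two-constant antiderivative of 1/''x'' can also be checked numerically. In this Python sketch the two constant values are arbitrary illustrations; only the fact that they may differ across the two components matters:

```python
import math

# Arbitrary illustrative constants, one per connected component of the domain.
C_NEG, C_POS = 5.0, -2.0

def F(x):
    # General antiderivative of 1/x: ln|x| + C^- for x < 0, ln|x| + C^+ for x > 0.
    return math.log(abs(x)) + (C_NEG if x < 0 else C_POS)

def derivative(g, x, h=1e-6):
    # Central finite-difference approximation of g'(x).
    return (g(x + h) - g(x - h)) / (2 * h)

# Despite the two unrelated constants, F'(x) = 1/x on both components.
for x in (-3.0, -0.5, 0.5, 3.0):
    assert abs(derivative(F, x) - 1.0 / x) < 1e-4
```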