== Bracketing methods ==
Bracketing methods determine successively smaller intervals (brackets) that contain a root. When the interval is small enough, a root has been located to the desired accuracy. These methods generally rely on the [[intermediate value theorem]], which asserts that if a continuous function has values of opposite signs at the end points of an interval, then the function has at least one root in the interval. They therefore require starting with an interval such that the function takes opposite signs at its end points. In the case of [[polynomial]]s, however, other methods such as [[Descartes' rule of signs]], [[Budan's theorem]] and [[Sturm's theorem]] can bound or determine the number of roots in an interval. They lead to efficient algorithms for [[real-root isolation]] of polynomials, which find all real roots with a guaranteed accuracy.

=== Bisection method ===
The simplest root-finding algorithm is the [[bisection method]]. Let {{math|''f''}} be a [[continuous function]] for which one knows an interval {{math|[''a'', ''b'']}} such that {{math|''f''(''a'')}} and {{math|''f''(''b'')}} have opposite signs (a bracket). Let {{math|1=''c'' = (''a'' + ''b'')/2}} be the midpoint of the interval (the point that bisects it). Then either {{math|''f''(''a'')}} and {{math|''f''(''c'')}}, or {{math|''f''(''c'')}} and {{math|''f''(''b'')}}, have opposite signs, and the size of the interval has been halved. Although the bisection method is robust, it gains exactly one [[bit]] of accuracy with each iteration. Therefore, the number of function evaluations required to find an ''Ξ΅''-approximate root is <math>\log_2\frac{b-a}{\varepsilon}</math>. Other methods, under appropriate conditions, can gain accuracy faster.

=== False position (''regula falsi'') ===
The [[false position method]], also called the ''regula falsi'' method, is similar to the bisection method, but instead of the midpoint of the interval it uses the [[x-intercept|{{math|''x''}}-intercept]] of the line that connects the function values at the endpoints of the interval, that is
:<math>c= \frac{af(b)-bf(a)}{f(b)-f(a)}.</math>
False position is similar to the [[secant method]], except that, instead of retaining the last two points, it keeps one point on either side of the root. The false position method can be faster than the bisection method and, unlike the secant method, it cannot diverge. However, naive implementations may fail to converge because of roundoff errors that produce a wrong sign for {{math|''f''(''c'')}}; this typically occurs when the [[derivative]] of {{mvar|f}} is large in the neighborhood of the root.
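Both bracketing iterations described above can be sketched in a few lines of code. The following Python sketch is illustrative only: the function names, the tolerance <code>tol</code> and the iteration cap <code>max_iter</code> are arbitrary choices made here, not part of any standard library.

<syntaxhighlight lang="python">
def bisect(f, a, b, tol=1e-12, max_iter=200):
    """Bisection: halve the bracket [a, b] until it is shorter than tol."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    for _ in range(max_iter):
        c = (a + b) / 2
        fc = f(c)
        if fc == 0 or (b - a) / 2 < tol:
            return c
        if fa * fc < 0:           # root lies in [a, c]
            b, fb = c, fc
        else:                     # root lies in [c, b]
            a, fa = c, fc
    return (a + b) / 2


def false_position(f, a, b, tol=1e-12, max_iter=200):
    """Regula falsi: use the x-intercept of the secant line instead of the midpoint."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    c = a
    for _ in range(max_iter):
        # x-intercept of the line through (a, f(a)) and (b, f(b))
        c = (a * fb - b * fa) / (fb - fa)
        fc = f(c)
        if abs(fc) < tol:
            return c
        if fa * fc < 0:           # keep one point on each side of the root
            b, fb = c, fc
        else:
            a, fa = c, fc
    return c


# Example: find the positive root of x^2 - 2 (i.e. sqrt(2)) in the bracket [1, 2].
f = lambda x: x * x - 2
print(bisect(f, 1.0, 2.0))          # approximately 1.41421356...
print(false_position(f, 1.0, 2.0))  # approximately 1.41421356...
</syntaxhighlight>

Both functions preserve the bracketing invariant (the two retained endpoints always have function values of opposite signs); they differ only in how the trial point <code>c</code> is chosen.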