== Applications == Because of its efficiency and accuracy in computing first- and higher-order [[derivative]]s, automatic differentiation has diverse applications in [[scientific computing]] and [[mathematics]], and numerous computational implementations of it exist, including [[INTLAB]], [[Sollya]], and [[InCLosure]].<ref name="Rump.1999">[[Siegfried M. Rump]] (1999). INTLAB–INTerval LABoratory. In T. Csendes, editor, Developments in Reliable Computing, pages 77–104. Kluwer Academic Publishers, Dordrecht.</ref><ref name="Chevillard.Joldes.Lauter.2010">S. Chevillard, M. Joldes, and C. Lauter. Sollya (2010). An Environment for the Development of Numerical Codes. In K. Fukuda, J. van der Hoeven, M. Joswig, and N. Takayama, editors, Mathematical Software - ICMS 2010, volume 6327 of Lecture Notes in Computer Science, pages 28–31, Heidelberg, Germany. Springer.</ref><ref name="Dawood.Inc4.2022">[[Hend Dawood]] (2022). [[InCLosure]] (Interval enCLosure)–A Language and Environment for Reliable Scientific Computing. Computer Software, Version 4.0. Department of Mathematics, Faculty of Science, Cairo University, Giza, Egypt, September 2022. url: https://doi.org/10.5281/zenodo.2702404.</ref> In practice, there are two modes of algorithmic differentiation: a forward mode and a reverse mode.<ref name="Dawood.Megahed.2023"/><ref name="Dawood.Megahed.2019"/> The two modes are complementary, and both have a wide variety of applications in, e.g., non-linear [[optimization]], [[sensitivity analysis]], [[robotics]], [[machine learning]], [[computer graphics]], and [[computer vision]].<ref name="Dawood-attribution"/><ref name="Fries.2019">Christian P. Fries (2019). Stochastic Automatic Differentiation: Automatic Differentiation for Monte-Carlo Simulations. Quantitative Finance, 19(6):1043–1059. doi: 10.1080/14697688.2018.1556398. url: https://doi.org/10.1080/14697688.2018.1556398.</ref><ref name="Dawood.Megahed.2023"/><ref name="Dawood.Megahed.2019"/><ref name="Dawood.Dawood.2020">[[Hend Dawood]] and [[Yasser Dawood]] (2020). Universal Intervals: Towards a Dependency-Aware Interval Algebra. In S. Chakraverty, editor, Mathematical Methods in Interdisciplinary Sciences, chapter 10, pages 167–214. John Wiley & Sons, Hoboken, New Jersey. ISBN 978-1-119-58550-3. doi: 10.1002/9781119585640.ch10. url: https://doi.org/10.1002/9781119585640.ch10.</ref><ref name="Dawood.2014">[[Hend Dawood]] (2014). Interval Mathematics as a Potential Weapon against Uncertainty. In S. Chakraverty, editor, Mathematics of Uncertainty Modeling in the Analysis of Engineering and Science Problems, chapter 1, pages 1–38. IGI Global, Hershey, PA. ISBN 978-1-4666-4991-0.</ref> Automatic differentiation is particularly important in the field of [[machine learning]]: for example, it allows [[backpropagation]] to be implemented in a [[neural network (machine learning)|neural network]] without manually computing derivatives.
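The forward mode mentioned above can be sketched with dual numbers: each value is augmented with a derivative component that is propagated through every arithmetic operation. The following Python snippet is a minimal illustration only; the class and function names are hypothetical and not taken from any particular library.

```python
class Dual:
    """A dual number a + b*eps (eps^2 = 0); b carries the derivative."""

    def __init__(self, value, deriv=0.0):
        self.value = value
        self.deriv = deriv

    def _coerce(self, other):
        return other if isinstance(other, Dual) else Dual(other)

    def __add__(self, other):
        other = self._coerce(other)
        # Sum rule: (u + v)' = u' + v'
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = self._coerce(other)
        # Product rule: (u * v)' = u'v + uv'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__


def value_and_derivative(f, x):
    """Evaluate f and its derivative at x in a single forward pass."""
    result = f(Dual(x, 1.0))
    return result.value, result.deriv


# Example: f(x) = x^2 + 3x, so f(2) = 10 and f'(2) = 7.
val, der = value_and_derivative(lambda x: x * x + 3 * x, 2.0)
```

Reverse mode, by contrast, records the computation and propagates derivatives from outputs back to inputs, which is the pattern backpropagation follows.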