==Computational mathematics==
{{further|Computational mathematics}}
{{see also|List of algorithms#Combinatorial algorithms|l1=Combinatorial algorithms|List of algorithms#Computational science|l2=Computational science}}

===Abstract algebra===
{{further|Abstract algebra}}
* [[Chien search]]: a recursive algorithm for determining roots of polynomials defined over a finite field
* [[Schreier–Sims algorithm]]: computing a base and [[strong generating set]] (BSGS) of a [[permutation group]]
* [[Todd–Coxeter algorithm]]: procedure for generating [[coset]]s

===Computer algebra===
{{further|Computer algebra}}
* [[Buchberger's algorithm]]: finds a [[Gröbner basis]]
* [[Cantor–Zassenhaus algorithm]]: factor polynomials over finite fields
* [[Faugère F4 algorithm]]: finds a Gröbner basis (also mentions the F5 algorithm)
* [[Gosper's algorithm]]: find sums of hypergeometric terms that are themselves hypergeometric terms
* [[Knuth–Bendix completion algorithm]]: for [[rewriting]] rule systems
* [[Multivariate division algorithm]]: for [[polynomial]]s in several indeterminates
* [[Pollard's kangaroo algorithm]] (also known as Pollard's lambda algorithm): an algorithm for solving the discrete logarithm problem
* [[Polynomial long division]]: an algorithm for dividing a polynomial by another polynomial of the same or lower degree (a sketch follows this list)
* [[Risch algorithm]]: an algorithm for the calculus operation of indefinite integration (i.e. finding [[antiderivatives]])
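As a concrete illustration of the computer algebra entries above, the following is a minimal sketch of polynomial long division on dense coefficient lists (highest-degree coefficient first); the representation and the function name <code>poly_divmod</code> are illustrative assumptions, not a reference implementation.

<syntaxhighlight lang="python">
def poly_divmod(num, den):
    """Polynomial long division (sketch).

    num, den: coefficient lists, highest degree first, e.g. x^2 + 1 -> [1, 0, 1].
    Returns (quotient, remainder) in the same convention; assumes exact
    (rational or float) coefficient arithmetic and a nonzero divisor.
    """
    if not den or den[0] == 0:
        raise ZeroDivisionError("divisor must have a nonzero leading coefficient")
    num = list(num)
    quotient = []
    # Repeatedly cancel the leading term of the running remainder.
    while len(num) >= len(den):
        coeff = num[0] / den[0]          # leading coefficient of the next quotient term
        quotient.append(coeff)
        for i, d in enumerate(den):      # subtract coeff * den, aligned at the leading term
            num[i] -= coeff * d
        num.pop(0)                       # the leading term is now zero
    return quotient, num                 # remainder has degree < deg(den)

# Example: (x^3 - 2x + 1) / (x - 1) = x^2 + x - 1 with remainder 0
print(poly_divmod([1, 0, -2, 1], [1, -1]))   # ([1.0, 1.0, -1.0], [0.0])
</syntaxhighlight>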
===Geometry===
{{main category|Geometric algorithms}}
{{further|Computational geometry}}
* [[Closest pair problem]]: find the pair of points (from a set of points) with the smallest distance between them
* [[Collision detection]] algorithms: check for the collision or intersection of two given solids
* [[Cone algorithm]]: identify surface points
* [[Convex hull algorithms]]: determining the [[convex hull]] of a [[Set (mathematics)|set]] of points
** [[Graham scan]]
** [[Quickhull]]
** [[Gift wrapping algorithm]] or Jarvis march
** [[Chan's algorithm]]
** [[Kirkpatrick–Seidel algorithm]]
* [[Euclidean distance map|Euclidean distance transform]]: computes the distance between every point in a grid and a discrete collection of points
* [[Geometric hashing]]: a method for efficiently finding two-dimensional objects represented by discrete points that have undergone an [[affine transformation]]
* [[Gilbert–Johnson–Keerthi distance algorithm]]: determining the smallest distance between two [[convex set|convex]] shapes
* [[Jump-and-Walk algorithm]]: an algorithm for point location in triangulations
* [[Laplacian smoothing]]: an algorithm to smooth a polygonal mesh
* [[Line segment intersection]]: finding whether lines intersect, usually with a [[sweep line algorithm]]
** [[Bentley–Ottmann algorithm]]
** [[Shamos–Hoey algorithm]]
* [[Minimum bounding box algorithms]]: find the [[Minimum bounding box#Arbitrarily oriented minimum bounding box|oriented minimum bounding box]] enclosing a set of points
* [[Nearest neighbor search]]: find the nearest point or points to a query point
* [[Nesting algorithm]]: make the most efficient use of material or space
* [[Point in polygon]] algorithms: tests whether a given point lies within a given polygon
* [[Point set registration]] algorithms: finds the transformation between two [[point cloud|point sets]] to optimally align them
* [[Rotating calipers]]: determine all [[antipodal point|antipodal]] pairs of points and vertices on a [[convex polygon]] or [[convex hull]]
* [[Shoelace algorithm]]: determine the area of a polygon whose vertices are described by ordered pairs in the plane
* [[Triangulation (geometry)|Triangulation]]
** [[Delaunay triangulation]]
*** [[Ruppert's algorithm]] (also known as Delaunay refinement): create quality Delaunay triangulations
*** [[Chew's second algorithm]]: create quality [[constrained Delaunay triangulation]]s
** [[Marching triangles]]: reconstruct two-dimensional surface geometry from an unstructured [[point cloud]]
** [[Polygon triangulation]] algorithms: decompose a polygon into a set of triangles
** [[Voronoi diagram]]s, geometric [[duality (mathematics)|dual]] of [[Delaunay triangulation]]
*** [[Bowyer–Watson algorithm]]: create Voronoi diagrams in any number of dimensions
*** [[Fortune's Algorithm]]: create Voronoi diagrams
** [[Quasitriangulation]]

===Number theoretic algorithms===
{{further|Number theory}}
* [[Binary GCD algorithm]]: efficient way of calculating GCD
* [[Booth's multiplication algorithm]]
* [[Chakravala method]]: a cyclic algorithm to solve indeterminate quadratic equations, including [[Pell's equation]]
* [[Discrete logarithm]]:
** [[Baby-step giant-step]]
** [[Index calculus algorithm]]
** [[Pollard's rho algorithm for logarithms]]
** [[Pohlig–Hellman algorithm]]
* [[Euclidean algorithm]]: computes the [[greatest common divisor]]
* [[Extended Euclidean algorithm]]: also solves the equation ''ax'' + ''by'' = ''c'' (a sketch follows this list)
* [[Integer factorization]]: breaking an integer into its [[prime number|prime]] factors
** [[Congruence of squares]]
** [[Dixon's algorithm]]
** [[Fermat's factorization method]]
** [[General number field sieve]]
** [[Lenstra elliptic curve factorization]]
** [[Pollard's p − 1 algorithm|Pollard's ''p'' − 1 algorithm]]
** [[Pollard's rho algorithm]]
** [[prime factorization algorithm]]
** [[Quadratic sieve]]
** [[Shor's algorithm]]
** [[Special number field sieve]]
** [[Trial division]]
* [[Multiplication algorithm]]s: fast multiplication of two numbers
** [[Karatsuba algorithm]]
** [[Schönhage–Strassen algorithm]]
** [[Toom–Cook multiplication]]
* [[Modular square root]]: computing square roots modulo a prime number
** [[Tonelli–Shanks algorithm]]
** [[Cipolla's algorithm]]
** [[Berlekamp's root finding algorithm]]
* [[Odlyzko–Schönhage algorithm]]: calculates nontrivial zeroes of the [[Riemann zeta function]]
* [[Lenstra–Lenstra–Lovász lattice basis reduction algorithm|Lenstra–Lenstra–Lovász algorithm]] (also known as LLL algorithm): find a short, nearly orthogonal [[Lattice (group)|lattice]] [[Basis (linear algebra)|basis]] in polynomial time
* [[Primality test]]s: determining whether a given number is [[prime number|prime]]
** [[AKS primality test]]
** [[Baillie–PSW primality test]]
** [[Fermat primality test]]
** [[Lucas primality test]]
** [[Miller–Rabin primality test]]
** [[Sieve of Atkin]]
** [[Sieve of Eratosthenes]]
** [[Sieve of Sundaram]]
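As a concrete example from the number theoretic entries above, here is a minimal sketch of the extended Euclidean algorithm, returning gcd(''a'', ''b'') together with Bézout coefficients ''x'', ''y'' such that ''ax'' + ''by'' = gcd(''a'', ''b''); the function name is an illustrative assumption.

<syntaxhighlight lang="python">
def extended_gcd(a, b):
    """Extended Euclidean algorithm (sketch).

    Returns (g, x, y) with g = gcd(a, b) and a*x + b*y == g.
    Iterative form that carries the Bezout coefficients along.
    """
    old_r, r = a, b
    old_x, x = 1, 0
    old_y, y = 0, 1
    while r != 0:
        q = old_r // r
        old_r, r = r, old_r - q * r      # ordinary Euclidean remainder step
        old_x, x = x, old_x - q * x      # update the coefficient of a
        old_y, y = y, old_y - q * y      # update the coefficient of b
    return old_r, old_x, old_y

g, x, y = extended_gcd(240, 46)
print(g, x, y)                           # 2 -9 47
assert 240 * x + 46 * y == g
</syntaxhighlight>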
===Numerical algorithms===
{{further|Numerical analysis|List of numerical analysis topics}}

====Differential equation solving====
{{further|Differential equation}}
* [[Euler method]]
* [[Backward Euler method]]
* [[Trapezoidal rule (differential equations)]]
* [[Linear multistep method]]s
* [[Runge–Kutta methods]]
** [[Euler integration]]
* [[Multigrid method]]s (MG methods), a group of algorithms for solving differential equations using a hierarchy of discretizations
* [[Partial differential equation]]:
** [[Finite difference method]]
** [[Crank–Nicolson method]] for diffusion equations
** [[Lax–Wendroff method|Lax–Wendroff]] for wave equations
* [[Verlet integration]] ({{IPA|fr|vɛʁˈlɛ}}): integrate Newton's equations of motion

====Elementary and special functions====
{{further|Special functions}}
* [[Computing π|Computation of π]]:
** [[Borwein's algorithm]]: an algorithm to calculate the value of 1/π
** [[Gauss–Legendre algorithm]]: computes the digits of [[pi]]
** [[Chudnovsky algorithm]]: a fast method for calculating the digits of π
** [[Bailey–Borwein–Plouffe formula]]: (BBP formula) a spigot algorithm for the computation of the nth binary digit of π
* [[Division algorithm]]s: for computing quotient and/or remainder of two numbers
** [[Long division]]
** [[Restoring division]]
** [[Non-restoring division]]
** [[SRT division]]
** [[Newton–Raphson division]]: uses [[Newton's method]] to find the [[Multiplicative inverse|reciprocal]] of D, and multiply that reciprocal by N to find the final quotient Q
** [[Goldschmidt division]]
* Hyperbolic and Trigonometric Functions:
** [[BKM algorithm]]: computes [[Elementary function (differential algebra)|elementary functions]] using a table of logarithms
** [[CORDIC]]: computes hyperbolic and trigonometric functions using a table of arctangents
* Exponentiation:
** [[Addition-chain exponentiation]]: exponentiation by positive integer powers that requires a minimal number of multiplications
** [[Exponentiating by squaring]]: an algorithm used for the fast computation of [[Arbitrary-precision arithmetic|large integer]] powers of a number (a sketch follows this list)
* [[Montgomery reduction]]: an algorithm that allows [[modular arithmetic]] to be performed efficiently when the modulus is large
* [[Multiplication algorithm]]s: fast multiplication of two numbers
** [[Booth's multiplication algorithm]]: a multiplication algorithm that multiplies two signed binary numbers in two's complement notation
** [[Fürer's algorithm]]: an integer multiplication algorithm for very large numbers possessing a very low [[Computational complexity theory|asymptotic complexity]]
** [[Karatsuba algorithm]]: an efficient procedure for multiplying large numbers
** [[Schönhage–Strassen algorithm]]: an asymptotically fast multiplication algorithm for large integers
** [[Toom–Cook multiplication]]: (Toom3) a multiplication algorithm for large integers
* [[Multiplicative inverse#Algorithms|Multiplicative inverse Algorithms]]: for computing a number's multiplicative inverse (reciprocal)
** [[Newton's method#Multiplicative inverses of numbers and power series|Newton's method]]
* [[Rounding functions]]: the classic ways to round numbers
* [[Spigot algorithm]]: a way to compute the value of a [[mathematical constant]] without knowing preceding digits
* Square and Nth root of a number:
** [[Alpha max plus beta min algorithm]]: an approximation of the square-root of the sum of two squares
** [[Methods of computing square roots]]
** [[Nth root algorithm|''n''th root algorithm]]
* Summation:
** [[Binary splitting]]: a [[Divide and conquer algorithm|divide and conquer]] technique which speeds up the numerical evaluation of many types of series with rational terms
** [[Kahan summation algorithm]]: a more accurate method of summing floating-point numbers
* [[Unrestricted algorithm]]
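As a concrete example from the exponentiation entries above, the following is a minimal sketch of exponentiation by squaring (right-to-left binary method), optionally reducing modulo <code>modulus</code> at each step; the function name and the optional parameter are illustrative assumptions.

<syntaxhighlight lang="python">
def power(base, exponent, modulus=None):
    """Exponentiation by squaring, right-to-left binary method (sketch).

    Computes base**exponent for a non-negative integer exponent using
    O(log exponent) multiplications; if modulus is given, every product
    is reduced, which is the usual trick for large modular powers.
    """
    if exponent < 0:
        raise ValueError("non-negative exponent expected")
    result = 1
    while exponent > 0:
        if exponent & 1:                 # current low bit set: multiply this power in
            result = result * base if modulus is None else (result * base) % modulus
        base = base * base if modulus is None else (base * base) % modulus
        exponent >>= 1                   # move on to the next bit
    return result

print(power(3, 13))                      # 1594323
print(power(3, 13, 1000))                # 323, matches pow(3, 13, 1000)
</syntaxhighlight>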
====Geometric====
* [[Radon transform#Radon inversion formula|Filtered back-projection]]: efficiently computes the inverse 2-dimensional [[Radon transform]]
* [[Level set method]] (LSM): a numerical technique for tracking interfaces and shapes

====Interpolation and extrapolation====
{{further|Interpolation|Extrapolation}}
* [[Birkhoff interpolation]]: an extension of polynomial interpolation
* [[Cubic interpolation]]
* [[Hermite interpolation]]
* [[Lagrange interpolation]]: interpolation using [[Lagrange polynomial]]s
* [[Linear interpolation]]: a method of curve fitting using linear polynomials
* [[Monotone cubic interpolation]]: a variant of cubic interpolation that preserves monotonicity of the data set being interpolated
* [[Multivariate interpolation]]
** [[Bicubic interpolation]]: a generalization of [[cubic interpolation]] to two dimensions
** [[Bilinear interpolation]]: an extension of [[linear interpolation]] for interpolating functions of two variables on a regular grid
** [[Lanczos resampling]] ("Lanzosh"): a multivariate interpolation method used to compute new values for any digitally sampled data
** [[Nearest-neighbor interpolation]]
** [[Tricubic interpolation]]: a generalization of [[cubic interpolation]] to three dimensions
* [[Pareto interpolation]]: a method of estimating the median and other properties of a population that follows a [[Pareto distribution]]
* [[Polynomial interpolation]]
** [[Neville's algorithm]] (a sketch follows this list)
* [[Spline interpolation]]: reduces the error associated with [[Runge's phenomenon]]
** [[De Boor algorithm]]: [[B-spline]]s
** [[De Casteljau's algorithm]]: [[Bézier curve]]s
* [[Trigonometric interpolation]]
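As a concrete example from the interpolation entries above, here is a minimal sketch of Neville's algorithm, which evaluates at ''x'' the unique polynomial passing through a set of points with distinct abscissas; the function name is an illustrative assumption.

<syntaxhighlight lang="python">
def neville(xs, ys, x):
    """Neville's algorithm (sketch).

    Evaluates at x the unique polynomial interpolating the points
    (xs[i], ys[i]) with distinct xs; O(n^2) time, working on a copy.
    """
    n = len(xs)
    p = list(ys)
    for k in range(1, n):                # k = number of extra points combined so far
        for i in range(n - k):
            # Combine the interpolants over xs[i..i+k-1] and xs[i+1..i+k].
            p[i] = ((x - xs[i + k]) * p[i] + (xs[i] - x) * p[i + 1]) / (xs[i] - xs[i + k])
    return p[0]

# Interpolate y = x^2 through three points and evaluate at x = 3
print(neville([0.0, 1.0, 2.0], [0.0, 1.0, 4.0], 3.0))   # 9.0
</syntaxhighlight>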
====Linear algebra====
{{further|Numerical linear algebra}}
* Krylov methods (for large sparse matrix problems; ranked by SISC as the third most important class of numerical methods of the 20th century, after the fast Fourier transform and fast multipole methods)
* [[Eigenvalue algorithm]]s
** [[Arnoldi iteration]]
** [[Inverse iteration]]
** [[Jacobi eigenvalue algorithm|Jacobi method]]
** [[Lanczos iteration]]
** [[Power iteration]]
** [[QR algorithm]]
** [[Rayleigh quotient iteration]]
* [[Gram–Schmidt process]]: orthogonalizes a set of vectors
* [[Matrix multiplication algorithm]]s
** [[Cannon's algorithm]]: a [[distributed algorithm]] for [[matrix multiplication]] especially suitable for computers laid out in an N × N mesh
** [[Coppersmith–Winograd algorithm]]: square [[matrix multiplication]]
** [[Freivalds' algorithm]]: a randomized algorithm used to verify matrix multiplication
** [[Strassen algorithm]]: faster [[matrix multiplication]]
* {{anchor|Solving systems of linear equations}}Solving [[system of linear equations|systems of linear equations]]
** [[Biconjugate gradient method]]: solves systems of linear equations
** [[Conjugate gradient]]: an algorithm for the numerical solution of particular systems of linear equations
** [[Gaussian elimination]]
** [[Gauss–Jordan elimination]]: solves systems of linear equations
** [[Gauss–Seidel method]]: solves systems of linear equations iteratively
** [[Levinson recursion]]: solves equations involving a [[Toeplitz matrix]]
** [[Stone's method]]: also known as the strongly implicit procedure or SIP, is an algorithm for solving a sparse linear system of equations
** [[Successive over-relaxation]] (SOR): method used to speed up convergence of the [[Gauss–Seidel method]]
** [[Tridiagonal matrix algorithm]] (Thomas algorithm): solves systems of tridiagonal equations
* [[Sparse matrix]] algorithms
** [[Cuthill–McKee algorithm]]: reduce the [[bandwidth (matrix theory)|bandwidth]] of a [[symmetric sparse matrix]]
** [[Minimum degree algorithm]]: permute the rows and columns of a [[symmetric sparse matrix]] before applying the [[Cholesky decomposition]]
** [[Symbolic Cholesky decomposition]]: efficient way of storing a [[sparse matrix]]

====Monte Carlo====
{{further|Monte Carlo method}}
* [[Gibbs sampling]]: generates a sequence of samples from the joint probability distribution of two or more random variables
* [[Hybrid Monte Carlo]]: generates a sequence of samples using [[Hamiltonian dynamics|Hamiltonian]] weighted [[Markov chain Monte Carlo]], from a probability distribution which is difficult to sample directly
* [[Metropolis–Hastings algorithm]]: used to generate a sequence of samples from the [[probability distribution]] of one or more variables
* [[Wang and Landau algorithm]]: an extension of [[Metropolis–Hastings algorithm]] sampling

====Numerical integration====
{{further|Numerical integration}}
* [[MISER algorithm]]: Monte Carlo simulation, [[numerical integration]]

====Root finding====
{{main|Root-finding algorithm}}
* [[Bisection method]] (a sketch follows this list)
* [[False position method]] and Illinois method: 2-point, bracketing
* [[Halley's method]]: uses first and second derivatives
* [[ITP Method|ITP method]]: minmax optimal and superlinear convergence simultaneously
* [[Muller's method]]: 3-point, quadratic interpolation
* [[Newton's method]]: finds zeros of functions with [[calculus]]
* [[Ridder's method]]: 3-point, exponential scaling
* [[Secant method]]: 2-point, 1-sided
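As a concrete example from the root finding entries above, the following is a minimal sketch of the bisection method for a continuous function whose values at the endpoints have opposite signs; the function name and the tolerance defaults are illustrative assumptions.

<syntaxhighlight lang="python">
def bisect(f, a, b, tol=1e-12, max_iter=200):
    """Bisection method (sketch).

    Finds a root of a continuous f in [a, b], assuming f(a) and f(b) have
    opposite signs (a root then exists by the intermediate value theorem);
    halves the bracketing interval until it is shorter than tol.
    """
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    for _ in range(max_iter):
        m = (a + b) / 2
        fm = f(m)
        if fm == 0 or (b - a) / 2 < tol:
            return m
        if fa * fm < 0:                  # the sign change, hence a root, lies in [a, m]
            b, fb = m, fm
        else:                            # otherwise it lies in [m, b]
            a, fa = m, fm
    return (a + b) / 2

# Root of x^2 - 2 on [1, 2] is sqrt(2) ≈ 1.41421356...
print(bisect(lambda x: x * x - 2, 1.0, 2.0))
</syntaxhighlight>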
===Optimization algorithms===
{{main|Mathematical optimization}}
* [[Hybrid algorithm|Hybrid]] algorithms
* [[Alpha–beta pruning]]: search to reduce number of nodes in minimax algorithm
* [[Branch and bound]]
* [[Bruss algorithm]]: see [[odds algorithm]]
* [[Chain matrix multiplication]]
* [[Combinatorial optimization]]: optimization problems where the set of feasible solutions is discrete
** [[Greedy randomized adaptive search procedure]] (GRASP): successive constructions of a greedy randomized solution and subsequent iterative improvements of it through a local search
** [[Hungarian method]]: a combinatorial optimization algorithm which solves the [[assignment problem]] in polynomial time
* [[Constraint satisfaction]]{{anchor|Constraint satisfaction}}
** General algorithms for constraint satisfaction
*** [[AC-3 algorithm]]
*** [[Difference map algorithm]]
*** [[Min conflicts algorithm]]
** [[Chaff algorithm]]: an algorithm for solving instances of the [[Boolean satisfiability problem]]
** [[Davis–Putnam algorithm]]: check the validity of a first-order logic formula
** [[DPLL algorithm|Davis–Putnam–Logemann–Loveland algorithm]] (DPLL): an algorithm for deciding the satisfiability of propositional logic formulas in [[conjunctive normal form]], i.e. for solving the [[CNF-SAT]] problem
** [[Exact cover]] problem
*** [[Algorithm X]]: a [[nondeterministic algorithm]]
*** [[Dancing Links]]: an efficient implementation of Algorithm X
* [[Cross-entropy method]]: a general Monte Carlo approach to combinatorial and continuous multi-extremal optimization and [[importance sampling]]
* [[Differential evolution]]
* [[Dynamic Programming]]: problems exhibiting the properties of [[overlapping subproblem]]s and [[optimal substructure]]
* [[Ellipsoid method]]: an algorithm for solving convex optimization problems
* [[Evolutionary computation]]: optimization inspired by biological mechanisms of evolution
** [[Evolution strategy]]
** [[Gene expression programming]]
** [[Genetic algorithms]]
*** [[Fitness proportionate selection]] – also known as roulette-wheel selection
*** [[Stochastic universal sampling]]
*** [[Truncation selection]]
*** [[Tournament selection]]
** [[Memetic algorithm]]
** [[Swarm intelligence]]
*** [[Ant colony optimization]]
*** [[Bees algorithm]]: a search algorithm which mimics the food foraging behavior of swarms of honey bees
*** [[Particle swarm optimization|Particle swarm]]
* [[Frank-Wolfe algorithm]]: an iterative first-order optimization algorithm for constrained convex optimization
* [[Golden-section search]]: an algorithm for finding the maximum of a real function (a sketch follows at the end of this section)
* [[Gradient descent]]
* [[Hyperparameter optimization#Grid search|Grid Search]]
* [[Harmony search]] (HS): a [[metaheuristic]] algorithm mimicking the improvisation process of musicians
* [[Interior point method]]
* {{anchor|Linear programming}}[[Linear programming]]
** [[Benson's algorithm]]: an algorithm for solving linear [[vector optimization]] problems
** [[Dantzig–Wolfe decomposition]]: an algorithm for solving linear programming problems with special structure
** [[Delayed column generation]]
** [[Integer linear programming]]: solve linear programming problems where some or all the unknowns are restricted to integer values
*** [[Branch and cut]]
*** [[Cutting-plane method]]
** [[Karmarkar's algorithm]]: the first reasonably efficient algorithm that solves the [[linear programming]] problem in [[polynomial time]]
** [[Simplex algorithm]]: an algorithm for solving [[linear programming]] problems
* [[Line search]]
* [[Local search (optimization)|Local search]]: a metaheuristic for solving computationally hard optimization problems
** [[Random-restart hill climbing]]
** [[Tabu search]]
* [[Minimax#Minimax algorithm with alternate moves|Minimax]]: used in game programming
* [[Nearest neighbor search]] (NNS): find closest points in a [[metric space]]
** [[Best Bin First]]: find an approximate solution to the [[nearest neighbor search]] problem in very-high-dimensional spaces
* [[Newton's method in optimization]]
* [[Nonlinear optimization]]
** [[BFGS method]]: a [[nonlinear optimization]] algorithm
** [[Gauss–Newton algorithm]]: an algorithm for solving nonlinear [[least squares]] problems
** [[Levenberg–Marquardt algorithm]]: an algorithm for solving nonlinear [[least squares]] problems
** [[Nelder–Mead method]] (downhill simplex method): a [[nonlinear optimization]] algorithm
* [[Odds algorithm]] (Bruss algorithm): finds the optimal strategy to predict the last specific event in a random sequence
* [[Hyperparameter optimization#Random search|Random Search]]
* [[Simulated annealing]]
* [[Stochastic tunneling]]
* [[Subset sum problem|Subset sum]] algorithm
* [[A hybrid HS-LS conjugate gradient algorithm]] (see https://doi.org/10.1016/j.cam.2023.115304)
* [[A hybrid BFGS-Like method]] (see https://doi.org/10.1016/j.cam.2024.115857)
* [[Conjugate gradient method]]s (see https://doi.org/10.1016/j.jksus.2022.101923)
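As a concrete example from the optimization entries above (see the golden-section search item), here is a minimal sketch that brackets the maximum of a unimodal function on an interval; the function name is an illustrative assumption, and this simple form re-evaluates the function twice per iteration, whereas the classic formulation reuses one evaluation per step.

<syntaxhighlight lang="python">
import math

def golden_section_max(f, a, b, tol=1e-8):
    """Golden-section search (sketch).

    Locates the maximum of a unimodal function f on [a, b] by shrinking
    the bracketing interval by the inverse golden ratio at every step.
    """
    invphi = (math.sqrt(5) - 1) / 2      # 1/phi, about 0.618
    while b - a > tol:
        c = b - invphi * (b - a)         # interior probe points, c < d
        d = a + invphi * (b - a)
        if f(c) > f(d):                  # maximum lies in [a, d]
            b = d
        else:                            # maximum lies in [c, b]
            a = c
    return (a + b) / 2

# Maximum of -(x - 2)^2 on [0, 5] is at x = 2
print(golden_section_max(lambda x: -(x - 2) ** 2, 0.0, 5.0))   # ~2.0
</syntaxhighlight>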