List of algorithms
===Optimization algorithms===
{{main|Mathematical optimization}}
[[Hybrid algorithm|Hybrid]] algorithms
* [[Alpha–beta pruning]]: search to reduce number of nodes in minimax algorithm
* [[Branch and bound]]
* [[Bruss algorithm]]: see [[odds algorithm]]
* [[Chain matrix multiplication]]
* [[Combinatorial optimization]]: optimization problems where the set of feasible solutions is discrete
** [[Greedy randomized adaptive search procedure]] (GRASP): successive constructions of a greedy randomized solution and subsequent iterative improvements of it through a local search
** [[Hungarian method]]: a combinatorial optimization algorithm which solves the [[assignment problem]] in polynomial time
* [[Constraint satisfaction]]{{anchor|Constraint satisfaction}}
** General algorithms for constraint satisfaction
*** [[AC-3 algorithm]]
*** [[Difference map algorithm]]
*** [[Min conflicts algorithm]]
** [[Chaff algorithm]]: an algorithm for solving instances of the [[Boolean satisfiability problem]]
** [[Davis–Putnam algorithm]]: checks the validity of a first-order logic formula
** [[DPLL algorithm|Davis–Putnam–Logemann–Loveland algorithm]] (DPLL): an algorithm for deciding the satisfiability of a propositional logic formula in [[conjunctive normal form]], i.e. for solving the [[CNF-SAT]] problem
** [[Exact cover]] problem
*** [[Algorithm X]]: a [[nondeterministic algorithm]]
*** [[Dancing Links]]: an efficient implementation of Algorithm X
* [[Cross-entropy method]]: a general Monte Carlo approach to combinatorial and continuous multi-extremal optimization and [[importance sampling]]
* [[Differential evolution]]
* [[Dynamic programming]]: problems exhibiting the properties of [[overlapping subproblem]]s and [[optimal substructure]]
* [[Ellipsoid method]]: an algorithm for solving convex optimization problems
* [[Evolutionary computation]]: optimization inspired by biological mechanisms of evolution
** [[Evolution strategy]]
** [[Gene expression programming]]
** [[Genetic algorithms]]
*** [[Fitness proportionate selection]] – also known as roulette-wheel selection
*** [[Stochastic universal sampling]]
*** [[Truncation selection]]
*** [[Tournament selection]]
** [[Memetic algorithm]]
** [[Swarm intelligence]]
*** [[Ant colony optimization]]
*** [[Bees algorithm]]: a search algorithm which mimics the food foraging behavior of swarms of honey bees
*** [[Particle swarm optimization|Particle swarm]]
* [[Frank–Wolfe algorithm]]: an iterative first-order optimization algorithm for constrained convex optimization
* [[Golden-section search]]: an algorithm for finding the maximum of a real function
* [[Gradient descent]]
* [[Hyperparameter optimization#Grid search|Grid search]]
* [[Harmony search]] (HS): a [[metaheuristic]] algorithm mimicking the improvisation process of musicians
* [[Interior point method]]
* {{anchor|Linear programming}}[[Linear programming]]
** [[Benson's algorithm]]: an algorithm for solving linear [[vector optimization]] problems
** [[Dantzig–Wolfe decomposition]]: an algorithm for solving linear programming problems with special structure
** [[Delayed column generation]]
** [[Integer linear programming]]: solve linear programming problems where some or all of the unknowns are restricted to integer values
*** [[Branch and cut]]
*** [[Cutting-plane method]]
** [[Karmarkar's algorithm]]: the first reasonably efficient algorithm that solves the [[linear programming]] problem in [[polynomial time]]
** [[Simplex algorithm]]: an algorithm for solving [[linear programming]] problems
* [[Line search]]
* [[Local search (optimization)|Local search]]: a metaheuristic for solving computationally hard optimization problems
** [[Random-restart hill climbing]]
** [[Tabu search]]
* [[Minimax#Minimax algorithm with alternate moves|Minimax]]: used in game programming
* [[Nearest neighbor search]] (NNS): find closest points in a [[metric space]]
** [[Best Bin First]]: find an approximate solution to the [[nearest neighbor search]] problem in very-high-dimensional spaces
* [[Newton's method in optimization]]
* [[Nonlinear optimization]]
** [[BFGS method]]: a [[nonlinear optimization]] algorithm
** [[Gauss–Newton algorithm]]: an algorithm for solving nonlinear [[least squares]] problems
** [[Levenberg–Marquardt algorithm]]: an algorithm for solving nonlinear [[least squares]] problems
** [[Nelder–Mead method]] (downhill simplex method): a [[nonlinear optimization]] algorithm
* [[Odds algorithm]] (Bruss algorithm): finds the optimal strategy for predicting the last specific event in a random sequence
* [[Hyperparameter optimization#Random search|Random search]]
* [[Simulated annealing]]
* [[Stochastic tunneling]]
* [[Subset sum problem|Subset sum]] algorithm
* Hybrid HS–LS conjugate gradient algorithm (see https://doi.org/10.1016/j.cam.2023.115304)
* Hybrid BFGS-like method (see https://doi.org/10.1016/j.cam.2024.115857)
* [[Conjugate gradient method]]s (see https://doi.org/10.1016/j.jksus.2022.101923)
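To illustrate one of the simpler entries above, a minimal golden-section search for the maximum of a unimodal function might look like the following sketch in Python (the objective and bracketing interval are made-up examples, not from this list):

```python
import math

def golden_section_max(f, a, b, tol=1e-8):
    """Locate the maximum of a unimodal function f on [a, b]
    by repeatedly shrinking the bracket by the inverse golden ratio."""
    invphi = (math.sqrt(5) - 1) / 2  # 1/phi, about 0.618
    while b - a > tol:
        c = b - invphi * (b - a)
        d = a + invphi * (b - a)
        if f(c) < f(d):   # the maximum lies in [c, b]
            a = c
        else:             # the maximum lies in [a, d]
            b = d
    return (a + b) / 2

# Example: f(x) = -(x - 2)^2 is unimodal with its maximum at x = 2
x_star = golden_section_max(lambda x: -(x - 2) ** 2, 0.0, 5.0)
```

Each iteration keeps one of the two interior evaluation points, which is what distinguishes golden-section search from naive trisection.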
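The alpha–beta pruning entry can likewise be sketched over a toy game tree represented as nested lists (the tree shape and leaf values here are invented for the example):

```python
import math

def alphabeta(node, alpha, beta, maximizing):
    """Minimax with alpha-beta pruning over a nested-list game tree.
    Leaves are numbers (static evaluations); internal nodes are lists."""
    if not isinstance(node, list):
        return node  # leaf node
    if maximizing:
        value = -math.inf
        for child in node:
            value = max(value, alphabeta(child, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:   # beta cutoff: the minimizer avoids this branch
                break
        return value
    else:
        value = math.inf
        for child in node:
            value = min(value, alphabeta(child, alpha, beta, True))
            beta = min(beta, value)
            if alpha >= beta:   # alpha cutoff: the maximizer avoids this branch
                break
        return value

# Depth-2 tree: the maximizer picks among three minimizing children
tree = [[3, 5], [2, 9], [0, 7]]
best = alphabeta(tree, -math.inf, math.inf, True)  # → 3
```

The cutoffs never change the value computed at the root; they only skip subtrees that provably cannot affect it, which is the sense in which the entry says the technique "reduces the number of nodes".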
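Gradient descent, listed above without commentary, reduces to a one-line update rule; a minimal sketch on an assumed quadratic objective:

```python
def gradient_descent(grad, x0, lr=0.1, steps=200):
    """Minimize a differentiable function by repeatedly stepping
    a fixed learning rate against its gradient."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Example objective: f(x) = (x - 3)^2, with gradient 2*(x - 3);
# the minimum is at x = 3
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

For this quadratic the iteration contracts the error by a constant factor per step, so it converges linearly; the fixed learning rate is the example's assumption, not a general recommendation.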
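The dynamic programming and subset sum entries meet in one classic example: deciding subset sum by tabulating reachable totals, which exploits exactly the overlapping subproblems the list mentions (the input numbers below are illustrative):

```python
def subset_sum(nums, target):
    """Decide whether some subset of nums sums to target, using the
    dynamic-programming recurrence over the set of reachable sums."""
    reachable = {0}  # the empty subset sums to 0
    for n in nums:
        # each number either joins an existing subset or is skipped
        reachable |= {s + n for s in reachable}
    return target in reachable

hit = subset_sum([3, 34, 4, 12, 5, 2], 9)  # True: 4 + 5 = 9
```

Storing reachable sums as a set rather than a boolean table is a space-saving variant of the same recurrence; the worst-case work is still pseudo-polynomial in the target.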