===Machine learning and statistical classification===
{{main|List of machine learning algorithms}}
{{further|Machine learning|Statistical classification}}
* [[Almeida–Pineda recurrent backpropagation]]: adjust a matrix of synaptic weights to generate desired outputs given its inputs
* [[ALOPEX]]: a correlation-based [[Machine learning|machine-learning algorithm]]
* [[Association rule learning]]: discover interesting relations between variables, used in [[data mining]]
** [[Apriori algorithm]]
** [[Eclat algorithm]]
** [[Association rule learning#FP-growth algorithm|FP-growth algorithm]]
** [[One-attribute rule]]
** [[Association rule learning#Zero-attribute rule|Zero-attribute rule]]
* [[Boosting (meta-algorithm)]]: use many weak learners to boost effectiveness
** [[AdaBoost]]: adaptive boosting
** [[BrownBoost]]: a boosting algorithm that may be robust to noisy datasets
** [[LogitBoost]]: [[logistic regression]] boosting
** [[LPBoost]]: [[linear programming]] boosting
* [[Bootstrap aggregating]] (bagging): technique to improve stability and classification accuracy
* [[Computer Vision]]
** [[Grabcut]]: based on [[Graph cuts in computer vision|graph cuts]]
* [[Decision tree learning|Decision Trees]]
** [[C4.5 algorithm]]: an extension to ID3
** [[ID3 algorithm]] (Iterative Dichotomiser 3): use heuristics to generate small decision trees
* [[Cluster analysis|Clustering]]: a class of [[unsupervised learning]] algorithms for grouping and bucketing related input vectors
* [[k-nearest neighbors]] (k-NN): a non-parametric method for classifying objects based on the closest training examples in the [[feature space]] (see the sketch after this list)
* [[Linde–Buzo–Gray algorithm]]: a vector quantization algorithm used to derive a good codebook
* [[Locality-sensitive hashing]] (LSH): a method of performing probabilistic dimension reduction of high-dimensional data
* [[Artificial neural network|Neural Network]]
** [[Backpropagation]]: a [[supervised learning]] method which requires a teacher that knows, or can calculate, the desired output for any given input
** [[Hopfield net]]: a [[recurrent neural network]] in which all connections are symmetric
** [[Perceptron]]: the simplest kind of feedforward neural network: a [[linear classifier]] (see the sketch after this list)
** [[Pulse-coupled neural networks]] (PCNN): [[Artificial neural network|neural models]] proposed by modeling a cat's [[visual cortex]] and developed for high-performance [[Bionics|biomimetic]] image processing
** [[Radial basis function network]]: an artificial neural network that uses radial [[basis function]]s as activation functions
** [[Self-organizing map]]: an unsupervised network that produces a low-dimensional representation of the input space of the training samples
* [[Random forest]]: classify using many decision trees
* [[Reinforcement learning]]:
** [[Q-learning]]: learns an action-value function that gives the expected utility of taking a given action in a given state and following a fixed policy thereafter
** [[State–action–reward–state–action|State–Action–Reward–State–Action]] (SARSA): learn a [[Markov decision process]] policy
** [[Temporal difference learning]]
* [[relevance vector machine|Relevance-Vector Machine]] (RVM): similar to SVM, but provides probabilistic classification
* [[Supervised learning]]: learning from examples (a labelled data set split into a training set and a test set)
* [[Support vector machine|Support Vector Machine]] (SVM): a set of methods which divide multidimensional data by finding a separating hyperplane with the maximum margin between the two classes
** [[Structured SVM]]: allows training of a classifier for general structured output labels
* [[Winnow algorithm]]: related to the perceptron, but uses a [[Multiplicative Weight Update Method|multiplicative weight-update scheme]]
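
To illustrate two of the entries above, here are minimal sketches in Python. The function names, parameters, and defaults are arbitrary choices made for this example, not part of any standard library API.

A sketch of the perceptron learning rule, which trains a [[linear classifier]] by updating the weights only on misclassified examples:

<syntaxhighlight lang="python">
import numpy as np

def train_perceptron(X, y, epochs=10, lr=1.0):
    """Perceptron sketch: learn a linear classifier sign(w . x + b).

    X: (n_samples, n_features) array of inputs.
    y: labels in {-1, +1}.
    """
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            # Perceptron rule: update only when xi is misclassified.
            if yi * (np.dot(w, xi) + b) <= 0:
                w += lr * yi * xi
                b += lr * yi
    return w, b

def perceptron_predict(w, b, X):
    # Classify each row of X by the sign of its linear score.
    return np.sign(X @ w + b)
</syntaxhighlight>

A sketch of k-nearest neighbors classification, which assigns a query point the majority label among its k closest training examples; the Euclidean distance and the default k are assumptions of this example:

<syntaxhighlight lang="python">
from collections import Counter
import numpy as np

def knn_classify(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training examples."""
    # Euclidean distance from x to every training example.
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    votes = Counter(y_train[i] for i in nearest)
    return votes.most_common(1)[0][0]
</syntaxhighlight>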