==Algorithms==
Algorithms for pattern recognition depend on the type of label output, on whether learning is supervised or unsupervised, and on whether the algorithm is statistical or non-statistical in nature. Statistical algorithms can further be categorized as [[generative model|generative]] or [[discriminative model|discriminative]].

===Classification methods (methods predicting categorical labels)===
{{Main|Statistical classification}}
Parametric:<ref>Assuming known distributional shape of feature distributions per class, such as the [[Gaussian distribution|Gaussian]] shape.</ref>
*[[Linear discriminant analysis]]
*[[Quadratic classifier|Quadratic discriminant analysis]]
*[[Maximum entropy classifier]] (also known as [[logistic regression]], [[multinomial logistic regression]]): Despite its name, logistic regression is an algorithm for classification, not regression. The name comes from the fact that logistic regression uses an extension of a linear regression model to model the probability of an input belonging to a particular class (see the illustrative formula at the end of this section).

Nonparametric:<ref>No distributional assumption regarding shape of feature distributions per class.</ref>
*[[Decision tree]]s, [[decision list]]s
*[[Variable kernel density estimation#Use for statistical classification|Kernel estimation]] and [[K-nearest-neighbor]] algorithms
*[[Naive Bayes classifier]]
*[[Artificial neural network|Neural networks]] (multi-layer perceptrons)
*[[Perceptron]]s
*[[Support vector machine]]s
*[[Gene expression programming]]

===Clustering methods (unsupervised methods for predicting categorical labels)===
{{Main|Cluster analysis}}
*Categorical [[mixture model]]s
*[[Hierarchical clustering]] (agglomerative or divisive)
*[[K-means clustering]]
*[[Correlation clustering]]<!-- not an algorithm -->
*[[Kernel principal component analysis]] (Kernel PCA)<!-- but not PCA? -->

===Ensemble learning algorithms (supervised meta-algorithms for combining multiple learning algorithms together)===
{{Main|Ensemble learning}}
*[[Boosting (meta-algorithm)]]
*[[Bootstrap aggregating]] ("bagging")
*[[Ensemble averaging]]
*[[Mixture of experts]], [[hierarchical mixture of experts]]

===General methods for predicting arbitrarily structured (sets of) labels===
*[[Bayesian network]]s
*[[Markov random field]]s

===Multilinear subspace learning algorithms (predicting labels of multidimensional data using tensor representations)===
Unsupervised:
*[[Multilinear principal component analysis]] (MPCA)

===Real-valued sequence labeling methods (predicting sequences of real-valued labels)===
{{Main|Sequence labeling}}
*[[Kalman filter]]s
*[[Particle filter]]s

===Regression methods (predicting real-valued labels)===
{{Main|Regression analysis}}
*[[Gaussian process regression]] (kriging)
*[[Linear regression]] and extensions
*[[Independent component analysis]] (ICA)
*[[Principal components analysis]] (PCA)

===Sequence labeling methods (predicting sequences of categorical labels)===
*[[Conditional random field]]s (CRFs)
*[[Hidden Markov model]]s (HMMs)
*[[Maximum entropy Markov model]]s (MEMMs)
*[[Recurrent neural networks]] (RNNs)
*[[Dynamic time warping]] (DTW)

{{cleanup list|date=May 2014}}
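To make the note on logistic regression above concrete, the following is a minimal sketch of the model it refers to (the notation <math>\mathbf{w}</math>, <math>b</math> is illustrative, not taken from the source): a linear function of the feature vector <math>\mathbf{x}</math> is passed through the logistic (sigmoid) function, so the output can be read as a class probability rather than an unbounded real value:

:<math>P(Y = 1 \mid \mathbf{x}) = \sigma\!\left(\mathbf{w}^\mathsf{T}\mathbf{x} + b\right) = \frac{1}{1 + e^{-(\mathbf{w}^\mathsf{T}\mathbf{x} + b)}}</math>

Classification then assigns the label <math>Y = 1</math> when this probability exceeds a chosen threshold (commonly 0.5), which is why the method acts as a classifier despite the "regression" in its name.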