==Feature selection embedded in learning algorithms==
Some learning algorithms perform feature selection as part of their overall operation. These include:
* {{tmath|l_1}}-regularization techniques, such as sparse regression, LASSO, and the {{tmath|l_1}}-SVM
* Regularized trees,<ref name="DengRunger2012" /> e.g. the regularized random forest implemented in the RRF package<ref name="RRF" />
* [[Decision tree learning|Decision trees]]<ref>R. Kohavi and G. John, "[https://ai.stanford.edu/~ronnyk/wrappersPrint.pdf Wrappers for feature subset selection]", ''[[Artificial Intelligence (journal)|Artificial Intelligence]]'' 97.1–2 (1997): 273–324</ref>
* [[Memetic algorithm]]s
* [[Random multinomial logit]] (RMNL)
* [[Autoencoder|Auto-encoding]] networks with a bottleneck layer
* [[Submodular set function|Submodular]] feature selection<ref>{{cite arXiv|eprint=1102.3975|last1=Das|first1=Abhimanyu|title=Submodular meets Spectral: Greedy Algorithms for Subset Selection, Sparse Approximation and Dictionary Selection|last2=Kempe|first2=David|class=stat.ML|year=2011}}</ref><ref>Liu et al., [http://melodi.ee.washington.edu/~bilmes/mypubs/liu-submodfeature2013-icassp.pdf Submodular feature selection for high-dimensional acoustic score spaces] {{Webarchive|url=https://web.archive.org/web/20151017122628/http://melodi.ee.washington.edu/~bilmes/mypubs/liu-submodfeature2013-icassp.pdf |date=2015-10-17 }}</ref><ref>Zheng et al., [http://papers.nips.cc/paper/5565-deep-convolutional-neural-network-for-image-deconvolution Submodular Attribute Selection for Action Recognition in Video] {{Webarchive|url=https://web.archive.org/web/20151118001059/http://papers.nips.cc/paper/5565-deep-convolutional-neural-network-for-image-deconvolution |date=2015-11-18 }}</ref>
* Local-learning-based feature selection.<ref>{{cite journal | last1 = Sun | first1 = Y. | last2 = Todorovic | first2 = S. | last3 = Goodison | first3 = S. | year = 2010 |title=Local-Learning-Based Feature Selection for High-Dimensional Data Analysis | journal = [[IEEE Transactions on Pattern Analysis and Machine Intelligence]] | volume = 32 | issue = 9| pages = 1610–1626 | doi = 10.1109/tpami.2009.190 | pmc = 3445441 | pmid = 20634556 }}</ref> Compared with traditional methods, it involves no heuristic search, easily handles multi-class problems, and works for both linear and nonlinear problems. It is also supported by a strong theoretical foundation. Numerical experiments showed that the method can reach a close-to-optimal solution even when the data contain more than a million irrelevant features.
* Feature selection in recommender systems.<ref>D.H. Wang, Y.C. Liang, D. Xu, X.Y. Feng, R.C. Guan (2018), "[https://www.sciencedirect.com/science/article/pii/S0950705118302107 A content-based recommender system for computer science publications]", ''[[Knowledge-Based Systems]]'', 157: 1–9</ref> Feature selection methods have been introduced into recommender-system research.
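The {{tmath|l_1}}-regularization entry above can be illustrated concretely: LASSO's {{tmath|l_1}} penalty drives the coefficients of uninformative features exactly to zero, so fitting the model and feature selection happen in one step. The following is a minimal NumPy sketch (not from any cited reference) of LASSO via cyclic coordinate descent on toy data where only the first two of five features matter; the data, the <code>alpha</code> value, and the function names are illustrative assumptions.

```python
import numpy as np

def soft_threshold(rho, alpha):
    """Proximal operator of the l1 norm: shrinks rho toward 0, clipping at 0."""
    return np.sign(rho) * max(abs(rho) - alpha, 0.0)

def lasso_coordinate_descent(X, y, alpha, n_iter=200):
    """Minimize (1/2n)||y - Xw||^2 + alpha*||w||_1 by cyclic coordinate descent.
    Coefficients driven exactly to zero correspond to deselected features."""
    n, p = X.shape
    w = np.zeros(p)
    z = (X ** 2).sum(axis=0) / n            # per-feature curvature terms
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual with feature j's current contribution added back.
            r = y - X @ w + X[:, j] * w[j]
            rho = X[:, j] @ r / n
            w[j] = soft_threshold(rho, alpha) / z[j]
    return w

# Toy data (illustrative): y depends only on features 0 and 1 out of 5.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.01 * rng.standard_normal(100)

w = lasso_coordinate_descent(X, y, alpha=0.1)
selected = np.flatnonzero(np.abs(w) > 1e-6)  # indices of surviving features
print(selected)
```

With a penalty this size relative to the noise, only the two informative features keep nonzero coefficients; the {{tmath|l_1}} penalty also shrinks the surviving coefficients slightly below their true values, which is the usual bias/sparsity trade-off of LASSO.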