=== Multiclass SVM ===
Multiclass SVM aims to assign labels to instances by using support vector machines, where the labels are drawn from a finite set of several elements. The dominant approach for doing so is to reduce the single [[multiclass problem]] into multiple [[binary classification]] problems.<ref name="duan2005">{{Cite book |last1=Duan |first1=Kai-Bo |last2=Keerthi |first2=S. Sathiya |chapter=Which Is the Best Multiclass SVM Method? An Empirical Study |doi=10.1007/11494683_28 |title=Multiple Classifier Systems |series=[[Lecture Notes in Computer Science|LNCS]] |volume=3541 |pages=278–285 |year=2005 |isbn=978-3-540-26306-7 |citeseerx=10.1.1.110.6789 |chapter-url=https://www.cs.iastate.edu/~honavar/multiclass-svm2.pdf |access-date=2019-07-18 |archive-date=2013-05-03 |archive-url=https://web.archive.org/web/20130503183745/http://www.cs.iastate.edu/~honavar/multiclass-svm2.pdf |url-status=dead }}</ref> Common methods for such reduction include:<ref name="duan2005" /><ref name="hsu2002">{{cite journal |title=A Comparison of Methods for Multiclass Support Vector Machines |year=2002 |journal=IEEE Transactions on Neural Networks |last1=Hsu |first1=Chih-Wei |last2=Lin |first2=Chih-Jen |volume=13 |issue=2 |pages=415–25 |name-list-style=amp |url=http://www.cs.iastate.edu/~honavar/multiclass-svm.pdf |pmid=18244442 |doi=10.1109/72.991427 |access-date=2018-01-08 |archive-date=2013-05-03 |archive-url=https://web.archive.org/web/20130503183743/http://www.cs.iastate.edu/~honavar/multiclass-svm.pdf |url-status=dead }}</ref>
* Building binary classifiers that distinguish between one of the labels and the rest (''one-versus-all'') or between every pair of classes (''one-versus-one''). Classification of new instances in the one-versus-all case is done by a winner-takes-all strategy, in which the classifier with the highest output function assigns the class (it is important that the output functions be calibrated to produce comparable scores). For the one-versus-one approach, classification is done by a max-wins voting strategy: every classifier assigns the instance to one of its two classes, the vote for that class is increased by one, and the class with the most votes determines the final classification. Both reductions are illustrated in the sketch following this list.
* [[Directed acyclic graph]] SVM (DAGSVM)<ref>{{cite book |chapter=Large margin DAGs for multiclass classification |editor1=Solla, Sara A. |editor1-link=Sara Solla |editor2=Leen, Todd K. |editor3=Müller, Klaus-Robert |editor3-link=Klaus-Robert Müller |title=Advances in Neural Information Processing Systems |publisher=MIT Press |year=2000 |chapter-url=http://www.wisdom.weizmann.ac.il/~bagon/CVspring07/files/DAGSVM.pdf |pages=547–553 |last1=Platt |first1=John |author-link2=Nello Cristianini |last2=Cristianini |first2=Nello |author-link3=John Shawe-Taylor |last3=Shawe-Taylor |first3=John |url-status=live |archive-url=https://web.archive.org/web/20120616221540/http://www.wisdom.weizmann.ac.il/~bagon/CVspring07/files/DAGSVM.pdf |archive-date=2012-06-16 }}</ref>
* [[Error correcting code|Error-correcting output codes]]<ref>{{cite journal |title=Solving Multiclass Learning Problems via Error-Correcting Output Codes |journal=Journal of Artificial Intelligence Research |year=1995 |url=http://www.jair.org/media/105/live-105-1426-jair.pdf |pages=263–286 |last1=Dietterich |first1=Thomas G. |last2=Bakiri |first2=Ghulum |bibcode=1995cs........1101D |arxiv=cs/9501101 |volume=2 |url-status=live |archive-url=https://web.archive.org/web/20130509061344/http://www.jair.org/media/105/live-105-1426-jair.pdf |archive-date=2013-05-09 |doi=10.1613/jair.105 |s2cid=47109072 }}</ref>
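The following is a minimal illustrative sketch of the one-versus-rest and one-versus-one reductions described above. It assumes the scikit-learn library and its bundled Iris dataset; the kernel and regularization settings are arbitrary example values rather than recommendations from the cited studies.

<syntaxhighlight lang="python">
# Minimal illustrative sketch (not from the cited papers): reducing a
# multiclass problem to binary SVMs using scikit-learn's wrappers.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)          # three-class toy dataset
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# One-versus-all (one-versus-rest): one binary SVM per class; the class
# whose decision function scores highest wins, so the per-class scores
# should be comparable (calibrated).
ovr = OneVsRestClassifier(SVC(kernel="rbf", C=1.0)).fit(X_train, y_train)

# One-versus-one: one binary SVM per pair of classes; prediction is by
# max-wins voting over all pairwise classifiers.
ovo = OneVsOneClassifier(SVC(kernel="rbf", C=1.0)).fit(X_train, y_train)

print("one-versus-rest accuracy:", ovr.score(X_test, y_test))
print("one-versus-one accuracy:", ovo.score(X_test, y_test))
</syntaxhighlight>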
Crammer and Singer proposed a multiclass SVM method which casts the [[multiclass classification]] problem into a single optimization problem, rather than decomposing it into multiple binary classification problems.<ref>{{cite journal |title=On the Algorithmic Implementation of Multiclass Kernel-based Vector Machines |year=2001 |url=http://jmlr.csail.mit.edu/papers/volume2/crammer01a/crammer01a.pdf |journal=Journal of Machine Learning Research |volume=2 |pages=265–292 |last1=Crammer |first1=Koby |last2=Singer |first2=Yoram |name-list-style=amp |url-status=live |archive-url=https://web.archive.org/web/20150829102651/http://jmlr.csail.mit.edu/papers/volume2/crammer01a/crammer01a.pdf |archive-date=2015-08-29 }}</ref> See also Lee, Lin and Wahba<ref>{{cite journal |title=Multicategory Support Vector Machines |year=2001 |journal=Computing Science and Statistics |volume=33 |url=http://www.interfacesymposia.org/I01/I2001Proceedings/YLee/YLee.pdf |last1=Lee |first1=Yoonkyung |last2=Lin |first2=Yi |last3=Wahba |first3=Grace |name-list-style=amp |url-status=usurped |archive-url=https://web.archive.org/web/20130617093314/http://www.interfacesymposia.org/I01/I2001Proceedings/YLee/YLee.pdf |archive-date=2013-06-17 }}</ref><ref>{{Cite journal |doi=10.1198/016214504000000098 |title=Multicategory Support Vector Machines |journal=Journal of the American Statistical Association |volume=99 |issue=465 |pages=67–81 |year=2004 |last1=Lee |first1=Yoonkyung |last2=Lin |first2=Yi |last3=Wahba |first3=Grace |citeseerx=10.1.1.22.1879 |s2cid=7066611 }}</ref> and Van den Burg and Groenen.<ref>{{Cite journal |title=GenSVM: A Generalized Multiclass Support Vector Machine |year=2016 |url=http://jmlr.org/papers/volume17/14-526/14-526.pdf |journal=Journal of Machine Learning Research |volume=17 |issue=224 |pages=1–42 |last1=Van den Burg |first1=Gerrit J. J. |last2=Groenen |first2=Patrick J. F. |name-list-style=amp}}</ref>
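In one common presentation of this single-optimization approach (a simplified paraphrase rather than the exact notation of the cited paper), a weight vector <math>w_k</math> is learned for every class <math>k = 1, \dots, K</math> by solving jointly
<math display="block">
\min_{w_1,\ldots,w_K,\,\xi}\; \frac{1}{2}\sum_{k=1}^{K}\lVert w_k\rVert^2 + C\sum_{i=1}^{n}\xi_i
\quad\text{subject to}\quad
w_{y_i}^{\mathsf{T}} x_i - w_k^{\mathsf{T}} x_i \ge 1 - \xi_i,\qquad \xi_i \ge 0,
</math>
for all training examples <math>(x_i, y_i)</math> and all classes <math>k \neq y_i</math>, so that the score of the correct class exceeds that of every other class by a margin, with a single slack variable per example shared across all classes.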