Improving SVM performance in multi-label domains: Threshold adjustment

Peerapon Vateekul, Sareewan Dendamrongvit, Miroslav Kubat

Research output: Contribution to journal › Article › peer-review


Abstract

In "multi-label domains," where the same example can simultaneously belong to two or more classes, it is customary to induce a separate binary classifier for each class and then use them all in parallel. As a result, some of these classifiers are induced from imbalanced training sets where one class outnumbers the other - a circumstance known to hurt some machine learning paradigms. In the case of Support Vector Machines (SVM), this suboptimal behavior is explained by the fact that SVM seeks to minimize error rate, a criterion that is misleading in domains of this type. This is why several research groups have studied mechanisms to readjust the bias of SVM's hyperplane. The best of these achieves very good classification performance at the price of impractically high computational costs. We propose here an improvement where these costs are reduced to a small fraction without significantly impairing classification.
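The general scheme the abstract describes - one binary SVM per class, with each classifier's decision threshold shifted to compensate for class imbalance - can be sketched as follows. This is a generic threshold-tuning illustration (selecting, per class, the bias shift that maximizes F1 on held-out data), not the paper's specific algorithm; the toy dataset, the use of scikit-learn's `LinearSVC`, and the validation-split setup are all assumptions made for the sketch.

```python
import numpy as np
from sklearn.datasets import make_multilabel_classification
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

# Toy multi-label data: each row of Y marks membership in several classes.
X, Y = make_multilabel_classification(n_samples=400, n_classes=5,
                                      random_state=0)
X_tr, X_val, Y_tr, Y_val = train_test_split(X, Y, test_size=0.5,
                                            random_state=0)

models, thresholds = [], []
for c in range(Y.shape[1]):
    if len(np.unique(Y_tr[:, c])) < 2:
        continue  # skip degenerate columns in this toy split
    # One binary SVM per class ("binary relevance" decomposition).
    clf = LinearSVC().fit(X_tr, Y_tr[:, c])
    scores = clf.decision_function(X_val)
    # Readjust the hyperplane bias: pick the score threshold that
    # maximizes F1 on validation data instead of the default 0,
    # which minimizes error rate and suffers under class imbalance.
    best_t, best_f1 = 0.0, -1.0
    for t in np.unique(scores):
        f1 = f1_score(Y_val[:, c], (scores >= t).astype(int),
                      zero_division=0)
        if f1 > best_f1:
            best_t, best_f1 = t, f1
    models.append(clf)
    thresholds.append(best_t)

# At prediction time, class c is assigned when
# decision_function(x) >= thresholds[c].
```

Scanning every distinct validation score per class is the expensive part the paper targets; the abstract's contribution is cutting that cost to a small fraction while keeping classification quality.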

Original language: English (US)
Article number: 1250038
Journal: International Journal on Artificial Intelligence Tools
Volume: 22
Issue number: 1
DOIs
State: Published - Mar 5 2013

Keywords

  • multi-label classification
  • support vector machines
  • threshold adjustment

ASJC Scopus subject areas

  • Artificial Intelligence

