ISSN 1000-1239 CN 11-1777/TP

Journal of Computer Research and Development ›› 2016, Vol. 53 ›› Issue (9): 1964-1970.doi: 10.7544/issn1000-1239.2016.20150436


Cost-Sensitive Large Margin Distribution Machine

Zhou Yuhang, Zhou Zhihua   

  National Key Laboratory for Novel Software Technology (Nanjing University), Nanjing 210023; Collaborative Innovation Center of Novel Software Technology and Industrialization, Nanjing 210023
  Online: 2016-09-01

Abstract: In many real-world applications, different types of misclassification incur different losses, which can be described by costs. Cost-sensitive learning tries to minimize the total cost rather than the error rate. During the past few years, many efforts have been devoted to cost-sensitive learning. The basic strategy is rescaling, which rebalances the classes so that the influence of each class is proportional to its cost; it has been realized in different ways, such as assigning different weights to training examples, resampling the training examples, and moving the decision thresholds. Moreover, researchers have integrated cost-sensitivity into specific methods and proposed embedded approaches such as CS-SVM. In this paper, we propose CS-LDM (cost-sensitive large margin distribution machine) to tackle cost-sensitive learning problems. Rather than maximizing the minimum margin as traditional support vector machines do, CS-LDM optimizes the margin distribution, and solves the optimization objective efficiently by the dual coordinate descent method, to achieve better generalization performance. Experiments on a series of data sets and cost settings exhibit the impressive performance of CS-LDM; in particular, CS-LDM achieves, on average, a 24% greater reduction in total cost than CS-SVM.
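The rescaling strategy mentioned in the abstract — weighting training examples in proportion to the cost of misclassifying their class — can be sketched as follows. This is a minimal illustration only: the toy data, the cost values (5 for a false negative, 1 for a false positive), and the use of weighted logistic regression in place of the paper's SVM-style learners are all assumptions for demonstration, not the authors' method.

```python
import numpy as np

# Hypothetical toy data: class 1 is the rare, costly class.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.0, (80, 2)),   # 80 cheap-class examples
               rng.normal(1.0, 1.0, (20, 2))])   # 20 costly-class examples
y = np.array([0] * 80 + [1] * 20)

# Rescaling: each example's weight equals the cost of misclassifying it.
# Illustrative costs: missing a positive costs 5x a false alarm.
costs = np.where(y == 1, 5.0, 1.0)

def fit_weighted_logreg(X, y, w, lr=0.1, steps=2000):
    """Gradient descent on the cost-weighted logistic loss."""
    Xb = np.hstack([X, np.ones((len(X), 1))])    # append a bias column
    theta = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ theta))    # predicted P(y = 1)
        grad = Xb.T @ (w * (p - y)) / w.sum()    # cost-weighted gradient
        theta -= lr * grad
    return theta

theta = fit_weighted_logreg(X, y, costs)
```

Because costly-class examples carry larger weights, the learned decision boundary shifts toward predicting that class more often than an unweighted fit would, which is exactly the rebalancing effect rescaling is meant to achieve.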

Key words: cost-sensitive learning, margin distribution, support vector machine (SVM), representer theorem, dual coordinate descent method
