Abstract:
Proximal support vector machine via generalized eigenvalues (GEPSVM) drops the parallelism condition imposed on the canonical planes of the traditional support vector machine (SVM) and analytically seeks two hyperplanes such that each plane is close to the samples of its own class and at the same time far from the samples of the other class. Compared with SVM, GEPSVM requires no quadratic programming and attains classification performance comparable to that of SVM. Despite these advantages, GEPSVM is a binary classifier and cannot separate multi-class datasets directly. Moreover, it is hard to set its regularization parameter theoretically, and the generalized eigen-equation problem may be ill-conditioned. In this paper, a novel method, proximal SVM based on prototypal multi-classification hyperplanes (MHPSVM), is proposed, which directly obtains multiple prototypal hyperplanes for multi-class classification. Experimental results on both artificial and benchmark datasets show that the classification performance of MHPSVM can be significantly higher than that of GEPSVM, especially in multi-class classification.
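For readers unfamiliar with the baseline, the GEPSVM plane-fitting step the abstract refers to can be sketched as follows. This is a minimal illustrative implementation, not the paper's code: it assumes the standard GEPSVM formulation in which the plane for class A minimizes a regularized Rayleigh quotient, i.e., the generalized eigenvector of (G, H) with the smallest eigenvalue, where G = [A e]ᵀ[A e] + δI and H = [B e]ᵀ[B e]; the toy data, the regularization value `delta`, and the function name `gepsvm_plane` are all choices made here for illustration.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
# Two toy 2-D point clouds: class A lies near the line y = x,
# class B lies near the line y = -x ("cross-planes" style data).
t = rng.uniform(-1, 1, 50)
A = np.column_stack([t, t + 0.05 * rng.standard_normal(50)])
t = rng.uniform(-1, 1, 50)
B = np.column_stack([t, -t + 0.05 * rng.standard_normal(50)])

def gepsvm_plane(A, B, delta=1e-4):
    """Fit a plane w.x + b = 0 close to the rows of A and far from B.

    Solves the generalized symmetric eigenproblem G z = lambda H z and
    returns the eigenvector of the smallest eigenvalue (the minimizer of
    the Rayleigh quotient z'Gz / z'Hz). The Tikhonov term delta*I is the
    regularization the abstract mentions; it also guards against an
    ill-conditioned eigen-equation.
    """
    eA = np.ones((A.shape[0], 1))
    eB = np.ones((B.shape[0], 1))
    Ae = np.hstack([A, eA])
    Be = np.hstack([B, eB])
    G = Ae.T @ Ae + delta * np.eye(Ae.shape[1])
    H = Be.T @ Be + delta * np.eye(Be.shape[1])  # delta keeps H positive definite
    vals, vecs = eigh(G, H)       # eigenvalues in ascending order
    z = vecs[:, 0]                # eigenvector of the smallest eigenvalue
    return z[:-1], z[-1]          # split augmented solution into (w, b)

w, b = gepsvm_plane(A, B)
# A test point is assigned to the class whose plane it is nearer to,
# using the normalized distance |w.x + b| / ||w||.
```

On the toy data above the recovered plane for class A should be roughly y = x, i.e., w approximately proportional to (1, -1) with b near 0; a second call with the classes swapped would give the plane for class B.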