Generalization Error Bound for the Multi-Class Classification Algorithm Based on the Analytical Center of Version Space
Abstract
The analytical center machine, which is built on the analytical center of version space, outperforms the support vector machine, especially when the version space is elongated or asymmetric. While the analytical center machine for binary classification is well understood, little is known about its multi-class counterpart. Multi-class classification remains a significant theoretical and practical challenge in machine learning. The prevailing multi-class strategy, one-versus-all, repeatedly constructs classifiers to separate a single class from all the others, which leads to a daunting computational burden and low classification efficiency. Although the multi-class support vector machine reduces to a simple quadratic optimization, it is not very effective when the version space is asymmetric or elongated. Thus, a multi-class classification approach based on the analytical center of version space, which reduces to a simple quadratically constrained linear optimization, is proposed to address these problems. To validate its generalization performance theoretically, an upper bound on its generalization error is formulated and proved. Experiments on the wine recognition and glass identification datasets show that the proposed approach outperforms the multi-class support vector machine in terms of generalization error.