Three-Step Bayesian Combination of SVM on Regularization Path
Abstract
Model combination integrates and leverages multiple models in the hypothesis space to improve the reliability and generalization performance of learning systems. In this paper, a novel three-step method for model combination of support vector machines (SVM) based on the regularization path is proposed. The Lh-risk consistency for model combination of SVM is defined and proved, which provides the mathematical foundation of the proposed method. Traditionally, the model set for model combination of SVM is constructed by data sampling methods. In our method, the model set is constructed from the SVM regularization path, which is trained on the same original training set. First, the initial model set is obtained according to the piecewise linearity of the SVM regularization path. Then, the average of the generalized approximate cross-validation (GACV) is applied to exclude models with poor performance and prune the initial model set. This pruning policy improves not only the computational efficiency of model combination but also its generalization performance. In the testing (prediction) phase, the input-sensitive combination model set is determined with the minimal neighborhood method, and Bayesian combination is performed. Compared with traditional model combination methods for SVM, the proposed method does not need to tune the regularization parameter for each individual SVM model, so the training procedure is simplified considerably. Experimental results demonstrate the effectiveness and efficiency of the three-step Bayesian combination of SVM on the regularization path.
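The following Python sketch illustrates the three-step workflow described above, under simplifying assumptions: the breakpoints of the regularization path are approximated by a log-spaced grid of C values, the average-GACV pruning rule is replaced by a validation-error proxy, and the minimal neighborhood method is stood in for by selecting the models whose decision boundaries lie closest to the test input. All function names and parameters are illustrative, not the paper's actual algorithm.

```python
# Minimal sketch of the three-step Bayesian combination, assuming scikit-learn's SVC.
# The exact path breakpoints, GACV formula, and minimal neighborhood rule from the
# paper are replaced by simple stand-ins; names here are illustrative only.
import numpy as np
from sklearn.svm import SVC

def build_model_set(X, y, n_models=20):
    """Step 1 (approximation): train SVMs along a log-spaced grid of C values,
    standing in for the breakpoints of the SVM regularization path."""
    Cs = np.logspace(-2, 2, n_models)
    return [SVC(C=C, kernel="rbf", probability=True).fit(X, y) for C in Cs]

def prune_model_set(models, X_val, y_val):
    """Step 2 (approximation): drop models whose validation error exceeds the
    average error, standing in for pruning by the average of GACV."""
    errs = np.array([1.0 - m.score(X_val, y_val) for m in models])
    return [m for m, e in zip(models, errs) if e <= errs.mean()]

def bayesian_combine(models, x, k=5):
    """Step 3 (approximation): for input x, keep the k models whose decision
    boundary is closest to x (a stand-in for the minimal neighborhood method)
    and average their posterior class probabilities."""
    x = x.reshape(1, -1)
    margins = np.abs([m.decision_function(x)[0] for m in models])
    nearest = np.argsort(margins)[:k]
    probs = np.mean([models[i].predict_proba(x)[0] for i in nearest], axis=0)
    return models[nearest[0]].classes_[probs.argmax()], probs

if __name__ == "__main__":
    # Toy demonstration on synthetic data; the training set doubles as the
    # validation set here purely to keep the example short.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    X, y = make_classification(n_samples=300, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    models = prune_model_set(build_model_set(X_tr, y_tr), X_tr, y_tr)
    label, posterior = bayesian_combine(models, X_te[0])
    print(label, posterior)
```

Because the pruned model set is fixed after training, only step 3 runs per test point; its cost grows with the number of retained models, which is why the pruning step matters for efficiency as well as accuracy.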