Abstract:
In support vector machine classification with the Gaussian kernel, the kernel width defines the scale of generalization in the pattern space or in the feature space. However, a Gaussian kernel with a constant width cannot adapt well everywhere in the pattern space because the patterns are not evenly distributed: over-fitting tends to occur in dense regions and under-fitting in sparse regions. To reduce these local risks, a secondary kernel with a global character is introduced to complement the Gaussian kernel, which is regarded as the primary kernel. The resulting hybrid kernel is called the primary-secondary kernel (PSK). The positive definiteness of the PSK under the given constraints is proved by means of a power series expansion. For support vector machines with the PSK, a two-stage model selection procedure based on genetic algorithms is proposed to tune the model parameters: the algorithm first tunes the parameters associated with the Gaussian kernel; these parameters are then held fixed while the parameters of the secondary kernel are tuned further. The two-stage procedure aims to overcome the optimization tendency embodied in the optimization algorithms, which often causes model selection to fail for support vector machines with multiple parameters. Finally, experiments demonstrate that the PSK outperforms the Gaussian kernel and also validate the efficiency of the proposed model selection algorithms.
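As an illustrative sketch only (not the paper's exact formulation), the hybrid primary-secondary idea can be prototyped with a callable kernel in scikit-learn. Here the secondary kernel is assumed to be a polynomial term with a global character, and the mixing weight, width, and degree values are hypothetical; since a sum of positive definite kernels is positive definite, the combination below is a valid kernel.

```python
# Sketch of a "primary + secondary" hybrid kernel for SVM classification.
# The exact PSK form, its constraints, and the GA-based two-stage tuning are
# described in the paper; the secondary kernel here is assumed polynomial and
# all parameter values are hypothetical.
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel

def psk(gamma=1.0, weight=0.1, degree=2, coef0=1.0):
    """Return a callable Gram-matrix kernel: a Gaussian (primary) term plus a
    globally acting polynomial (secondary) term mixed by `weight`."""
    def kernel(X, Y):
        primary = rbf_kernel(X, Y, gamma=gamma)              # local, width-controlled
        secondary = polynomial_kernel(X, Y, degree=degree,   # global character
                                      gamma=1.0, coef0=coef0)
        return primary + weight * secondary
    return kernel

# Stage 1 (conceptually): tune C and gamma with the pure Gaussian kernel.
# Stage 2: keep them fixed and tune only the secondary-kernel parameters
# (here `weight`, `degree`, `coef0`), e.g. with a genetic algorithm.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] ** 2 + X[:, 1] > 0.5).astype(int)
    clf = SVC(C=1.0, kernel=psk(gamma=0.5, weight=0.05))
    clf.fit(X, y)
    print("training accuracy:", clf.score(X, y))
```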