    Citation: Hua Xiaopeng, Ding Shifei. Locality Preserving Twin Support Vector Machines[J]. Journal of Computer Research and Development, 2014, 51(3): 590-597.


    Locality Preserving Twin Support Vector Machines


      Abstract: For classification problems, the support vector machine (SVM) achieves state-of-the-art performance in many real applications. A guarantee of its superior performance comes from maximizing the between-class margin. However, the SVM solution does not take the class distribution into consideration and may yield a non-robust solution. Recently, the multiple surface support vector machine (MSSVM), an extension of the traditional SVM, has become one of the hot research topics in pattern recognition. Unfortunately, many existing MSSVM classification algorithms do not fully consider the underlying local geometric structure of the training samples or the discriminant information it carries. Therefore, this paper presents a locality preserving twin support vector machine (LPTSVM) by introducing the basic ideas of locality preserving projections (LPP) into the MSSVM. The method inherits MSSVM's ability to handle the XOR problem, fully considers the local geometric structure between samples, and exploits the local underlying discriminant information, thereby improving classification accuracy to a certain extent. The linear case, the small-sample-size case, and the nonlinear case of LPTSVM are discussed in this paper. The LPTSVM optimization problem in the small-sample-size case is solved by dimensionality reduction through principal component analysis (PCA), which overcomes the singularity problem, and the nonlinear case is transformed into an equivalent linear LPTSVM problem by the empirical kernel mapping (EKM) method. Experimental results on artificial and real datasets indicate the effectiveness and good generalization performance of the LPTSVM method.
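The pipeline the abstract describes (an LPP-style neighborhood graph whose Laplacian regularizes a pair of nonparallel hyperplanes) can be sketched as follows. This is a minimal illustration, not the authors' formulation: it substitutes a least-squares twin-SVM surrogate with a closed-form solution for the paper's quadratic programs, adds the graph-Laplacian locality penalty as a regularizer, and omits the PCA and empirical-kernel-mapping steps. All function names and parameters (`c1`, `c2`, `k`, `t`) are hypothetical.

```python
import numpy as np

def knn_laplacian(X, k=5, t=1.0):
    """LPP-style heat-kernel affinity on a symmetrized k-NN graph;
    returns the unnormalized graph Laplacian L = D - W."""
    n = X.shape[0]
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:k + 1]          # k nearest neighbors, skipping self
        W[i, idx] = np.exp(-d2[i, idx] / t)       # heat-kernel weights
    W = np.maximum(W, W.T)                        # symmetrize the graph
    return np.diag(W.sum(axis=1)) - W

def fit_lp_lstsvm(A, B, c1=1.0, c2=0.1, k=5, eps=1e-8):
    """Fit one nonparallel hyperplane (w, b): close to class A, pushed away
    from class B, with a Laplacian locality penalty over all samples.
    Least-squares simplification of the twin-SVM problem; closed-form solve."""
    X = np.vstack([A, B])
    L = knn_laplacian(X, k=k)
    G = np.hstack([A, np.ones((A.shape[0], 1))])  # augmented class-A matrix [A  e]
    F = np.hstack([B, np.ones((B.shape[0], 1))])  # augmented class-B matrix [B  e]
    H = np.hstack([X, np.ones((X.shape[0], 1))])  # all samples, for the locality term
    # Minimize ||Gz||^2 + c1*||Fz + e||^2 + c2*z'H'LHz; eps*I guards singularity
    # (the paper instead handles singularity via a PCA step, not shown here).
    M = G.T @ G + c1 * F.T @ F + c2 * H.T @ L @ H + eps * np.eye(G.shape[1])
    z = np.linalg.solve(M, -c1 * F.T @ np.ones(F.shape[0]))
    return z[:-1], z[-1]                          # (w, b)

def predict(x, planes):
    """Assign x to the class whose hyperplane it lies nearest to."""
    d = [abs(x @ w + b) / np.linalg.norm(w) for w, b in planes]
    return int(np.argmin(d))
```

Two hyperplanes are fit (one per class, swapping the roles of `A` and `B`), and a test point is labeled by the nearer plane; the `c2` term is what injects the local geometric structure that a plain twin SVM ignores.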
