• China Premium Science and Technology Journal
  • CCF-Recommended Class A Chinese Journal
  • T1-Class High-Quality Science and Technology Journal in Computing
Han Yanjun and Wang Jue. A Bi-Sparse Relational Learning Algorithm Based on Multiple Kernel Learning[J]. Journal of Computer Research and Development, 2010, 47(8): 1400-1406.

A Bi-Sparse Relational Learning Algorithm Based on Multiple Kernel Learning

More Information
  • Published Date: August 14, 2010
  • Abstract: Relational learning is becoming a focus of machine learning research. In relational learning, samples cannot be represented as vectors in R^n. This distinguishes it from other machine learning tasks: relational learning cannot exploit the geometric structure of R^n and is therefore much harder to solve. In this paper a multiple kernel learning approach to relational learning is proposed. First, it is proved that for multiple kernel learning with kernels induced by logical rules, it suffices to use the linear kernel. Based on this result, the proposed approach iteratively constructs rules with a modified FOIL algorithm and performs the corresponding multiple kernel optimization, thereby realizing an additive model on the feature space induced by the obtained rules. The algorithm is characterized by its "bi-sparsity": support rules and support vectors are obtained simultaneously. Moreover, it is proved that multiple kernel learning in the rule-induced feature space is equivalent to the squared l1-norm SVM. The proposed algorithm is evaluated on six real-world datasets from bioinformatics and chemoinformatics. Experimental results demonstrate that the approach achieves better prediction accuracy than previous approaches, while the output classifier has a straightforward interpretation and relies on a smaller number of rules.
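The bi-sparsity idea described in the abstract can be illustrated with a minimal sketch: each sample is represented only by which logical rules it satisfies (a 0/1 rule-satisfaction matrix), and an l1-regularized linear SVM over those rule features drives most rule weights to zero, so only a few "support rules" survive. The rule features, data, and training loop below are invented for illustration; this is not the paper's actual algorithm (which constructs rules iteratively with a modified FOIL procedure), only a sketch of the l1-sparsity effect it relies on.

```python
import numpy as np

# Hypothetical setup: 200 samples, 10 candidate logical rules.
# X[i, j] = 1 iff sample i satisfies rule j; labels depend on rule 0 only.
rng = np.random.default_rng(0)
n, n_rules = 200, 10
X = rng.integers(0, 2, size=(n, n_rules)).astype(float)
y = np.where(X[:, 0] > 0, 1.0, -1.0)

# l1-regularized hinge loss, minimized by proximal subgradient descent.
w, b = np.zeros(n_rules), 0.0
lam, lr = 0.05, 0.1
for _ in range(2000):
    margins = y * (X @ w + b)
    active = margins < 1                    # samples violating the margin
    w -= lr * (-(y[active] @ X[active]) / n)
    b -= lr * (-y[active].sum() / n)
    # Soft-thresholding: the proximal step for the l1 penalty,
    # which zeroes out weights of uninformative rules.
    w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0)

support_rules = np.flatnonzero(np.abs(w) > 1e-6)  # the few rules kept
acc = np.mean(np.sign(X @ w + b) == y)
```

Because rule 0 alone determines the labels, the l1 penalty shrinks the remaining rule weights to zero, leaving a classifier that is both accurate and interpretable through a small rule set, which mirrors the "support rules" half of the bi-sparsity property.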
