    Li Feng, Miao Duoqian, Zhang Zhifei, Zhang Wei. Mutual Information Based Granular Feature Weighted k-Nearest Neighbors Algorithm for Multi-Label Learning[J]. Journal of Computer Research and Development, 2017, 54(5): 1024-1035. DOI: 10.7544/issn1000-1239.2017.20160351

    Mutual Information Based Granular Feature Weighted k-Nearest Neighbors Algorithm for Multi-Label Learning

Abstract: In traditional kNN-based multi-label learning algorithms, all features contribute equally to the distance between any pair of instances when the nearest neighbors are sought. Furthermore, most of these algorithms transform the multi-label problem into a set of single-label binary problems and thereby ignore label correlations. The performance of a multi-label learning algorithm depends strongly on the input features, and different features carry different amounts of knowledge about the label classification, so features should be assigned different importance. Mutual information is a widely used measure of the dependency between variables and can evaluate how much knowledge a feature contains about the label classification. We therefore propose a mutual information based granular feature weighted k-nearest neighbors algorithm for multi-label learning, which assigns feature weights according to the knowledge each feature carries. The algorithm first granulates the label space into several label information granules to avoid the label combination explosion problem, and then computes feature weights for each label information granule; because the weights are computed over label combinations within a granule, label correlations are merged into the feature weights. Experimental results show that the proposed algorithm achieves better performance than other common multi-label learning algorithms.
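The sketch below illustrates the two steps described in the abstract, not the authors' reference implementation: features are weighted per label information granule by their mutual information with that granule's label combinations, and a weighted kNN vote is run per granule. The granulation itself is assumed to be given (`label_granules`), and the choices of MI estimator (scikit-learn's `mutual_info_classif`), `k`, and the voting threshold are illustrative assumptions.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.neighbors import NearestNeighbors


def granule_feature_weights(X, Y, granule, random_state=0):
    """Weight each feature by its mutual information with the joint label
    combination of the labels inside one granule."""
    # Encode each distinct label combination within the granule as one class id.
    _, y_combo = np.unique(Y[:, granule], axis=0, return_inverse=True)
    mi = mutual_info_classif(X, y_combo, random_state=random_state)
    total = mi.sum()
    return mi / total if total > 0 else np.full(X.shape[1], 1.0 / X.shape[1])


def fit_predict(X_train, Y_train, X_test, label_granules, k=10, threshold=0.5):
    """Predict a binary label matrix for X_test using one weighted kNN per granule."""
    Y_pred = np.zeros((X_test.shape[0], Y_train.shape[1]), dtype=int)
    for granule in label_granules:
        w = granule_feature_weights(X_train, Y_train, granule)
        # Scaling features by sqrt(w) makes plain Euclidean distance equal to the
        # weighted Euclidean distance with per-feature weights w.
        scale = np.sqrt(w)
        nn = NearestNeighbors(n_neighbors=k).fit(X_train * scale)
        _, idx = nn.kneighbors(X_test * scale)
        # Simple neighbor vote per label in the granule; the paper's posterior
        # estimation may differ, this only illustrates the weighting idea.
        votes = Y_train[:, granule][idx].mean(axis=1)
        Y_pred[:, granule] = (votes >= threshold).astype(int)
    return Y_pred


# Illustrative usage with random data and a hand-picked granulation:
# X_tr, Y_tr = np.random.rand(200, 20), (np.random.rand(200, 5) > 0.7).astype(int)
# X_te = np.random.rand(30, 20)
# print(fit_predict(X_tr, Y_tr, X_te, label_granules=[[0, 1], [2, 3, 4]], k=10))
```

Because each granule's weights are computed against the joint assignment of its labels rather than each label in isolation, features that are informative about a group of correlated labels receive larger weight in that granule's distance metric, which is how label correlations enter the neighbor search.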