    Zhang Minling. An Improved Multi-Label Lazy Learning Approach[J]. Journal of Computer Research and Development, 2012, 49(11): 2271-2282.

    An Improved Multi-Label Lazy Learning Approach

    Abstract: Multi-label learning deals with the problem where each example is represented by a single instance while being associated with multiple class labels. A number of multi-label learning approaches have been proposed recently, among which multi-label lazy learning methods have been shown to yield good generalization ability. However, existing lazy-style multi-label learning algorithms do not address the correlations between the different labels of each example, so their performance may be negatively affected. In this paper, an improved multi-label lazy learning approach named IMLLA is proposed. Given a test example, IMLLA first identifies its neighboring instances in the training set for each possible class. A label counting vector is then generated from those neighboring instances and fed to the trained linear classifiers. In this way, information embedded in the other classes is involved in predicting the label of each class, so that the inter-label relationships of each example are appropriately addressed. Experiments are conducted on several synthetic data sets and on two benchmark real-world data sets concerning natural scene classification and yeast gene functional analysis. The experimental results show that IMLLA is superior to other well-established multi-label learning algorithms, including a state-of-the-art lazy-style multi-label learner.
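    The prediction pipeline described in the abstract (find neighbors, build a label counting vector, feed it to per-label linear classifiers) can be illustrated with a short sketch. The code below is a simplified approximation rather than the paper's exact algorithm: it uses a single global k-nearest-neighbor search instead of per-class neighbor identification, and the class name IMLLASketch, the default value of k, and the use of scikit-learn's LinearSVC as the linear classifier are illustrative assumptions.

    import numpy as np
    from sklearn.neighbors import NearestNeighbors
    from sklearn.svm import LinearSVC

    class IMLLASketch:
        """Simplified multi-label lazy learner: k-NN label counts fed to per-label linear classifiers."""

        def __init__(self, k=10):
            self.k = k  # number of neighbors to consult (illustrative default)

        def fit(self, X, Y):
            # X: (n_samples, n_features) feature matrix
            # Y: (n_samples, n_labels) binary label indicator matrix
            self.Y_ = np.asarray(Y)
            self.nn_ = NearestNeighbors(n_neighbors=self.k).fit(X)
            counts = self._label_counts(X)  # neighborhood label statistics for the training data
            # One linear classifier per label, trained on the counting vectors, so the
            # decision for each label can draw on neighborhood evidence about every other label.
            # (Assumes every label has both positive and negative training examples.)
            self.clfs_ = [LinearSVC().fit(counts, self.Y_[:, j])
                          for j in range(self.Y_.shape[1])]
            return self

        def _label_counts(self, X):
            # For each query, count how many of its k nearest training neighbors carry
            # each label -- the "label counting vector" mentioned in the abstract.
            idx = self.nn_.kneighbors(X, return_distance=False)
            return np.array([self.Y_[rows].sum(axis=0) for rows in idx])

        def predict(self, X):
            counts = self._label_counts(X)
            return np.column_stack([clf.predict(counts) for clf in self.clfs_])

    A typical call would be model = IMLLASketch(k=5).fit(X_train, Y_train) followed by model.predict(X_test); the point of the sketch is only to show how evidence from all labels enters each label's linear decision, which is the inter-label mechanism the abstract emphasizes.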
