ISSN 1000-1239 CN 11-1777/TP

Journal of Computer Research and Development ›› 2018, Vol. 55 ›› Issue (8): 1751-1759. DOI: 10.7544/issn1000-1239.2018.20180362

Special Issue: 2018 Advances in Data Mining (Special Topic)


Exploiting Label Relationships in Multi-Label Classification with Neural Networks

Song Pan, Jing Liping   

  1. (Beijing Key Laboratory of Traffic Data Analysis and Mining (Beijing Jiaotong University), Beijing 100044)
  • Online: 2018-08-01

Abstract: Multi-label learning is critical in many real-world application domains, including text classification, image annotation, video semantic annotation, and gene function analysis. Recently, multi-label learning has attracted intensive attention and become a hot research topic in the machine learning community. However, existing methods do not adequately address two key challenges: exploiting correlations between labels, and compensating for scarce labeled data or even missing labels. To handle these two challenges efficiently, a neural-network model, NN_AD_Omega, is proposed for exploring label dependencies. NN_AD_Omega introduces an Omega matrix in the top layer of the neural network to characterize the label dependencies. As a useful by-product, the learnt label correlations can improve prediction performance when some of an instance's labels are missing, because they capture the intrinsic structure among the data. To train the model efficiently, the optimization problem is solved with mini-batch gradient descent (Mini-batch-GD), and the AdaGrad technique is adopted to adaptively set the learning rate. Experiments on four real multi-label datasets demonstrate that the proposed method can exploit label correlations and handle missing labels, and that it obtains better label prediction results than state-of-the-art neural-network-based multi-label learning methods.
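The abstract describes the core mechanism only at a high level: a label-correlation matrix Omega placed in the top layer of a neural network, trained with mini-batch gradient descent and AdaGrad. The sketch below is an illustrative NumPy reconstruction of that idea, not the paper's actual implementation: the network depth, the squared-error loss, the tanh/sigmoid activations, and all dimensions are assumptions made here for the sake of a runnable example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy multi-label data (synthetic, illustrative only):
# n instances, d features, h hidden units, q labels
n, d, h, q = 200, 10, 16, 4
X = rng.normal(size=(n, d))
true_W = rng.normal(size=(d, q))
Y = (X @ true_W + rng.normal(scale=0.1, size=(n, q)) > 0).astype(float)

# Parameters: one hidden layer, plus a q-by-q label-correlation matrix Omega
W1 = rng.normal(scale=0.1, size=(d, h)); b1 = np.zeros(h)
W2 = rng.normal(scale=0.1, size=(h, q)); b2 = np.zeros(q)
Omega = np.eye(q)  # identity at start: no label correlations assumed yet

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

params = [W1, b1, W2, b2, Omega]
accum = [np.zeros_like(p) for p in params]  # AdaGrad gradient accumulators
eta, eps, batch = 0.1, 1e-8, 32

for epoch in range(50):
    idx = rng.permutation(n)
    for start in range(0, n, batch):
        sel = idx[start:start + batch]
        xb, yb = X[sel], Y[sel]
        # Forward pass: hidden layer -> per-label scores -> mix through Omega
        hact = np.tanh(xb @ W1 + b1)
        s = hact @ W2 + b2           # raw per-label scores
        yhat = sigmoid(s @ Omega)    # Omega couples the label outputs
        # Backward pass (squared-error loss, for simplicity)
        m = len(sel)
        dz = (yhat - yb) * yhat * (1 - yhat) / m
        gOmega = s.T @ dz
        ds = dz @ Omega.T
        gW2 = hact.T @ ds; gb2 = ds.sum(0)
        dh = (ds @ W2.T) * (1 - hact ** 2)
        gW1 = xb.T @ dh; gb1 = dh.sum(0)
        # AdaGrad update: per-parameter adaptive step size
        for p, a, g in zip(params, accum, [gW1, gb1, gW2, gb2, gOmega]):
            a += g * g
            p -= eta * g / (np.sqrt(a) + eps)

pred = sigmoid((np.tanh(X @ W1 + b1) @ W2 + b2) @ Omega) > 0.5
acc = (pred == Y).mean()
print(f"training label-wise accuracy: {acc:.2f}")
```

Because Omega multiplies the score vector before the output nonlinearity, its off-diagonal entries let one label's evidence raise or lower the predicted probability of another, which is how the learned correlations can compensate when some of an instance's labels are missing.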

Key words: neural network, label relationships, multi-label learning, classification, missing labels
