Regularized Semi-Supervised Multi-Label Learning
Abstract
Multi-label learning deals with examples that are associated with multiple class labels simultaneously. Previous multi-label studies usually assume that large amounts of labeled training examples are available to obtain good performance. However, in many real-world applications labeled examples are scarce, while large amounts of unlabeled examples are readily available. In order to exploit the abundant unlabeled examples to improve generalization performance, we propose a novel regularized inductive semi-supervised multi-label method named MASS. Specifically, in addition to minimizing the empirical risk, MASS employs two regularizers to constrain the final decision function. One characterizes the classifier's complexity while taking label relatedness into account, and the other requires that similar examples share similar structural multi-label outputs. This leads to a large-scale convex optimization problem, and an efficient alternating optimization algorithm is provided that attains the global optimal solution at a superlinear convergence rate owing to the strong convexity of the objective function. Comprehensive experimental results on two real-world tasks, i.e., webpage categorization and gene functional analysis, with varied numbers of labeled examples demonstrate the effectiveness of the proposed method.
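To illustrate the overall structure described above (empirical risk plus two regularizers), the following is a minimal sketch, assuming a linear model W, squared loss, a label-relatedness matrix C, and a graph Laplacian L built over labeled and unlabeled examples. All names and modeling choices here are illustrative assumptions, not the paper's exact formulation, and plain gradient descent stands in for the paper's alternating optimization algorithm.

```python
import numpy as np

def mass_style_objective(W, X_l, Y_l, X_all, L, C, lam1=1.0, lam2=1.0):
    # Empirical risk on labeled data (squared loss is an illustrative choice).
    risk = np.sum((X_l @ W - Y_l) ** 2)
    # Complexity regularizer coupling related labels through C (q x q).
    complexity = np.trace(W @ C @ W.T)
    # Smoothness regularizer: examples that are similar according to the
    # Laplacian L should receive similar multi-label outputs.
    F_all = X_all @ W
    smoothness = np.trace(F_all.T @ L @ F_all)
    return risk + lam1 * complexity + lam2 * smoothness

def mass_style_gradient(W, X_l, Y_l, X_all, L, C, lam1=1.0, lam2=1.0):
    # Gradient of the objective above (C and L assumed symmetric).
    g_risk = 2 * X_l.T @ (X_l @ W - Y_l)
    g_complexity = 2 * W @ C
    g_smoothness = 2 * X_all.T @ L @ X_all @ W
    return g_risk + lam1 * g_complexity + lam2 * g_smoothness

# Tiny synthetic usage example.
rng = np.random.default_rng(0)
d, q, n_l, n_u = 20, 4, 30, 200
X_all = rng.standard_normal((n_l + n_u, d))
X_l = X_all[:n_l]
Y_l = (rng.random((n_l, q)) > 0.5).astype(float)
# In practice L would be a k-NN graph Laplacian and C a label co-occurrence
# based matrix; identity matrices stand in here to keep the sketch self-contained.
L, C = np.eye(n_l + n_u), np.eye(q)
W = np.zeros((d, q))
for _ in range(200):  # plain gradient descent for illustration only
    W -= 1e-4 * mass_style_gradient(W, X_l, Y_l, X_all, L, C)
```

Because each term is convex in W (and strongly convex once the complexity term is present), any reasonable first-order or alternating scheme converges to the unique global minimizer, which is the property the abstract appeals to.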