Multi-label learning is critical in many real-world application domains, including text classification, image annotation, video semantic annotation, and gene function analysis. Recently, multi-label learning has attracted intensive attention and become a hot research topic in the machine learning community. However, existing methods do not adequately address two key challenges: exploiting correlations between labels and compensating for scarce or even missing labels. To handle these two challenges efficiently, we propose NN_AD_Omega, a neural network model for exploring label dependencies. The model introduces an Omega matrix in the top layer of the network to characterize label dependencies. As a useful by-product, the learned label correlations can improve prediction performance when some of an instance's labels are missing, because they capture the intrinsic structure of the data. To train the model efficiently, we solve the optimization problem with mini-batch gradient descent (Mini-batch-GD) and adopt the AdaGrad technique to adaptively set the learning rate. Experiments on four real-world multi-label datasets demonstrate that the proposed method exploits label correlations, handles missing labels, and achieves better prediction results than state-of-the-art neural-network-based multi-label learning methods.
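The optimizer named in the abstract, mini-batch gradient descent with AdaGrad's per-coordinate adaptive learning rate, can be sketched as follows. This is a minimal illustrative sketch on a simple least-squares objective, not the paper's NN_AD_Omega model; the function name, the quadratic loss, and all hyperparameter values are assumptions for demonstration only.

```python
import numpy as np

def adagrad_minibatch(X, y, lr=1.0, batch_size=8, epochs=200, eps=1e-8, seed=0):
    """Fit w minimizing ||Xw - y||^2 via mini-batch GD with AdaGrad.

    Illustrative only: the paper applies Mini-batch-GD + AdaGrad to its
    neural network objective, not to this toy least-squares problem.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    g_sq = np.zeros(d)  # AdaGrad's running sum of squared gradients
    for _ in range(epochs):
        idx = rng.permutation(n)  # reshuffle before each epoch
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]
            # Gradient of the mean squared residual on the mini-batch
            grad = 2.0 * X[b].T @ (X[b] @ w - y[b]) / len(b)
            g_sq += grad ** 2  # accumulate per-coordinate history
            # Adaptive step: coordinates with large past gradients move less
            w -= lr * grad / (np.sqrt(g_sq) + eps)
    return w

# Usage: recover a known linear map from noiseless synthetic data.
rng = np.random.default_rng(1)
X = rng.normal(size=(64, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true
w_hat = adagrad_minibatch(X, y)
residual = float(np.linalg.norm(X @ w_hat - y))  # should shrink toward 0
```

The per-coordinate denominator `sqrt(g_sq) + eps` is what distinguishes AdaGrad from plain SGD: frequently updated coordinates receive progressively smaller steps, which removes the need to hand-tune a decay schedule.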