    Huang Yiwang, Huang Yuxin, Liu Sheng. A Lightweight Noise Label Learning Method Based on Online Distillation[J]. Journal of Computer Research and Development. DOI: 10.7544/issn1000-1239.202330382

    A Lightweight Noise Label Learning Method Based on Online Distillation

    Training deep learning models on data with noisy labels is an active research topic in machine learning. Studies have shown that label noise makes deep models prone to overfitting. Recently, methods that combine meta-learning with label correction have allowed models to better adapt to noisy data and mitigate overfitting. However, this meta-label correction approach depends on the capacity of the underlying model, and lightweight models do not generalize well under label noise. To address this problem, we propose a knowledge distillation-based meta-label correction learning method (KDMLC). KDMLC treats a meta-label correction model (MLC), composed of a deep neural network and a multilayer perceptron, as a teacher that corrects noisy labels and guides the training of a lightweight student model; at the same time, it adopts a bilevel optimization strategy to train the teacher and enhance its generalization, so that the teacher generates higher-quality pseudo-labels for training the lightweight model. Experiments show that at high noise levels KDMLC improves test accuracy by 5.50% over MLC, and that with Cutout data augmentation on the CIFAR10 dataset KDMLC improves test accuracy by 9.11% over MLC at high noise levels. Experiments on the real-world noisy dataset Clothing1M also show that KDMLC outperforms the other methods, verifying its feasibility and effectiveness.
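    The teacher-guided training of the lightweight student described above can be sketched as a combined objective: a cross-entropy term on the teacher-corrected pseudo-labels plus a temperature-scaled KL divergence to the teacher's softened predictions. The following is a minimal NumPy sketch of such a distillation loss; the function name, the weighting parameter `alpha`, and the temperature value are illustrative assumptions, not the paper's exact formulation.

    ```python
    import numpy as np

    def softmax(logits, temperature=1.0):
        # Numerically stable softmax with optional temperature scaling.
        z = logits / temperature
        z = z - z.max(axis=1, keepdims=True)
        e = np.exp(z)
        return e / e.sum(axis=1, keepdims=True)

    def distillation_loss(student_logits, teacher_logits, pseudo_labels,
                          alpha=0.5, temperature=2.0):
        """Illustrative combined loss (assumed form, not the paper's exact one):
        alpha * CE(student, teacher-corrected pseudo-labels)
        + (1 - alpha) * T^2 * KL(teacher_soft || student_soft)."""
        n = student_logits.shape[0]
        # Hard-label cross-entropy on the corrected pseudo-labels.
        p_student = softmax(student_logits)
        ce = -np.log(p_student[np.arange(n), pseudo_labels] + 1e-12).mean()
        # Soft-label KL divergence at temperature T (scaled by T^2, as in
        # standard knowledge distillation, to keep gradient magnitudes comparable).
        ps_t = softmax(student_logits, temperature)
        pt_t = softmax(teacher_logits, temperature)
        kl = (pt_t * (np.log(pt_t + 1e-12) - np.log(ps_t + 1e-12))).sum(axis=1).mean()
        return alpha * ce + (1 - alpha) * (temperature ** 2) * kl
    ```

    When the student matches the teacher exactly, the KL term vanishes and only the pseudo-label cross-entropy remains, so the loss reduces to `alpha * CE`.
    
    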