    Huang Yiwang, Huang Yuxin, Liu Sheng. A Lightweight Noise Label Learning Method Based on Online Distillation[J]. Journal of Computer Research and Development. DOI: 10.7544/issn1000-1239.202330382

    A Lightweight Noise Label Learning Method Based on Online Distillation


      Abstract: Training deep learning models on noisy data containing corrupted labels is an active research topic in machine learning. Studies have shown that deep learning models are prone to overfitting such noisy data. Recently, a method combining meta-learning with label correction has been shown to help models adapt to noisy data and mitigate overfitting. However, this meta label correction method depends on the performance of the underlying model, and lightweight models do not generalize well under noisy data. To address this problem, we propose a knowledge distillation-based meta-label correction learning method (KDMLC), which treats the meta label correction (MLC) model, composed of a deep neural network and a multilayer perceptron, as a teacher model that corrects noisy labels and guides the training of a lightweight model. At the same time, KDMLC adopts a bilevel optimization strategy to train the teacher model and enhance its generalization ability, so as to generate higher-quality pseudo-labels for training the lightweight model. Experiments show that KDMLC improves test accuracy by 5.50% over the MLC method at high noise levels; with Cutout data augmentation on the CIFAR10 dataset, KDMLC improves test accuracy by 9.11% over MLC at high noise levels; and on the real-world noisy dataset Clothing1M, KDMLC also outperforms the other methods, verifying its feasibility and effectiveness.
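
      The pipeline described in the abstract — a teacher that corrects noisy labels and a lightweight student trained on the resulting soft pseudo-labels — can be illustrated with a toy sketch. Everything below is an illustrative assumption rather than the paper's implementation: the teacher is simulated by simply smoothing the noisy one-hot labels (the mixing weight `alpha` is invented, not from the paper), and the student is a plain linear softmax classifier rather than a lightweight neural network; the bilevel meta-update of the teacher is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Toy data: 2 classes, linearly generated labels, 30% of them flipped (label noise).
n, d, k = 200, 5, 2
X = rng.normal(size=(n, d))
true_w = rng.normal(size=(d, k))
y_true = (X @ true_w).argmax(axis=1)
y_noisy = y_true.copy()
flip = rng.random(n) < 0.3
y_noisy[flip] = 1 - y_noisy[flip]

# Simulated "teacher" pseudo-labels: in KDMLC the teacher is a DNN+MLP
# meta label correction model trained with bilevel optimization; here we
# only mimic its output by softening the noisy one-hot labels.
onehot = np.eye(k)[y_noisy]
alpha = 0.7  # assumed mixing weight, purely illustrative
pseudo = alpha * onehot + (1 - alpha) / k

# Lightweight "student": a linear softmax classifier trained by gradient
# descent on cross-entropy against the teacher's soft pseudo-labels
# (the distillation target).
W = np.zeros((d, k))
lr = 0.5
for _ in range(300):
    p = softmax(X @ W)
    grad = X.T @ (p - pseudo) / n  # gradient of soft-target cross-entropy
    W -= lr * grad

# Evaluate the student against the clean labels it never saw directly.
acc = (softmax(X @ W).argmax(axis=1) == y_true).mean()
print(f"student accuracy vs. clean labels: {acc:.2f}")
```

      Because the flipped labels are random while the true signal is linear, the soft targets let the student average out much of the noise, which is the intuition behind distilling from a label-correcting teacher.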

       
