Wen Yimin, Yuan Zhe, Yu Hang. A New Semi-Supervised Inductive Transfer Learning Framework: Co-Transfer[J]. Journal of Computer Research and Development, 2023, 60(7): 1603-1614. DOI: 10.7544/issn1000-1239.202220232

A New Semi-Supervised Inductive Transfer Learning Framework: Co-Transfer

Funds: The work was supported by the National Natural Science Foundation of China (61866007), the Guangxi Natural Science Foundation (2018GXNSFDA138006), the Key Research and Development Program of Guangxi (Guike AB21220023), and the Program of Guangxi Key Laboratory of Image and Graphic Intelligent Processing (GIIP2005, GIIP201505)
  • Author Bio:

    Wen Yimin: born in 1969. PhD, professor, PhD supervisor. Distinguished member of CCF. His main research interests include machine learning, recommendation systems, and big data analysis

    Yuan Zhe: born in 1995. Master candidate. His main research interests include machine learning and data mining

    Yu Hang: born in 1991. PhD, professor, PhD supervisor. Member of IEEE and CCF. His main research interests include online machine learning, knowledge graphs, and multi-agent systems

  • Received Date: March 17, 2022
  • Revised Date: July 12, 2022
  • Available Online: February 28, 2023
  • In many practical data mining scenarios, such as network intrusion detection, Twitter spam detection, and computer-aided diagnosis, a source domain that is different from but related to the target domain is very common. Generally, a large amount of unlabeled data is available in both the source domain and the target domain, but labeling all of it is difficult, expensive, time-consuming, and sometimes unnecessary. It is therefore important and worthwhile to fully exploit the labeled and unlabeled data in both domains to handle the classification task in the target domain. To combine transfer learning with semi-supervised learning, we propose a new inductive transfer learning framework named Co-Transfer. Co-Transfer first generates one group of three TrAdaBoost classifiers for transfer from the original source domain to the original target domain and, in parallel, another group of three TrAdaBoost classifiers for transfer from the original target domain to the original source domain, with each classifier trained on a bootstrap sample of the original labeled data. In each round of Co-Transfer, each group of TrAdaBoost classifiers is refined on a carefully labeled training set composed of three parts: the original labeled samples, samples pseudo-labeled by one group, and samples pseudo-labeled by the other group. Finally, the group of TrAdaBoost classifiers trained to transfer from the original source domain to the original target domain produces the final hypothesis. Experimental results on UCI datasets and text classification tasks show that Co-Transfer can significantly improve generalization performance by exploiting labeled and unlabeled data across different tasks.
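    The Co-Transfer procedure summarized in the abstract can be illustrated with a short, self-contained Python sketch. This is not the authors' implementation: a plain decision tree stands in for the TrAdaBoost base learner of reference [17], the unanimous-vote rule used here to pick confident pseudo-labels is a simplification of the paper's labeling scheme, and the function names (fit_group, vote, co_transfer) are hypothetical.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier  # stand-in for the TrAdaBoost base learner [17]
        from sklearn.utils import resample

        def fit_group(X_lab, y_lab, n_members=3, seed=0):
            # Train a group of base classifiers, each on a bootstrap sample of the labeled data.
            group = []
            for i in range(n_members):
                Xb, yb = resample(X_lab, y_lab, random_state=seed + i)
                group.append(DecisionTreeClassifier(random_state=seed + i).fit(Xb, yb))
            return group

        def vote(group, X):
            # Majority vote over the group, plus a mask of samples labeled unanimously.
            preds = np.stack([clf.predict(X) for clf in group])  # shape (members, n)
            labels = np.apply_along_axis(lambda c: np.bincount(c.astype(int)).argmax(), 0, preds)
            unanimous = (preds == labels).all(axis=0)
            return labels, unanimous

        def co_transfer(Xs_l, ys_l, Xt_l, yt_l, Xs_u, Xt_u, rounds=5):
            # Xs_l/ys_l and Xt_l/yt_l: labeled source and target data;
            # Xs_u/Xt_u: unlabeled source and target data (non-negative integer labels assumed).
            X_lab = np.vstack([Xs_l, Xt_l])
            y_lab = np.hstack([ys_l, yt_l])
            g_s2t = fit_group(X_lab, y_lab, seed=0)    # group transferring source -> target
            g_t2s = fit_group(X_lab, y_lab, seed=100)  # group transferring target -> source
            for _ in range(rounds):
                # Each group pseudo-labels an unlabeled pool; confident (unanimous)
                # pseudo-labels from both groups augment the original labeled data.
                yt_hat, ok_t = vote(g_s2t, Xt_u)
                ys_hat, ok_s = vote(g_t2s, Xs_u)
                X_aug = np.vstack([X_lab, Xs_u[ok_s], Xt_u[ok_t]])
                y_aug = np.hstack([y_lab, ys_hat[ok_s], yt_hat[ok_t]])
                # Both groups are refit on the augmented pool, each member on its own bootstrap sample.
                g_s2t = fit_group(X_aug, y_aug, seed=0)
                g_t2s = fit_group(X_aug, y_aug, seed=100)
            return g_s2t  # the source-to-target group yields the final hypothesis

    As in the framework itself, the two groups are built from bootstrap samples of the pooled labeled data, exchange pseudo-labels over several rounds, and only the source-to-target group is returned to produce the final hypothesis on target-domain data.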

  • [1]
    Pan S, Yang Qiang. A survey on transfer learning[J]. IEEE Transactions on Knowledge and Data Engineering, 2010, 22(10): 1345−1359 doi: 10.1109/TKDE.2009.191
    [2]
    Zhuang Fuzhen, Qi Zhiyuan, Duan Keyu, et al. A comprehensive survey on transfer learning[J]. Proceedings of the IEEE, 2020, 109(1): 43−76
    [3]
    Liu Xiaobo, Zhang H, Cai Zhihua, et al. A Tri-training based transfer learning algorithm[C]//Proc of the 24th Int Conf on Tools with Artificial Intelligence. Piscataway, NJ: IEEE, 2012: 698−703
    [4]
    Tang Yejun, Wu Bin, Peng Liangrui, et al. Semi-supervised transfer learning for convolution neural network based Chinese character recognition[C]//Proc of the 14th IAPR Int Conf on Document Analysis and Recognition (ICDAR). Piscataway, NJ: IEEE, 2017: 441−447
    [5]
    Abuduweili A, Li Xingjian, Shi H, et al. Adaptive consistency regularization for semi-supervised transfer learning[C]//Proc of the 34th IEEE/CVF Conf on Computer Vision and Pattern Recognition. Piscataway, NJ: IEEE, 2021: 6923−6932
    [6]
    Wei Wei, Meng Deyu, Zhao Qian, et al. Semi-supervised transfer learning for image rain removal[C]//Proc of the 32nd IEEE/CVF Conf on Computer Vision and Pattern Recognition. Piscataway, NJ: IEEE, 2019: 3877−3886
    [7]
    Chebli A, Djebbar A, Marouani H. Semi-supervised learning for medical application: A survey[C]//Proc of the 9th Int Conf on Applied Smart Systems (ICASS). Piscataway, NJ: IEEE, 2018: 1−9
    [8]
    Mohanasundaram R, Malhotra A, Arun R, et al. Deep learning and semi-supervised and transfer learning algorithms for medical imaging[M]//Deep Learning and Parallel Computing Environment for Bioengineering Systems. New York: Academic Press, 2019: 139−151
    [9]
    Liu Quande, Yu Lequan, Luo Luyang, et al. Semi-supervised medical image classification with relation-driven self-ensembling model[J]. IEEE Transactions on Medical Imaging, 2020, 39(11): 3429−3440 doi: 10.1109/TMI.2020.2995518
    [10]
    Liu Qiuhua, Liao Xuejun, Li Hui, et al. Semi-supervised multitask learning[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2008, 31(6): 1074−1086
    [11]
    Skolidis G, Sanguinetti G. Semisupervised multitask learning with Gaussian processes[J]. IEEE Transactions on Neural Networks and Learning Systems, 2013, 24(12): 2101−2112 doi: 10.1109/TNNLS.2013.2272403
    [12]
    Khosravan N, Bagci U. Semi-supervised multi-task learning for lung cancer diagnosis[C]//Proc of the 40th Annual Int Conf of the IEEE Engineering in Medicine and Biology Society. Piscataway, NJ: IEEE, 2018: 710−713
    [13]
    Chu Xu, Lin Yang, Wang Yasha, et al. MLRDA: A multi-task semi-supervised learning framework for drug-drug interaction prediction[C]//Proc of the 28th Int Joint Conf on Artificial Intelligence. New York: ACM, 2019: 4518−4524
    [14]
    Qi Qi, Wang Xiaolu, Sun Haifeng, et al. A novel multi-task learning framework for semi-supervised semantic parsing[J]. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2020, 28: 2552−2560 doi: 10.1109/TASLP.2020.3018233
    [15]
    Van Engelen J, Hoos H. A survey on semi-supervised learning[J]. Machine Learning, 2020, 109(2): 373−440 doi: 10.1007/s10994-019-05855-6
    [16]
    Xu Mengfan, Li Xinghua, Liu Hai, et al. An intrusion detection scheme based on semi-supervised learning and information gain ratio[J]. Journal of Computer Research and Development, 2017, 54(10): 2255−2267 (in Chinese) doi: 10.7544/issn1000-1239.2017.20170456
    [17]
    Dai Wenyuan, Yang Qiang, Xue Guirong, et al. Boosting for transfer learning[C]//Proc of the 24th Int Conf on Machine Learning (ICML '07). New York: ACM, 2007: 193–200
    [18]
    Zhou Zhihua, Li Ming. Tri-training: Exploiting unlabeled data using three classifiers[J]. IEEE Transactions on Knowledge and Data Engineering, 2005, 17(11): 1529−1541 doi: 10.1109/TKDE.2005.186
    [19]
    Bache K, Lichman M. UCI machine learning repository [DB/OL]. 2013 [2022-06-08]. https://archive.ics.uci.edu/ml/index.php
    [20]
    Scrucca L. A fast and efficient modal EM algorithm for Gaussian mixtures[J]. Statistical Analysis and Data Mining: The ASA Data Science Journal, 2021, 14(4): 305−314 doi: 10.1002/sam.11527
    [21]
    Li Ming, Zhou Zhihua. Improve computer-aided diagnosis with machine learning techniques using undiagnosed samples[J]. IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, 2007, 37(6): 1088−1098 doi: 10.1109/TSMCA.2007.904745
    [22]
    Cervantes J, Garcia-Lamont F, Rodriguez-Mazahua L, et al. A comprehensive survey on support vector machine classification: Applications, challenges and trends[J]. Neurocomputing, 2020, 408: 189−215 doi: 10.1016/j.neucom.2019.10.118
    [23]
    Shimomura L, Oyamada R, Vieira M, et al. A survey on graph-based methods for similarity searches in metric spaces[J]. Information Systems, 2021, 95: 101507 doi: 10.1016/j.is.2020.101507
    [24]
    Blum A, Mitchell T. Combining labeled and unlabeled data with co-training[C]//Proc of the 8th Annual Conf on Computational Learning Theory. New York: ACM, 1998: 92−100
    [25]
    Dasgupta S, Littman M, McAllester D. PAC generalization bounds for co-training[C]//Proc of the 14th Int Conf on Neural Information Processing Systems. Cambridge, MA: MIT Press, 2002: 375−382
    [26]
    Triguero I, García S, Herrera F. Self-labeled techniques for semi-supervised learning: Taxonomy, software and empirical study[J]. Knowledge and Information Systems, 2015, 42(2): 245−284 doi: 10.1007/s10115-013-0706-y
    [27]
    Kamishima T, Hamasaki M, Akaho S. TrBagg: A simple transfer learning method and its application to personalization in collaborative tagging[C]//Proc of the 9th IEEE Int Conf on Data Mining. Piscataway, NJ: IEEE, 2009: 219−228
    [28]
    Shi Yuan, Lan Zhenzhong, Liu Wei, et al. Extending semi-supervised learning methods for inductive transfer learning[C]//Proc of the 9th IEEE Int Conf on Data Mining. Piscataway, NJ: IEEE, 2009: 483−492
    [29]
    Liu Zhuang, Liu Chang, Wayne L, et al. Pretraining financial language model with multi-task learning for financial text mining[J]. Journal of Computer Research and Development, 2021, 58(8): 1761−1772 (in Chinese) doi: 10.7544/issn1000-1239.2021.20210298