• China High-Quality Sci-Tech Journal
  • CCF Recommended Class A Chinese Journal
  • T1-Class High-Quality Sci-Tech Journal in Computing
Guo Songyue, Wang Yangqian, Bai Siyuan, Liu Yongheng, Zhou Jun, Wang Mengge, Liao Qing. Federated Adaptive Interaction Model for Mixed Distribution Data[J]. Journal of Computer Research and Development, 2023, 60(6): 1346-1357. DOI: 10.7544/issn1000-1239.202111090

Federated Adaptive Interaction Model for Mixed Distribution Data

Funds: This work was supported by the Guangdong Major Project of Basic and Applied Basic Research (2019B030302002).
More Information
  • Author Bio:

    Guo Songyue: born in 1998. Master candidate. Student member of CCF. His main research interests include federated learning and data mining

    Wang Yangqian: born in 1996. Master. His main research interests include federated learning, privacy preservation, and voiceprint recognition

    Bai Siyuan: born in 1998. Bachelor, research and development engineer. His main research interests include federated learning and machine learning

    Liu Yongheng: born in 1982. Master, senior engineer. His main research interests include cloud computing and system optimization

    Zhou Jun: born in 1978. Master, senior engineer. His main research interests include artificial intelligence and privacy computing

    Wang Mengge: born in 1997. Master, data analyst. Her main research interests include artificial intelligence and privacy computing

    Liao Qing: born in 1988. PhD, professor. Senior member of CCF. Her main research interests include artificial intelligence and data mining

  • Received Date: November 07, 2021
  • Revised Date: August 28, 2022
  • Available Online: March 16, 2023
  • Federated learning is an emerging distributed machine learning paradigm that enables mobile phones and IoT devices to collaboratively learn a shared model while transferring only model parameters, so that private data never leave the devices. However, traditional federated learning models usually assume that the training samples on local devices are independent and identically distributed (IID), which rarely holds in the real world because data distributions differ across devices. Consequently, existing federated learning models cannot achieve satisfactory performance on mixed distributions of Non-IID data. In this paper, we propose a novel federated adaptive interaction model (FedAIM) for mixed distribution data that jointly learns from IID data and Non-IID data at the same time. FedAIM introduces, for the first time, the earth mover's distance (EMD) to measure the degree of distribution bias of each client. It then builds an extremely biased server and a non-extremely biased server to separately process clients with different bias degrees. Finally, a new aggregation mechanism based on information entropy is designed to aggregate and exchange model parameters while reducing the number of communication rounds between servers. Experimental results show that FedAIM outperforms state-of-the-art methods on the real-world image datasets MNIST, CIFAR-10, Fashion-MNIST, SVHN, and FEMNIST.
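  • The two ingredients named in the abstract, a per-client EMD bias measure and entropy-based aggregation weights, can be sketched as follows. The paper's exact formulas are not reproduced on this page, so this is only an illustrative sketch: the function names `emd_bias` and `entropy_weights`, the L1 form of EMD over categorical label histograms (the form popularized for Non-IID federated data), and the bias threshold are assumptions, not FedAIM's published implementation.

    ```python
    import numpy as np

    def emd_bias(client_label_dist, global_label_dist):
        """Degree of bias of one client: earth mover's distance between the
        client's label distribution and the global one. For categorical label
        histograms with unit ground distance this reduces to the L1 distance
        between the two normalized distributions."""
        p = np.asarray(client_label_dist, dtype=float)
        q = np.asarray(global_label_dist, dtype=float)
        p, q = p / p.sum(), q / q.sum()
        return float(np.abs(p - q).sum())

    def entropy_weights(client_label_dists):
        """Aggregation weights proportional to each client's label entropy:
        clients with more balanced (higher-entropy) local data receive
        larger weights when model parameters are aggregated."""
        entropies = []
        for d in client_label_dists:
            p = np.asarray(d, dtype=float)
            p = p / p.sum()
            nz = p[p > 0]                      # ignore empty classes
            entropies.append(float(-(nz * np.log(nz)).sum()))
        entropies = np.asarray(entropies)
        return entropies / entropies.sum()

    # Routing sketch: clients whose EMD exceeds a threshold would be sent to
    # the extremely biased server, the rest to the non-extremely biased one.
    def route_clients(client_label_dists, global_dist, threshold=0.5):
        biases = [emd_bias(d, global_dist) for d in client_label_dists]
        extreme = [i for i, b in enumerate(biases) if b > threshold]
        mild = [i for i, b in enumerate(biases) if b <= threshold]
        return extreme, mild
    ```

    For example, a client holding almost only one class (e.g. label histogram [0.97, 0.01, 0.01, 0.01] against a uniform global distribution) yields a large EMD and a small entropy weight, so it would be routed to the extremely biased server and down-weighted during aggregation.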

  • [1]
    微众银行. 联邦学习白皮书 V2.0 [EB/OL]. 2020[2020-11-01].https: //www.fedai.org/#

    WeBank. White paper V2.0 of federated learning [EB/OL]. 2020 [2020-11-01].https://www.fedai.org/# (in Chinese)
    [2]
    芦效峰,廖钰盈,Lio P,等. 一种面向边缘计算的高效异步联邦学习机制[J]. 计算机研究与发展,2020,57(12):2571−2582 doi: 10.7544/issn1000-1239.2020.20190754

    Lu Xiaofeng, Liao Yuying, Lio P, et al. An asynchronous federated learning mechanism for edge network computing[J]. Journal of Computer Research and Development, 2020, 57(12): 2571−2582 (in Chinese) doi: 10.7544/issn1000-1239.2020.20190754
    [3]
    董业,侯炜,陈小军,等. 基于秘密分享和梯度选择的高效安全联邦学习[J]. 计算机研究与发展,2020,57(10):2241−2250 doi: 10.7544/issn1000-1239.2020.20200463

    Dong Ye, Hou Wei, Chen Xiaojun, et al. Efficient and secure federated learning based on secret sharing and gradients selection[J]. Journal of Computer Research and Development, 2020, 57(10): 2241−2250 (in Chinese) doi: 10.7544/issn1000-1239.2020.20200463
    [4]
    Yang Qiang, Liu Yang, Chen Tianjian, et al. Federated machine learning: Concept and applications[J]. ACM Transactions on Intelligent Systems and Technology, 2019, 10(2): 1−19
    [5]
    Zhao Yue, Li Ming, Lai Liangzhen, et al. Federated learning with Non-IID data [J]. arxiv preprint, arXiv: 1806.00582, 2018
    [6]
    Huang Xixi, Jian Guan, Zhang Bin, et al. Differentially private convolutional neural networks with adaptive gradient descent [C] //Proc of the 4th IEEE Int Conf on Data Science in Cyberspace (DSC). Piscataway, NJ: IEEE, 2019: 642−648
    [7]
    Huang Xixi, Ding Ye, Zoe L, et al. DP-FL: A novel differentially private federated learning framework for the unbalanced data[J]. World Wide Web, 2020, 23(4): 2529−2545 doi: 10.1007/s11280-020-00780-4
    [8]
    Chen Yujing, Ning Yue, Slawski M, et al. Asynchronous online federated learning for edge devices with Non-IID data [C] //Proc of the 8th IEEE Int Conf on Big Data. Piscataway, NJ: IEEE, 2020: 15−24
    [9]
    Kairouz P, McMahan H, Avent B, et al. Advances and open problems in federated learning[J]. Foundations and Trends in Machine Learning, 2021, 14(1): 1−210
    [10]
    McMahan B, Moore E, Ramage D, et al. Federated learning: Communication-efficient learning of deep networks from decentralized data [C] //Proc of the 20th Int Conf on Artificial Intelligence and Statistics. Cambridge, MA: MIT, 2017: 1273−1282
    [11]
    Sattler F, Wiedemann S, Muller K, et al. Robust and communication-efficient federated learning from Non-IID[J]. IEEE Transactions on Neural Networks and Learning Systems, 2019, 31(9): 3400−3413
    [12]
    Li Tian, Sahu A, Zaheer M, et al 2020. Federated optimization in heterogeneous networks [C] //Proc of the 3rd Conf on Machine Learning and Systems (MLSys). Cambridge, MA: MIT, 2020 : 429−450
    [13]
    Smith V, Chiang C, Sanjabi M, et al. Federated multi-task learning [C] //Proc of the 31st Neural Information Processing Systems (NIPS). Cambridge, MA: MIT, 2017: 4424−4434
    [14]
    Duan M, Liu Duo, Chen Xianzhang, et al. Astraea: Self-balancing federated learning for improving classification accuracy of mobile deep learning applications [C] //Proc of the 37th IEEE Int Conf on Computer Design (ICCD). Piscataway, NJ: IEEE, 2019 : 246−254
    [15]
    Zhang Wwenyu, Wang Xiumin, Zhou Pan, et al. Client selection for federated learning with Non-IID data in mobile edge computing[J]. IEEE Access, 2021, 9: 24462−24474 doi: 10.1109/ACCESS.2021.3056919
    [16]
    Rubner Y, Tomasi C, Guibas J, et al. The earth mover’s distance as a metric for image retrieval[J]. International Journal of Computer Vision, 2000, 40(2): 99−121 doi: 10.1023/A:1026543900054
    [17]
    Ren Jinfu, Liu Yang, Liu Jiming. EWGAN: Entropy-based wasserstein gan for imbalanced learning [C] //Proc of the 33rd AAAI Conf on Artificial Intelligence (AAAI). Palo Alto, CA: AAAI, 2019: 10011−10012
    [18]
    Yao Xin, Huang Tianchi, Wu Chenglei, et al. Towards faster and better federated learning: A feature fusion approach [C] //Proc of the 26th IEEE Int Conf on Image Processing (ICIP). Piscataway, NJ: IEEE, 2019: 175−179
    [19]
    Li Xiang, Huang Kaixuan, Yang Wenhao, et al. On the convergence of fedavg on Non-IID data [C/OL] //Proc of the 8th Int Conf on Learning Representations (ICLR). Piscataway, NJ: IEEE, 2020[2021-08-17].https://arxiv.org/abs/1907.02189
    [20]
    Meng Fanxu, Cheng Hao, Li Ke, et al. Filter grafting for deep neural networks [C] //Proc of the 22nd IEEE/CVF Conf on Computer Vision and Pattern Recognition (CVPR). Piscataway, NJ: IEEE, 2020 : 6599−6607
    [21]
    Cheng Hao, Lian Dongze, Gao Shenghua, et al. Utilizing information bottleneck to evaluate the capability of deep neural networks for image classification[J]. Entropy, 2019, 21(5): 456−456 doi: 10.3390/e21050456
    [22]
    Shwartz-Ziv R, Tishby N. Opening the black box of deep neural networks via information [J]. arXiv preprint, arXiv: 1703.00810, 2017
    [23]
    Lecun Y, Bottou L, Bengio Y, et al. Gradient-based learning applied to document recognition[J]. Proceedings of the IEEE, 1998, 86(11): 2278−2324 doi: 10.1109/5.726791
    [24]
    LeCun Y, Bottou L, Orr G, et al. Efficient backprop [G] //LNCS 2172: Proc of the 3rd Int Conf on Coopertive Information Systems. Berlin: Springer, 1998: 9−48
    [25]
    He Kaiming, Zhang Xiangyu, Ren Shaoqing, et al. Deep residual learning for image recognition [C] //Proc of the 21st IEEE/CVF Conf on Computer Vision and Pattern Recognition (CVPR). Piscataway, NJ: IEEE, 2016: 770−778
    [26]
    Wang Jianyu, Liu Qinghua, Liang Hao, et al. Tackling the objective inconsistency problem in heterogeneous federated optimization [C] //Proc of the 34th Neural Information Processing Systems (NIPS). Cambridge, MA: MIT, 2020: 7611−7623
    [27]
    Karimireddy S, Kale S, Mohri M, et al. Scaffold: Stochastic controlled averaging for federated learning [C] //Proc of the 37th Int Conf on Machine Learning (ICML). New York: ACM, 2020: 5132−5143
  • Cited by

    Periodical cited type (4)

    1. Yang Jieyi, Dong Yihong, Qian Jiangbo. Research progress on few-shot learning methods based on graph neural networks. Journal of Computer Research and Development. 2024(04): 856-876.
    2. Qin Zhilong, Deng Kun, Liu Xingyan. A heterogeneous graph neural network algorithm based on meta-path convolution. Telecommunications Science. 2024(03): 89-103.
    3. Bai Yukang, Chen Yanmin, Fan Xiaochao, Sun Ruijun, Li Weijie. A numerical reasoning method with graph neural networks and numerically induced regularization. CAAI Transactions on Intelligent Systems. 2024(05): 1268-1276.
    4. Chen Dongyang, Guo Jinli. A node classification method for higher-order networks based on graph attention. Application Research of Computers. 2023(04): 1095-1100+1136.

    Other cited types (4)
