    Chen Lüjun, Xiao Di, Yu Zhuyang, Huang Hui, Li Min. Communication-Efficient Federated Learning Based on Secret Sharing and Compressed Sensing[J]. Journal of Computer Research and Development, 2022, 59(11): 2395-2407. DOI: 10.7544/issn1000-1239.20220526

    Communication-Efficient Federated Learning Based on Secret Sharing and Compressed Sensing


  Abstract: The rapid development of deep learning technology has brought great convenience, but it has also led to the disclosure of large amounts of private data. Federated learning (FL) allows clients to jointly train a model by sharing only gradients, which seems to solve the privacy-leakage problem, but studies show that the gradients transmitted in FL frameworks can still leak private information. Moreover, the high communication cost of FL makes it difficult to apply in resource-constrained environments. Therefore, we propose two communication-efficient and secure FL algorithms, which use Top-K sparsification and compressed sensing to reduce the communication overhead caused by gradient transmission, and further use additive secret sharing from secure multi-party computation (MPC) to encrypt the important gradient measurements, so as to reduce communication overhead while enhancing security. The main difference between the two algorithms is what the client transmits to the server: the gradient measurements in one, and the quantized gradient measurements in the other. Experiments on the MNIST and Fashion-MNIST datasets show that, compared with other algorithms, the proposed algorithms further increase security at a lower communication cost while also achieving good model accuracy.
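  The client-side pipeline summarized above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the gradient dimension N, sparsity K, measurement count M, the Gaussian measurement matrix Phi, and the two-share splitting are all illustrative assumptions.

  ```python
  import numpy as np

  rng = np.random.default_rng(0)

  # Toy per-client gradient vector (dimensions are illustrative only).
  N, K, M = 64, 8, 24
  grad = rng.normal(size=N)

  # --- Top-K sparsification: zero all but the K largest-magnitude entries. ---
  idx = np.argsort(np.abs(grad))[-K:]
  sparse_grad = np.zeros(N)
  sparse_grad[idx] = grad[idx]

  # --- Compressed sensing: compress the K-sparse vector with M < N random
  # Gaussian measurements y = Phi @ x, so only an M-dimensional vector is
  # transmitted instead of the N-dimensional gradient. ---
  Phi = rng.normal(size=(M, N)) / np.sqrt(M)
  y = Phi @ sparse_grad

  # --- Additive secret sharing: split y into two shares with y = s1 + s2.
  # A single share is statistically uninformative on its own; summing the
  # shares (e.g., across non-colluding servers) recovers y exactly. ---
  s1 = rng.normal(size=M)
  s2 = y - s1
  assert np.allclose(s1 + s2, y)
  ```

  The second algorithm described in the abstract would additionally quantize y before sharing; on the server side, the aggregated measurements are recovered into a gradient estimate with a compressed-sensing reconstruction algorithm, which is beyond this sketch.
  
  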
