ISSN 1000-1239 CN 11-1777/TP

Journal of Computer Research and Development ›› 2020, Vol. 57 ›› Issue (10): 2241-2250. doi: 10.7544/issn1000-1239.2020.20200463

Special Issue: 2020 Special Topic on Cryptography and Data Privacy Protection Research


Efficient and Secure Federated Learning Based on Secret Sharing and Gradients Selection

Dong Ye1,2, Hou Wei1,2, Chen Xiaojun1, Zeng Shuai1   

  1 (Institute of Information Engineering, Chinese Academy of Sciences, Beijing 100195); 2 (School of Cyber Security, University of Chinese Academy of Sciences, Beijing 101408)
  • Online: 2020-10-01

Abstract: In recent years, federated learning (FL) has emerged as a collaborative machine learning method in which distributed users can jointly train models by sharing only gradients. To prevent privacy leakage from gradients, secure multi-party computation (MPC) has recently been considered a promising safeguard. Meanwhile, some researchers have proposed Top-K gradient selection algorithms to reduce the traffic required to synchronize gradients among distributed users. However, few existing works balance the advantages of these two areas. We combine secret sharing with Top-K gradient selection to design efficient and secure federated learning protocols, cutting communication overheads and improving efficiency during the training phase while guaranteeing users' privacy and data security. We also propose an efficient method to construct a message authentication code (MAC) to verify the validity of the aggregated results returned by the servers; the communication overhead introduced by the MAC is small and independent of the number of shared gradients. In addition, we implement a prototype system. Compared with plaintext training, our secure techniques introduce only small additional communication and computation overheads while achieving the same level of accuracy.
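The core idea described in the abstract — each user uploads additive secret shares of only its Top-K gradients, and non-colluding servers aggregate the shares so that only the sum is revealed — can be illustrated with a minimal sketch. This is not the paper's actual protocol: the field modulus, the fixed-point encoding, the two-server setup, and all names below are illustrative assumptions, and the MAC-based result verification is omitted.

```python
import random

PRIME = 2**61 - 1  # field modulus for secret sharing (illustrative choice)
SCALE = 10**6      # fixed-point scaling to embed float gradients in Z_p

def top_k(gradients, k):
    # keep only the k largest-magnitude gradients as (index, value) pairs
    idx = sorted(range(len(gradients)),
                 key=lambda i: abs(gradients[i]), reverse=True)[:k]
    return sorted((i, gradients[i]) for i in idx)

def share(value, n):
    # additive secret sharing over Z_p: n shares that sum to value mod p
    shares = [random.randrange(PRIME) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def encode(x):
    # signed fixed-point -> field element
    return round(x * SCALE) % PRIME

def decode(x):
    # field element -> signed fixed-point
    if x > PRIME // 2:
        x -= PRIME
    return x / SCALE

# Example: two users, two non-colluding servers, k = 2 of 4 gradients
users = [[0.5, -0.02, 0.31, 0.0], [0.1, 0.4, -0.25, 0.01]]
n_servers, k = 2, 2

# each server accumulates the shares it receives, keyed by gradient index
server_totals = [dict() for _ in range(n_servers)]
for grads in users:
    for i, g in top_k(grads, k):
        for s, sh in enumerate(share(encode(g), n_servers)):
            server_totals[s][i] = (server_totals[s].get(i, 0) + sh) % PRIME

# reconstruction: summing the servers' per-index totals yields only the
# aggregated gradient, never any individual user's contribution
aggregated = {i: decode(sum(t.get(i, 0) for t in server_totals) % PRIME)
              for i in set().union(*server_totals)}
```

Here user 1 selects indices 0 and 2, user 2 selects indices 1 and 2, so `aggregated` holds 0.5 at index 0, 0.4 at index 1, and 0.31 − 0.25 = 0.06 at index 2, while index 3 is never transmitted — showing how Top-K selection shrinks traffic and secret sharing hides individual gradients.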

Key words: security, privacy, secret sharing, gradients selection, federated learning
