    Chen Lüjun, Xiao Di, Yu Zhuyang, Huang Hui, Li Min. Communication-Efficient Federated Learning Based on Secret Sharing and Compressed Sensing[J]. Journal of Computer Research and Development, 2022, 59(11): 2395-2407. DOI: 10.7544/issn1000-1239.20220526

    Communication-Efficient Federated Learning Based on Secret Sharing and Compressed Sensing

    Abstract: The rapid development of deep learning has brought great convenience, but it has also led to the disclosure of large amounts of private data. Federated learning (FL) allows clients to jointly train a model by sharing only gradients, which appears to solve the privacy leakage problem; however, research shows that the gradients transmitted in FL frameworks can still leak private information. Moreover, the high communication cost of FL makes it difficult to deploy in resource-constrained environments. We therefore propose two communication-efficient and secure FL algorithms. They use Top-K sparsification and compressed sensing to reduce the communication overhead of gradient transmission, and further use additive secret sharing from secure multi-party computation (MPC) to encrypt the compressed measurements of the important gradient parameters, achieving both lower communication overhead and stronger security. The main difference between the two algorithms is what the client and server exchange: the gradient measurements themselves, or quantized versions of those measurements. Experiments on the MNIST and Fashion-MNIST datasets show that, compared with other algorithms, the proposed algorithms further improve security at a lower communication cost and achieve better model accuracy.
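
    The minimal Python sketch below illustrates the client-side pipeline described in the abstract: Top-K sparsification of a gradient, a compressed-sensing measurement with a random matrix shared by all clients, and additive secret sharing of the measurement. It is not the authors' implementation; the function names, matrix sizes, fixed-point scale, and modulus are illustrative assumptions.

    # Illustrative sketch only (assumed names and parameters, not the paper's code).
    import numpy as np

    def top_k_sparsify(grad, k):
        """Keep only the k largest-magnitude entries of the gradient."""
        sparse = np.zeros_like(grad)
        idx = np.argpartition(np.abs(grad), -k)[-k:]
        sparse[idx] = grad[idx]
        return sparse

    def cs_measure(sparse_grad, phi):
        """Compressed-sensing measurement y = Phi @ x, with m << n rows in Phi."""
        return phi @ sparse_grad

    def additive_shares(y, n_parties, scale=1 << 16, modulus=1 << 32):
        """Split a fixed-point encoding of y into n_parties additive shares mod `modulus`."""
        fixed = np.round(y * scale).astype(np.int64) % modulus
        shares = [np.random.randint(0, modulus, size=y.shape, dtype=np.int64)
                  for _ in range(n_parties - 1)]
        shares.append((fixed - sum(shares)) % modulus)
        return shares  # the shares sum to the fixed-point measurement mod `modulus`

    # Toy usage: 1000-dimensional gradient, top 10% kept, 4x compression, 3 shares.
    rng = np.random.default_rng(0)
    grad = rng.normal(size=1000)
    phi = rng.normal(size=(250, 1000)) / np.sqrt(250)  # measurement matrix shared by all clients
    y = cs_measure(top_k_sparsify(grad, 100), phi)
    shares = additive_shares(y, n_parties=3)

    In such a scheme, each share would be sent to a different non-colluding server; only the sum of all shares (mod the modulus) reveals the measurement, which the server side can then aggregate and reconstruct.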
