Abstract:
Federated learning is an emerging distributed machine learning paradigm that enables mobile phones and IoT devices to learn a shared model by transferring only model parameters, thereby protecting private data. However, traditional federated learning models usually assume that training samples are independent and identically distributed (IID) across local devices, which is not feasible in the real world because data distributions differ from device to device. Hence, existing federated learning models cannot achieve satisfactory performance on mixed-distribution Non-IID data. In this paper, we propose a novel federated adaptive interaction model (FedAIM) for mixed-distribution data that jointly learns from IID and Non-IID data at the same time. In FedAIM, the earth mover's distance (EMD) is introduced for the first time to measure the degree of bias of different clients. Then, an extremely biased server and a non-extremely biased server are built to separately process clients with different degrees of bias. Finally, a new aggregation mechanism based on information entropy is designed to aggregate and exchange model parameters, reducing the number of communication rounds among servers. Experimental results show that FedAIM outperforms state-of-the-art methods on the real-world image datasets MNIST, CIFAR-10, Fashion-MNIST, SVHN, and FEMNIST.
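The abstract describes partitioning clients by their degree of label bias using the earth mover's distance. The following is a minimal sketch of that idea, not the paper's actual algorithm: it computes the EMD between each client's label distribution and the global label distribution (with unit ground distance between adjacent classes), then splits clients into extremely biased and non-extremely biased groups. The threshold `tau` and the helper names are hypothetical.

```python
import numpy as np

def emd(p, q):
    # Earth mover's distance between two 1-D categorical distributions
    # with unit ground distance between adjacent classes: equal to the
    # L1 norm of the difference of their CDFs.
    return float(np.abs(np.cumsum(p) - np.cumsum(q)).sum())

def partition_clients(client_label_hists, global_dist, tau=0.5):
    # Split clients into "extremely biased" and "non-extremely biased"
    # groups by comparing each client's normalized label histogram to
    # the global label distribution. tau is a hypothetical threshold.
    extreme, regular = [], []
    for cid, hist in client_label_hists.items():
        p = np.asarray(hist, dtype=float)
        p = p / p.sum()
        if emd(p, global_dist) > tau:
            extreme.append(cid)
        else:
            regular.append(cid)
    return extreme, regular

# Example: client "a" holds only class 0, client "b" is uniform.
global_dist = np.full(10, 0.1)
hists = {"a": np.eye(10)[0], "b": np.ones(10)}
extreme, regular = partition_clients(hists, global_dist, tau=0.5)
# "a" lands in the extremely biased group, "b" in the other.
```

In a two-server design like the one the abstract outlines, each group would then be routed to its dedicated server for separate processing.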