Abstract:
The consensus mechanism is an important part of blockchain technology, but mainstream consensus mechanisms, especially proof of work, suffer from problems such as wasted computing power and low throughput. Federated learning, as a distributed machine learning method, requires a large amount of computing power for the local training of learning models and for the final calculation of participant contributions. Therefore, we propose TFchain, a trusted and fair blockchain framework that supports adaptive federated learning tasks, to explore how the computing power wasted by the original consensus mechanism can be reused to improve the efficiency of federated learning. First, we design a new consensus mechanism, PoTF (proof of trust and fairness), based on blockchain and federated learning, which treats blockchain nodes as federated learning participants and redirects the large amount of computing power that the original consensus mechanism spends on ineffective hash computation to the training of local models and the evaluation of participants’ contributions. Second, while improving blockchain transaction throughput, PoTF evaluates the contributions of federated learning participants and incentivizes them accordingly. Finally, we design an algorithm to prevent malicious behavior by nodes. The experimental results show that the proposed TFchain effectively improves the transaction processing performance of the blockchain while recycling computing power, and provides effective positive incentives to participants who actively engage in federated learning.