AGABA: An Adaptive Gradient Accumulation Backdoor Attack for Federated Learning
Abstract
This paper proposes AGABA, an adaptive gradient accumulation backdoor attack framework for federated learning systems. The method combines an adaptive sub-block trigger (AST) with a multi-stage gradient accumulation mechanism (MGA), effectively addressing the trade-off between concealment and persistence that traditional backdoor attacks face in federated environments. AST decomposes the complete trigger into multiple independent components through dynamic transparency control and distributed sub-block superposition, allowing malicious clients to remain highly concealed while collectively constructing a global trigger pattern. MGA applies a three-stage attack strategy (initial accumulation, gradient accumulation, and attack implementation), combined with a parameter-importance-aware mechanism, to realize the latency and later activation of malicious updates during model aggregation through gradual gradient accumulation across rounds. The framework uses momentum-accelerated gradient-difference propagation and adaptive memory-factor adjustment to keep attack gradients within the legitimate distribution range, effectively evading detection mechanisms based on statistical anomalies. Experiments show that with 20% malicious client participation, AGABA maintains a high backdoor attack success rate under a variety of mainstream defense mechanisms, outperforming existing single-technique attack methods.
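The momentum-accelerated, cross-round accumulation described above can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the paper's implementation: the function name `accumulate_malicious_update`, the `memory_factor` release rule, and the norm-clipping step are all illustrative assumptions about how a malicious client might keep each submitted update statistically close to a benign one while the backdoor direction builds up in a hidden buffer across rounds.

```python
import numpy as np

def accumulate_malicious_update(backdoor_grad, benign_grad, state,
                                momentum=0.9, memory_factor=0.5,
                                clip_norm=1.0):
    """One round of cross-round gradient accumulation (illustrative sketch).

    The difference between the backdoor gradient and a benign gradient is
    folded into a momentum buffer; only a fraction (the memory factor) is
    released each round, and the submitted update is clipped to a
    benign-looking norm so statistical anomaly detectors see nothing odd.
    """
    # momentum-accelerated accumulation of the gradient difference
    diff = backdoor_grad - benign_grad
    state["buffer"] = momentum * state["buffer"] + diff

    # adaptive memory factor: release only part of the buffer this round,
    # keeping the rest latent for later rounds
    release = memory_factor * state["buffer"]
    state["buffer"] = state["buffer"] - release

    # submitted update = benign update plus the released malicious portion
    update = benign_grad + release

    # clip to a typical benign norm so the update stays in the "legal" range
    norm = np.linalg.norm(update)
    if norm > clip_norm:
        update = update * (clip_norm / norm)
    return update, state

state = {"buffer": np.zeros(4)}          # per-client persistent state
rng = np.random.default_rng(0)
for _ in range(5):                       # five federated rounds
    benign = rng.normal(size=4) * 0.1
    backdoor = benign + np.array([0.5, 0.0, 0.0, 0.0])
    update, state = accumulate_malicious_update(backdoor, benign, state)
```

Clipping against `clip_norm` is one simple way to model "staying within the legal distribution range"; a real detector-aware attacker would estimate that bound from observed benign updates rather than fix it as a constant.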