SpikingReLU: A High-Performance Spiking Neural Network Based on Hybrid Model Conversion
Abstract
Brain-inspired spiking neural networks (SNNs) have shown considerable promise due to their strong spatiotemporal encoding capabilities and event-driven computation. Current mainstream SNN approaches fall into two categories: converting pre-trained Artificial Neural Networks (ANNs) into SNNs (ANN conversion methods) and directly training SNNs. However, ANN conversion methods require large time steps to reduce conversion errors, while directly trained SNNs suffer from poor representational capacity. To address these challenges, we propose a high-performance spiking neural network based on hybrid model conversion. The proposed method decouples model training from inference. During the training phase, a portion of the spiking neurons in the SNN is strategically replaced with ReLU activation functions to enhance feature learning, thereby constructing a high-performance hybrid model that integrates both ANN and SNN components. Since the ANN part of the hybrid model cannot perform event-driven computation, we further introduce a reparameterization technique that converts the entire inference process of the hybrid model to event-driven computation without performance loss. The proposed method thus combines high performance with event-driven computation. Experimental results show that the proposed method achieves classification accuracies of 97.31% and 83.34% on CIFAR-10 and CIFAR-100, respectively, 70.89% on ImageNet, and 82.71% on the neuromorphic dataset CIFAR10-DVS, outperforming state-of-the-art methods.
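The intuition behind replacing spiking neurons with ReLU during training (and converting back for inference) can be illustrated with a minimal sketch. This is not the paper's method, only a toy demonstration of the standard rate-coding argument that underlies ANN-SNN conversion: an integrate-and-fire (IF) neuron with reset-by-subtraction, driven by a constant input for T time steps, produces an average output that approximates the ReLU of that input (clipped at the firing threshold). All function names here are hypothetical.

```python
# Toy illustration (hypothetical names): rate-coded IF neuron vs. ReLU.
# With reset-by-subtraction, the spike rate over T steps approximates
# ReLU(x) for inputs below the threshold -- the basis of ANN-SNN conversion.

def relu(x):
    return max(0.0, x)

def if_neuron_rate(x, T=100, threshold=1.0):
    """Average output of an IF spiking neuron over T steps, constant input x."""
    v, spikes = 0.0, 0
    for _ in range(T):
        v += x                      # integrate the input current
        if v >= threshold:          # fire when membrane potential crosses threshold
            spikes += 1
            v -= threshold          # reset by subtraction keeps residual charge
    return spikes * threshold / T   # rate-coded output

for x in (-0.5, 0.3, 0.8):
    print(f"x={x:+.1f}  ReLU={relu(x):.2f}  IF rate={if_neuron_rate(x):.2f}")
```

The approximation error shrinks as T grows, which is why conversion methods need large time steps; the hybrid training described in the abstract sidesteps this by using exact ReLU activations during training.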