Abstract:
With the democratization of AI, deep neural networks (DNNs) have been widely deployed on edge devices such as smartphones and autonomous driving platforms. Stochastic computing (SC) is a promising technique that performs fundamental machine learning (ML) operations using simple logic gates instead of complicated binary arithmetic circuits, enabling low-power and low-cost DNN execution on edge devices with constrained resources (e.g., energy, computation, and memory). However, previous SC work designs only a single group of settings for a fixed hardware implementation, ignoring dynamic hardware resources (e.g., battery level), which leads to low hardware efficiency and short battery life. To save energy on battery-powered edge devices, dynamic voltage and frequency scaling (DVFS) is widely used for hardware reconfiguration to prolong battery life. In this paper, we propose RR-SC, a run-time reconfigurable framework for SC-based DNNs, which, to the best of our knowledge, is the first attempt to combine hardware and software reconfiguration to satisfy the inference time constraint while maximally saving energy. RR-SC uses reinforcement learning (RL) to generate multiple groups of model settings at once, which satisfy the accuracy constraints under different hardware settings (i.e., different voltage/frequency levels) and achieve the best trade-off between accuracy and hardware efficiency. Moreover, the model settings are switched on a single backbone model at run-time, enabling lightweight software reconfiguration. Experimental results show that RR-SC can switch the lightweight settings within 110 ms to guarantee the required real-time constraint at different hardware levels. Meanwhile, it achieves up to a 7.6x increase in the number of model inferences with only a 1% accuracy loss.