Abstract:
In mobile edge computing, a local device can offload tasks to servers at the edge of the network for data storage and computation, thereby reducing service delay and power consumption; the task offloading decision therefore has great research value. This paper first constructs an offloading model for large-scale heterogeneous mobile edge computing with multiple service nodes and multiple dependencies among mobile tasks. An improved deep reinforcement learning algorithm is then proposed to optimize the task offloading strategy, taking practical mobile edge computing application scenarios into account. Finally, the advantages and disadvantages of each offloading strategy are analyzed by comprehensively comparing energy consumption, cost, load balancing, delay, network usage, and average execution time. Simulation results show that the improved HERDRQN algorithm, which is based on a long short-term memory (LSTM) network and hindsight experience replay (HER), performs well in terms of energy consumption, cost, load balancing, and delay. In addition, this paper offloads a fixed number of applications under each algorithm's strategy and compares the distribution of heterogeneous devices across different CPU utilization levels, verifying the relationship between the offloading strategy and each evaluation index and demonstrating that the strategy generated by the HERDRQN algorithm is scientific and effective for solving the task offloading problem.