Abstract:
Differential evolution (DE) is a simple and efficient population-based stochastic real-parameter optimization algorithm that has been applied to a wide range of complex optimization problems. However, standard DE has some drawbacks, such as premature convergence, low convergence precision, and a slow convergence rate in the later stage of evolution. To address these drawbacks, a novel hybrid differential evolution algorithm (DEGSA-SL) is proposed by combining DE with the gravitational search algorithm (GSA). In the proposed algorithm, a new operator called threshold statistical learning is designed. Through this operator, the better of the DE and GSA strategies is selected adaptively at each iteration, by learning from the previous success ratios of the two strategies, to produce the next generation. This makes full use of the potential of both DE and GSA, maintains the balance between global exploration and local exploitation in the solution space, and improves the global search capability of the standard DE algorithm. Several complex benchmark functions are employed to test the performance of DEGSA-SL. The results show that the proposed algorithm not only achieves better convergence precision, robustness, and convergence rate, but also effectively avoids premature convergence.
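The success-ratio-based strategy selection described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact operator: the window size, the probabilistic selection rule, and the function and variable names (`adaptive_select`, `history_de`, `history_gsa`) are assumptions introduced here for clarity.

```python
import random

def adaptive_select(history_de, history_gsa, window=20):
    """Choose 'DE' or 'GSA' for the next generation based on recent
    success ratios (1 = the strategy produced an improved offspring,
    0 = it did not). Illustrative sketch only."""
    recent_de = history_de[-window:]
    recent_gsa = history_gsa[-window:]
    ratio_de = sum(recent_de) / max(len(recent_de), 1)
    ratio_gsa = sum(recent_gsa) / max(len(recent_gsa), 1)
    total = ratio_de + ratio_gsa
    if total == 0:
        # no success information yet: pick a strategy at random
        return random.choice(['DE', 'GSA'])
    # select a strategy with probability proportional to its success ratio
    return 'DE' if random.random() < ratio_de / total else 'GSA'
```

Under this sketch, a strategy that has recently produced more improved offspring is more likely to be chosen for the next generation, which mirrors the adaptive balance between exploration and exploitation the abstract describes.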