
Hybrid Differential Evolution Gravitational Search Algorithm Based on Threshold Statistical Learning

• Abstract: To remedy the defects of the basic differential evolution algorithm in solving complex optimization problems, namely premature convergence, low solution accuracy, and slow convergence in the later stage of evolution, a hybrid differential evolution gravitational search algorithm based on threshold statistical learning is proposed by combining the advantages of the gravitational search algorithm. Through threshold statistical learning, the algorithm makes full use of the global optimization ability of differential evolution and the population-exploitation ability of the gravitational search algorithm in the later evolutionary stage. During evolution, it adaptively selects the better of the two strategies to generate the next population according to their success rates over the preceding learning generations, maintaining the balance between exploration and exploitation in the solution space and improving the algorithm's global search ability. Simulation results on several classical complex benchmark functions show that the improved algorithm achieves high solution accuracy, fast convergence, and strong robustness, and effectively avoids premature convergence.


Abstract: Differential evolution (DE) is a simple and efficient population-based stochastic real-parameter optimization algorithm, which has been applied to a wide range of complex optimization problems. However, the standard DE has some drawbacks, such as premature convergence, low convergence precision and slow convergence rate in the later stage of evolution. To deal with these drawbacks, a novel hybrid differential evolution algorithm (DEGSA-SL) is proposed by combining the advantages of the gravitational search algorithm (GSA). In the proposed algorithm, a new operator called threshold statistical learning is designed. By introducing this operator, the better of the DE and GSA strategies can be selected adaptively, by learning from the previous success ratios of the two strategies, to produce the next generation at each iteration of the evolution process. It makes full use of the potential of DE and GSA, ensures the balance between global exploration and local exploitation abilities in the solution space, and improves the global search capability of the standard DE algorithm. Several complex benchmark functions are employed to test the performance of DEGSA-SL. The results show that the proposed algorithm not only achieves better convergence precision, robustness and convergence rate, but also avoids the premature convergence problem effectively.


