    Citation: Cheng Daning, Zhang Hanping, Xia Fen, Li Shigang, Yuan Liang, Zhang Yunquan. AccSMBO: Using Hyperparameters Gradient and Meta-Learning to Accelerate SMBO[J]. Journal of Computer Research and Development, 2020, 57(12): 2596-2609. DOI: 10.7544/issn1000-1239.2020.20190670

    AccSMBO: Using Hyperparameters Gradient and Meta-Learning to Accelerate SMBO

    • Abstract: To exploit the high-probability range of the best hyperparameters and hyperparameter gradients, this paper proposes an accelerated sequential model-based optimization (SMBO) algorithm, AccSMBO. AccSMBO uses a gradient-based multikernel Gaussian process regression method with good noise resistance, together with a meta-acquisition function built on a meta-learning dataset; the parallel algorithm that AccSMBO naturally induces adds a resource-scheduling scheme likewise based on the meta-learning dataset. The gradient-based multikernel Gaussian process regression prevents noise in the hyperparameter gradients from corrupting the fitted Gaussian process and speeds up the construction of a good hyperparameter-performance model. The meta-acquisition function reads the meta-learning dataset to summarize the high-probability range of the best hyperparameters, accelerating the search for the optimal hyperparameters. In the parallel algorithm, the resource-scheduling method devotes more parallel computing resources to evaluating hyperparameters within the high-probability range of the best hyperparameters, so that this range is explored faster. Together, these three techniques fully exploit hyperparameter gradients and the high-probability range of the best hyperparameters to accelerate the SMBO algorithm. In the experiments, compared with SMAC (sequential model-based algorithm configuration, an implementation of the traditional SMBO algorithm), the gradient-based HOAG (hyperparameter optimization with approximate gradient) algorithm, and the widely used random search, AccSMBO found the best-performing hyperparameters while using the fewest resources.
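    The abstract's first component, Gaussian process regression that also conditions on hyperparameter gradients, can be illustrated in a few lines. The sketch below is not the paper's multikernel construction: it assumes a one-dimensional hyperparameter, a single RBF kernel, and a separate grad_noise term (all illustrative choices) to show how gradient observations enter the joint covariance and how extra noise on them damps unreliable gradients.

    import numpy as np

    def joint_gp_mean(X, y, dy, Xs, ls=1.0, f_noise=1e-6, grad_noise=1e-2):
        # Posterior mean of a GP conditioned on function values y at X and
        # (possibly noisy) gradient observations dy at the same points.
        X, y, dy, Xs = map(np.asarray, (X, y, dy, Xs))
        r = X[:, None] - X[None, :]                 # pairwise differences
        K = np.exp(-0.5 * r**2 / ls**2)             # RBF kernel k(x, x')
        Kfg = K * r / ls**2                         # cov(f(x), f'(x'))
        Kgg = K * (1.0 / ls**2 - r**2 / ls**4)      # cov(f'(x), f'(x'))
        n = len(X)
        Kj = np.block([[K + f_noise * np.eye(n), Kfg],
                       [Kfg.T, Kgg + grad_noise * np.eye(n)]])
        rs = Xs[:, None] - X[None, :]
        Ks = np.exp(-0.5 * rs**2 / ls**2)           # cov(f(x*), f(x))
        cross = np.hstack([Ks, Ks * rs / ls**2])    # plus cov(f(x*), f'(x))
        return cross @ np.linalg.solve(Kj, np.concatenate([y, dy]))

    # Toy check: quadratic objective with noisy gradients.
    xs = np.array([-2.0, 0.0, 2.0])
    grads = 2 * xs + 0.3 * np.random.default_rng(0).standard_normal(3)
    print(joint_gp_mean(xs, xs**2, grads, np.linspace(-3, 3, 7), ls=1.5))

    Raising grad_noise is one simple way to keep noisy gradients from dominating the fit, which is the role the abstract assigns to the multikernel construction.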

       

      Abstract: Current machine learning models involve large numbers of hyperparameters, and tuning them by hand is exhausting work, so hyperparameter optimization algorithms play an important role in machine learning applications. Among these, sequential model-based optimization (SMBO) and parallel SMBO are state-of-the-art hyperparameter optimization methods. However, (parallel) SMBO algorithms take neither the high-possibility range of the best hyperparameters nor hyperparameter gradients into consideration, although both can clearly accelerate traditional hyperparameter optimization algorithms. In this paper, we accelerate the traditional SMBO method and name our method AccSMBO. In AccSMBO, we build a novel gradient-based multikernel Gaussian process with good generalization ability, which reduces the influence of gradient noise on the SMBO algorithm. We also design a meta-acquisition function and a parallel resource-allocation plan that steer (parallel) SMBO toward the high-possibility range of the best hyperparameters. In theory, our method ensures that hyperparameter gradient information and the high-possibility range of the best hyperparameters are fully used. In experiments on the L2-norm-regularized logistic loss function over datasets of different scales, the small-scale dataset Pc4, the medium-scale dataset Rcv1, and the large-scale dataset Real-sim, our method exhibits the best performance compared with the state-of-the-art gradient-based algorithm HOAG and the state-of-the-art SMBO algorithm SMAC.
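      To make the meta-acquisition idea concrete, here is a minimal SMBO loop in the same spirit. It is a sketch, not the paper's algorithm: the GP uses a single RBF kernel and ignores gradients, meta_prior stands in for whatever range a meta-learning dataset would actually suggest, and the [-3, -1] interval over log10 of the regularization strength is invented for illustration.

    import numpy as np
    from scipy.stats import norm

    def rbf(a, b, ls=1.0):
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls**2)

    def gp_posterior(X, y, Xs, noise=1e-6):
        # Standard GP regression: posterior mean and stddev at test points Xs.
        K = rbf(X, X) + noise * np.eye(len(X))
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
        Ks = rbf(X, Xs)
        mu = Ks.T @ alpha
        v = np.linalg.solve(L, Ks)
        var = 1.0 - np.sum(v**2, axis=0)   # prior variance of RBF kernel is 1
        return mu, np.sqrt(np.maximum(var, 1e-12))

    def meta_prior(xs, lo=-3.0, hi=-1.0):
        # Hypothetical meta-learned weight: favor the assumed
        # high-possibility range [lo, hi] of the best hyperparameter.
        return np.where((xs >= lo) & (xs <= hi), 1.0, 0.2)

    def meta_ei(xs, X, y):
        # Expected improvement (minimization) reweighted by the meta prior.
        mu, sd = gp_posterior(np.asarray(X), np.asarray(y), xs)
        z = (min(y) - mu) / sd
        return ((min(y) - mu) * norm.cdf(z) + sd * norm.pdf(z)) * meta_prior(xs)

    def smbo(objective, n_iter=20, seed=0):
        rng = np.random.default_rng(seed)
        X = list(rng.uniform(-5.0, 1.0, size=2))   # log10 of the L2 strength
        y = [objective(x) for x in X]
        grid = np.linspace(-5.0, 1.0, 400)
        for _ in range(n_iter):
            x_next = grid[np.argmax(meta_ei(grid, X, y))]
            X.append(x_next)
            y.append(objective(x_next))
        return X[int(np.argmin(y))], min(y)

    # Toy objective standing in for validation loss vs. log10(lambda).
    print(smbo(lambda x: (x + 2.0) ** 2 + 0.05 * np.sin(5 * x)))

      A parallel variant in the abstract's spirit would evaluate a batch of the top-scoring grid points per iteration, which naturally sends more workers into the meta prior's range.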

       
