• China Top-Quality Science and Technology Journal
  • CCF Class-A Recommended Chinese Journal
  • Tier-1 (T1) High-Quality Science and Technology Journal in Computing
Li Gengsong, Liu Yi, Zheng Qibin, Li Xiang, Liu Kun, Qin Wei, Wang Qiang, Yang Changhong. Algorithm Selection Method Based on Multi-Objective Hybrid Ant Lion Optimizer[J]. Journal of Computer Research and Development, 2023, 60(7): 1533-1550. DOI: 10.7544/issn1000-1239.202220769

Algorithm Selection Method Based on Multi-Objective Hybrid Ant Lion Optimizer

Funds: This work was supported by the Science and Technology Innovation 2030 Major Project of China (2020AAA0104802), the National Natural Science Foundation of China (91948303), and the National Natural Science Foundation of China for Young Scientists (61802426).
More Information
  • Author Bio:

    Li Gengsong: born in 1999. Master. His main research interests include algorithm selection and big data.

    Liu Yi: born in 1990. PhD, assistant professor. His main research interests include robot operating systems, big data technologies, and evolutionary algorithms.

    Zheng Qibin: born in 1990. PhD, assistant professor. His main research interests include data engineering, data mining, and machine learning.

    Li Xiang: born in 1988. PhD, assistant professor. His main research interest is big data.

    Liu Kun: born in 1982. PhD, associate professor. His main research interest is big data.

    Qin Wei: born in 1983. Master, assistant professor. His main research interest is intelligent information system management.

    Wang Qiang: born in 1972. Master, associate professor. His main research interest is big data.

    Yang Changhong: born in 1967. Master, senior engineer. His main research interest is computer software.

  • Received Date: August 27, 2022
  • Revised Date: January 31, 2023
  • Available Online: April 17, 2023
  • Algorithm selection is the task of choosing, from a set of feasible algorithms, one that satisfies the requirements of a given problem. Meta-learning is a widely used approach to algorithm selection, and its key components are meta-features and meta-learners. However, existing methods struggle to fully exploit the complementarity of meta-features and the diversity of meta-learners, which limits further performance improvement. To address these problems, we propose a selective ensemble algorithm selection method based on a multi-objective hybrid ant lion optimizer (SAMO). SAMO formulates an algorithm selection model that takes the accuracy and the diversity of the ensemble meta-learner as its optimization objectives; the model introduces meta-feature selection and selective ensemble, choosing meta-features and heterogeneous meta-learners simultaneously to construct the ensemble meta-learner. A multi-objective hybrid ant lion optimizer is proposed to optimize this model: a discrete code selects meta-feature subsets while a continuous code constructs the ensemble meta-learner, and an enhanced walk strategy together with a preference-based elite selection mechanism improves optimization performance. We construct classification algorithm selection problems from 260 datasets, 150 meta-features, and 9 candidate algorithms, analyze the parameter sensitivity of the method, compare the multi-objective hybrid ant lion optimizer with four evolutionary algorithms, and compare SAMO with eight existing methods. The results verify the effectiveness and superiority of the proposed method.
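    The hybrid encoding and the two optimization objectives described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the solution sizes, the 0.5 threshold used to decode the continuous part into a learner subset, and the majority-vote/pairwise-disagreement objectives are all simplifying assumptions made for the example.

    ```python
    import numpy as np

    # Hypothetical sketch of SAMO's hybrid encoding: each candidate solution pairs
    # a discrete (binary) code that selects meta-features with a continuous code
    # whose entries, thresholded at 0.5, select meta-learners for the ensemble.

    rng = np.random.default_rng(0)

    N_META_FEATURES = 10   # toy scale; the paper uses 150 meta-features
    N_META_LEARNERS = 5    # toy scale; the paper uses heterogeneous meta-learners

    def decode(solution):
        """Split a hybrid solution into a meta-feature mask and a learner subset."""
        feature_mask = solution[:N_META_FEATURES].astype(bool)   # discrete part
        learner_weights = solution[N_META_FEATURES:]             # continuous part
        learner_subset = learner_weights > 0.5                   # selective ensemble
        return feature_mask, learner_subset

    def objectives(solution, learner_preds, labels):
        """Return (accuracy, diversity) of the ensemble encoded by `solution`.

        Accuracy is majority-vote accuracy; diversity is the mean pairwise
        disagreement among the selected meta-learners. Both are maximized.
        """
        _, subset = decode(solution)
        if subset.sum() == 0:                    # empty ensemble: worst case
            return 0.0, 0.0
        preds = learner_preds[subset]            # (k, n_samples) 0/1 predictions
        votes = (preds.mean(axis=0) > 0.5).astype(int)
        accuracy = float((votes == labels).mean())
        k = preds.shape[0]
        if k < 2:
            return accuracy, 0.0
        disagree = [(preds[i] != preds[j]).mean()
                    for i in range(k) for j in range(i + 1, k)]
        return accuracy, float(np.mean(disagree))

    # Toy meta-learner predictions over 20 problems (binary labels), each
    # learner agreeing with the true label about 80% of the time.
    labels = rng.integers(0, 2, size=20)
    learner_preds = np.where(rng.random((N_META_LEARNERS, 20)) < 0.8,
                             labels, 1 - labels)

    solution = np.concatenate([rng.integers(0, 2, N_META_FEATURES),
                               rng.random(N_META_LEARNERS)])
    acc, div = objectives(solution, learner_preds, labels)
    print(f"accuracy={acc:.2f}, diversity={div:.2f}")
    ```

    A multi-objective optimizer such as the paper's hybrid ant lion optimizer would evolve a population of such solutions toward the Pareto front of (accuracy, diversity); the sketch only shows how one candidate is decoded and evaluated.
    
    
    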

