    Li Kai, Huang Houkuan. Study of a Neural Network Ensemble Algorithm for Small Data Sets[J]. Journal of Computer Research and Development, 2006, 43(7): 1161-1166.

    Study of a Neural Network Ensemble Algorithm for Small Data Sets

    • Abstract: Neural network classifier ensembles for small data sets are studied, and an ensemble method suited to small data sets, Novel-NNE, is proposed. Diverse data are generated to increase the diversity of the individual networks in the ensemble and thereby improve the generalization performance of ensemble learning. Finally, different combination techniques are evaluated experimentally on UCI benchmark data sets. The results show that, within Novel-NNE, the relative majority vote and Bayes combination methods outperform the behavior knowledge space method.

       

      Abstract: Ensemble learning has become a hot topic in machine learning, as it can dramatically improve the generalization performance of a classifier. In this paper, neural network ensembles for small data sets are studied and an approach to neural network ensemble construction (Novel-NNE) is presented. To increase ensemble diversity, a diverse data set is generated as part of the training set in order to create diverse neural network classifiers. Moreover, different combination methods are studied for Novel-NNE. Experimental results show that Novel-NNE with either the relative majority vote or the Bayes combination method achieves higher predictive accuracy.
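      To make the general idea concrete, below is a minimal Python sketch of a neural network ensemble combined by relative majority (plurality) vote. It is not the authors' Novel-NNE implementation: the diverse training sets are produced here by bootstrap resampling plus small Gaussian feature noise, which merely stands in for the paper's diverse-data generation step, and scikit-learn's MLPClassifier plays the role of the individual networks.

      import numpy as np
      from sklearn.datasets import load_iris
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(0)
      X, y = load_iris(return_X_y=True)                     # a small UCI data set
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      n_members = 5
      members = []
      for seed in range(n_members):
          # Build a "diverse" training set: bootstrap sample plus small feature noise
          # (an assumption standing in for the paper's diverse-data generation step).
          idx = rng.integers(0, len(X_tr), size=len(X_tr))
          X_div = X_tr[idx] + rng.normal(scale=0.05, size=X_tr[idx].shape)
          clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=seed)
          clf.fit(X_div, y_tr[idx])
          members.append(clf)

      # Combine the individual predictions by relative majority (plurality) vote.
      votes = np.stack([m.predict(X_te) for m in members])  # shape: (n_members, n_test)
      ensemble_pred = np.array([np.bincount(col).argmax() for col in votes.T])
      print("ensemble accuracy:", (ensemble_pred == y_te).mean())

      The plurality vote above is only the simplest of the fusion rules compared in the paper; the Bayes and behavior knowledge space methods would instead combine the members' outputs using statistics estimated from their joint behavior on training data.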

       
