Abstract:
A multimodal particle swarm optimization algorithm is proposed for neural network ensemble learning. In each iteration, the particles are dynamically partitioned into several clusters, and the differences between the best particles of the clusters are measured in output space. When the best particles of two clusters differ little, those clusters are merged. The algorithm therefore ends with several clusters that differ substantially not only in network weight space but also in output space. Each cluster is responsible for training one component network, and the best particle in a cluster corresponds to that component network, so the number of component networks in the ensemble is determined automatically. The proposed method effectively controls the diversity of the networks. Experimental results show that, compared with evolutionary ensembles with negative correlation learning, bagging, and boosting, the proposed method greatly improves the generalization ability of the neural network ensemble.
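The cluster-merging step described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: `best_outputs[i]` is assumed to hold the output vector of cluster i's best particle on a validation set, and the Euclidean distance and fixed threshold are assumptions standing in for whatever output-space difference measure the algorithm actually uses.

```python
import numpy as np

def merge_similar_clusters(best_outputs, threshold):
    """Greedily merge clusters whose best particles produce nearly
    identical outputs in output space (hypothetical sketch).

    best_outputs -- list of output vectors, one per cluster's best particle
    threshold    -- clusters whose leaders are closer than this are merged
    Returns a list of merged clusters, each a list of original indices.
    """
    clusters = [[i] for i in range(len(best_outputs))]
    merged = True
    while merged:
        merged = False
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                # Output-space distance between the two clusters'
                # representative best particles (first original index).
                d = np.linalg.norm(best_outputs[clusters[a][0]]
                                   - best_outputs[clusters[b][0]])
                if d < threshold:
                    # Little difference in output space: combine clusters.
                    clusters[a].extend(clusters[b])
                    del clusters[b]
                    merged = True
                    break
            if merged:
                break
    return clusters
```

With three candidate networks, two of which produce nearly identical outputs, the sketch merges those two clusters and keeps the dissimilar one separate, so the surviving clusters (and hence the number of component networks) emerge automatically.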