Document Type: Conference Paper
Date of Publication: 2009-10-17
Indexed by: EI, CPCI-S, Scopus
Page Number: 2177-2181
Abstract: In this paper, a novel approach to neural network ensembles, called the Simple Ensemble of Extreme Learning Machine (SE-ELM), is proposed. It is proved theoretically in this study that the generalization ability of an ensemble is determined by the diversity of its components' output spaces. SE-ELM therefore treats the diversity of the components' output spaces as an objective during training. In the first phase, SE-ELM initializes each component with different input weights and analytically determines the output weights through a generalized inverse operation on the hidden-layer output matrices. The differences among the components' input weights force them into different output spaces, thereby increasing the diversity of the ensemble. Experiments carried out on four real-world problems show that SE-ELM not only runs much faster but also achieves better generalization performance than several classic ensemble algorithms.
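For readers unfamiliar with ELM training, the abstract's procedure (each component gets its own input weights, output weights are obtained analytically from a generalized inverse of the hidden-layer output matrix, and the component outputs are combined) can be sketched roughly as follows. This is only a minimal illustration under common ELM conventions, not the authors' implementation; the function names, the sigmoid activation, simple averaging as the combination rule, and all hyper-parameters are assumptions.

# Sketch: one ELM component per set of random input weights; output weights
# come from the Moore-Penrose pseudoinverse of the hidden-layer output matrix;
# the ensemble prediction averages the components. Illustrative only.
import numpy as np

def train_elm(X, T, n_hidden, rng):
    """Train one ELM component: random input weights, analytic output weights."""
    n_features = X.shape[1]
    W = rng.uniform(-1.0, 1.0, size=(n_features, n_hidden))  # random input weights
    b = rng.uniform(-1.0, 1.0, size=n_hidden)                # random hidden biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))                   # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T                             # output weights via pseudoinverse
    return W, b, beta

def predict_elm(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

def ensemble_predict(X, components):
    """Average the outputs of all ELM components (simple ensemble)."""
    return np.mean([predict_elm(X, *c) for c in components], axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))                # toy regression data
    T = np.sin(X.sum(axis=1)).reshape(-1, 1)
    # Different input weights per component yield different output spaces,
    # which is the source of ensemble diversity described in the abstract.
    components = [train_elm(X, T, n_hidden=30, rng=rng) for _ in range(10)]
    Y = ensemble_predict(X, components)
    print("training MSE:", float(np.mean((Y - T) ** 2)))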