Document Type:Journal Article
Date of Publication:2014-10-01
Journal:IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
Included Journals:SCIE, EI, Scopus; ESI Highly Cited Paper
Volume:25
Issue:10
Page Number:1828-1841
ISSN No.:2162-237X
Key Words:Extreme learning machine (ELM); parsimonious model selection; recursive orthogonal least squares (ROLS); sequential partial orthogonalization (SPO); single hidden-layer feedforward network (SLFN)
Abstract:Novel constructive and destructive parsimonious extreme learning machines (CP- and DP-ELM) are proposed in this paper. By virtue of the proposed ELMs, parsimonious structure and excellent generalization of multi-input multi-output single-hidden-layer feedforward networks (SLFNs) are obtained. The proposed ELMs are developed by an innovative decomposition of the recursive orthogonal least squares (ROLS) procedure into sequential partial orthogonalization (SPO). The salient features of the proposed approaches are as follows: 1) initial hidden nodes are randomly generated by the ELM methodology and recursively orthogonalized into an upper triangular matrix with a dramatic reduction in matrix size; 2) the constructive SPO in the CP-ELM focuses on the partial matrix whose first column is the subcolumn of nonzeros belonging to the selected regressor, while the destructive SPO in the DP-ELM operates on the partial matrix whose elements are determined by the removed regressor; 3) termination criteria for the CP- and DP-ELM are simplified by an additional residual-error-reduction method; and 4) the output weights of the SLFN need not be solved during the model selection procedure; they are derived from the final upper triangular equation by backward substitution. Both single- and multi-output real-world regression data sets are used to verify the effectiveness and superiority of the CP- and DP-ELM in terms of parsimonious architecture and generalization accuracy. Innovative applications to nonlinear time-series modeling demonstrate superior identification results.
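The abstract's pipeline can be illustrated with a minimal sketch: random hidden nodes in the ELM style, orthogonalization of the hidden-layer matrix into an upper triangular system (here via a one-shot QR factorization rather than the paper's recursive/SPO variant), and output weights recovered by backward substitution. The toy data, node count, and activation are assumptions for illustration only; this is not the authors' CP/DP-ELM selection procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-output regression data (sinc target), standing in for the
# paper's real-world benchmark sets (illustrative assumption).
X = rng.uniform(-5.0, 5.0, size=(200, 1))
T = np.sinc(X[:, 0:1])

# ELM methodology: hidden-layer weights and biases are drawn at random
# and never trained.
L = 20  # number of hidden nodes (arbitrary illustrative choice)
W = rng.standard_normal((X.shape[1], L))
b = rng.standard_normal(L)
H = np.tanh(X @ W + b)  # hidden-layer output matrix

# Orthogonalize H into an upper triangular system, as ROLS does
# incrementally: H = Q R with R upper triangular, so the output
# weights beta satisfy R beta = Q^T T.
Q, R = np.linalg.qr(H)
rhs = Q.T @ T

# Solve the final upper triangular equation by backward substitution.
beta = np.zeros((L, T.shape[1]))
for i in range(L - 1, -1, -1):
    beta[i] = (rhs[i] - R[i, i + 1:] @ beta[i + 1:]) / R[i, i]

pred = H @ beta
print("training RMSE:", float(np.sqrt(np.mean((pred - T) ** 2))))
```

Because the hidden layer is fixed and random, the entire fit reduces to one linear least-squares solve; the paper's contribution is selecting a parsimonious subset of the L regressors during this orthogonalization rather than keeping them all.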