Indexed by:Journal Article
Date of Publication:2016-09-26
Journal:NEUROCOMPUTING
Included Journals:SCIE, EI, Scopus
Volume:207
Page Number:322-334
ISSN No.:0925-2312
Key Words:Extreme learning machine; Online/incremental learning; Concept drift; Regularized optimization method
Abstract:The online sequential extreme learning machine (OS-ELM) is an online, incremental learning algorithm that can learn data one by one or chunk by chunk, with a fixed or varying chunk size, and it achieves the same learning performance as an ELM trained on the complete data set. However, in online learning environments the concepts to be learned may change over time, a phenomenon referred to as concept drift. To use ELMs in such non-stationary environments, this paper proposes a forgetting parameters extreme learning machine (FP-ELM). Like OS-ELM, FP-ELM supports incremental, online learning. In addition, when a new chunk of data arrives, FP-ELM assigns a forgetting parameter to the previous training data according to the current performance, allowing it to adapt to possible changes. A regularized optimization method is used to avoid estimator windup. Performance comparisons between FP-ELM and two frequently used ensemble approaches are carried out on several regression and classification problems with concept drift. The experimental results show that FP-ELM achieves comparable or better performance with lower training time. (C) 2016 Elsevier B.V. All rights reserved.
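As a rough illustration of the mechanism the abstract describes, the sketch below implements a generic chunk-by-chunk ELM update with a forgetting factor and ridge-style initialization of the covariance matrix. The class name, the tanh activation, the recursive least-squares form, and the fixed `forget` argument are illustrative assumptions, not the paper's exact FP-ELM update; in particular, FP-ELM chooses the forgetting parameter adaptively from the current performance rather than fixing it in advance.

```python
# Minimal sketch (assumed, not the authors' code): online ELM update with a
# forgetting factor and ridge-style initialization to limit estimator windup.
import numpy as np


class ForgettingOSELM:
    def __init__(self, n_inputs, n_hidden, n_outputs, reg=1e-3, seed=0):
        rng = np.random.default_rng(seed)
        # Random, fixed hidden-layer parameters (standard ELM).
        self.W = rng.standard_normal((n_inputs, n_hidden))
        self.b = rng.standard_normal(n_hidden)
        self.beta = np.zeros((n_hidden, n_outputs))  # output weights
        self.P = np.eye(n_hidden) / reg              # ridge-initialized inverse covariance

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)          # hidden-layer activations

    def partial_fit(self, X, T, forget=1.0):
        """Update on one chunk (X, T); forget in (0, 1] discounts older chunks."""
        H = self._hidden(X)
        # Block recursive least squares with exponential forgetting.
        S = forget * np.eye(H.shape[0]) + H @ self.P @ H.T
        K = self.P @ H.T @ np.linalg.inv(S)
        self.beta = self.beta + K @ (T - H @ self.beta)
        self.P = (self.P - K @ H @ self.P) / forget
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta


# Usage: stream chunks and discount older data when drift is suspected.
rng = np.random.default_rng(1)
model = ForgettingOSELM(n_inputs=3, n_hidden=40, n_outputs=1)
for _ in range(20):
    X = rng.standard_normal((32, 3))
    T = X @ np.array([[1.0], [-2.0], [0.5]])  # target mapping that could drift over time
    model.partial_fit(X, T, forget=0.95)
```

A smaller `forget` value down-weights old chunks more aggressively; setting it to 1.0 recovers the plain OS-ELM behaviour in which all past data count equally.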