Document Type:Journal article
Date of Publication:2012-09-01
Journal:NEURAL NETWORKS
Included Journals:SCIE, EI, PubMed, Scopus
Volume:33
Page Number:127-135
ISSN No.:0893-6080
Key Words:Weight decay; Backpropagation; Cyclic; Almost cyclic; Convergence
Abstract:The weight decay method, one of the classical complexity regularization methods, is simple and appears to work well in some applications of backpropagation neural networks (BPNN). This paper presents weak and strong convergence results for BPNN with a penalty term trained under cyclic and almost cyclic learning (CBP-P and ACBP-P). Convergence is guaranteed under relaxed conditions on the activation functions and the learning rate, together with an assumption on the stationary set of the error function. Furthermore, the boundedness of the weights during the training procedure is obtained in a simple and clear way. Numerical simulations support the theoretical results and demonstrate that ACBP-P outperforms CBP-P in both convergence speed and generalization ability. (C) 2012 Elsevier Ltd. All rights reserved.
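The abstract describes online backpropagation with a weight-decay penalty term under two sample orderings: cyclic (the same fixed order every epoch) and almost cyclic (a fresh random permutation every epoch). The sketch below is only an illustration of that setup, not the paper's algorithm or experiments: the one-hidden-layer sigmoid network, squared-error loss, learning rate, penalty coefficient, and toy XOR data are all assumptions chosen for a runnable example.

```python
# Minimal sketch of penalty-term (weight-decay) backpropagation with cyclic vs.
# almost-cyclic sample ordering. All hyperparameters and data are illustrative
# assumptions, not values from the paper.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_bpnn_penalty(X, y, hidden=8, lr=0.1, decay=1e-3, epochs=500,
                       almost_cyclic=False, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(scale=0.5, size=(d, hidden))   # input -> hidden weights
    W2 = rng.normal(scale=0.5, size=(hidden, 1))   # hidden -> output weights
    for _ in range(epochs):
        # Cyclic: identical order each epoch; almost cyclic: new permutation each epoch.
        order = rng.permutation(n) if almost_cyclic else np.arange(n)
        for i in order:
            x = X[i:i+1]                  # one training sample, shape (1, d)
            t = y[i:i+1]                  # its target, shape (1, 1)
            h = sigmoid(x @ W1)           # hidden-layer activations
            out = sigmoid(h @ W2)         # network output
            # Backpropagate the squared error; each gradient step also includes
            # the weight-decay term, i.e. the penalty (decay/2)*||W||^2 added
            # to the error function.
            delta_out = (out - t) * out * (1 - out)
            delta_hid = (delta_out @ W2.T) * h * (1 - h)
            W2 -= lr * (h.T @ delta_out + decay * W2)
            W1 -= lr * (x.T @ delta_hid + decay * W1)
    return W1, W2

if __name__ == "__main__":
    # Toy XOR-like data, used only to exercise the update rule.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)
    for flag, name in [(False, "cyclic (CBP-P-style)"),
                       (True, "almost cyclic (ACBP-P-style)")]:
        W1, W2 = train_bpnn_penalty(X, y, almost_cyclic=flag)
        preds = sigmoid(sigmoid(X @ W1) @ W2)
        mse = float(np.mean((preds - y) ** 2))
        wnorm = float(np.sqrt((W1 ** 2).sum() + (W2 ** 2).sum()))
        print(f"{name}: final MSE = {mse:.4f}, weight norm = {wnorm:.3f}")
```

The decay term in each update keeps the weight norms bounded during training, which is the boundedness property the abstract refers to; comparing the two orderings on real data (rather than this toy set) is what the paper's simulations do.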