Indexed by: Conference paper
Date of Publication: 2011-07-31
Included Journals: EI, CPCI-S, SCIE, Scopus
Page Number: 125-132
Abstract: The weight-decay method, a classical complexity-regularization technique, is simple and appears to work well in some applications of the multi-layer perceptron network (MPN). This paper presents weak and strong convergence results for cyclic and almost-cyclic learning of MPNs with a penalty term (weight decay). Convergence is guaranteed under relaxed conditions on the activation functions, the learning rate, and the stationary set of the error function. Furthermore, the boundedness of the weights during training is obtained in a simple and clear way.
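The abstract describes cyclic learning of a multi-layer perceptron with a weight-decay penalty. The following is a minimal illustrative sketch, not the paper's exact algorithm or proof setup: a one-hidden-layer network trained by cyclic gradient descent (samples visited in a fixed order each epoch) on the penalized error E(w) = (1/2)Σ(y − o)² + λ‖w‖². All names and parameter values (lam, eta, the toy data, the network size) are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy data: XOR-like targets.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
Y = np.array([0., 1., 1., 0.])

H = 4                              # hidden units (assumed size)
V = rng.normal(0, 0.5, (H, 2))     # input->hidden weights
w = rng.normal(0, 0.5, H)          # hidden->output weights
eta, lam = 0.5, 1e-4               # learning rate, penalty coefficient

def penalized_error():
    h = sigmoid(X @ V.T)
    o = sigmoid(h @ w)
    return 0.5 * np.sum((Y - o) ** 2) + lam * (np.sum(V**2) + np.sum(w**2))

e0 = penalized_error()
for epoch in range(2000):
    for j in range(len(X)):        # cyclic: fixed visiting order each epoch
        x, y = X[j], Y[j]
        h = sigmoid(V @ x)
        o = sigmoid(w @ h)
        delta_o = (o - y) * o * (1 - o)
        # Gradient of the penalized error; the 2*lam*(.) terms are the
        # weight-decay contribution that keeps the weights bounded.
        grad_w = delta_o * h + 2 * lam * w
        delta_h = delta_o * w * h * (1 - h)
        grad_V = np.outer(delta_h, x) + 2 * lam * V
        w -= eta * grad_w
        V -= eta * grad_V
e1 = penalized_error()
```

In this sketch the penalty's gradient, 2λw, pulls every weight toward zero at each step, which is the mechanism behind the boundedness of the weight sequence mentioned in the abstract; an almost-cyclic variant would instead shuffle the visiting order once per epoch.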