Journal Article
Authors: Wang, Jian; Yang, Jie; Wu, Wei
Reprint author: Wang, J, Dalian Univ Technol, Sch Math Sci, Dalian 116024, Peoples R China
Published: 2011-08-01
Journal: IEEE TRANSACTIONS ON NEURAL NETWORKS
Indexed in: SCIE, EI, PubMed
Document type: J (journal article)
Volume: 22
Issue: 8
Pages: 1297-1306
ISSN: 1045-9227
Keywords: Almost-cyclic; backpropagation; convergence; cyclic; feedforward neural networks; momentum
Abstract: Two backpropagation algorithms with momentum for feedforward neural networks with a single hidden layer are considered. It is assumed that the training samples are supplied to the network in a cyclic or an almost-cyclic fashion during the learning procedure, i.e., in each training cycle every sample of the training set is supplied to the network exactly once, in a fixed order or in a stochastic order, respectively. A restart strategy for the momentum is adopted: the momentum coefficient is set to zero at the beginning of each training cycle. Corresponding weak and strong convergence results are then proved, indicating that the gradient of the error function tends to zero and the weight sequence converges to a fixed point, respectively. The convergence conditions on the learning rate, the momentum coefficient, and the activation functions are much more relaxed than those of existing results.
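For intuition, the training scheme described in the abstract can be sketched in a few lines of code. The Python snippet below is a minimal illustration, not the paper's implementation: each sample is presented exactly once per cycle in a fixed (cyclic) order, each weight step takes the form -eta * gradient + tau * (previous step), and the momentum term is reset to zero at the start of every cycle (the restart strategy). The network size, toy data, and the values of eta and tau are assumed for demonstration only.

import numpy as np

rng = np.random.default_rng(0)

# Toy training set (assumed): 8 samples, 3 inputs, scalar targets.
X = rng.normal(size=(8, 3))
T = rng.normal(size=(8, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Single hidden layer with sigmoid activation, linear output.
W1 = rng.normal(scale=0.5, size=(3, 5))   # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(5, 1))   # hidden -> output weights

eta, tau = 0.1, 0.5   # learning rate and momentum coefficient (assumed values)

for cycle in range(100):
    # Restart strategy: the momentum contribution is zeroed
    # at the beginning of each training cycle.
    dW1 = np.zeros_like(W1)
    dW2 = np.zeros_like(W2)
    # Cyclic order: every sample is presented exactly once per cycle.
    # (An almost-cyclic variant would draw a fresh random permutation
    # of the indices here instead of using the fixed order.)
    for i in range(X.shape[0]):
        x, t = X[i:i+1], T[i:i+1]
        # Forward pass.
        h = sigmoid(x @ W1)
        y = h @ W2
        # Backward pass for the squared error 0.5 * (y - t)^2.
        e = y - t
        g2 = h.T @ e
        g1 = x.T @ ((e @ W2.T) * h * (1.0 - h))
        # Momentum update: new step = -eta * gradient + tau * previous step.
        dW2 = -eta * g2 + tau * dW2
        dW1 = -eta * g1 + tau * dW1
        W2 += dW2
        W1 += dW1

print("final mean squared error:", 0.5 * np.mean((sigmoid(X @ W1) @ W2 - T) ** 2))

In this sketch the gradient of the error should shrink over cycles (the weak convergence the abstract refers to), while the weights settle toward a fixed point (the strong convergence); the abstract's contribution is proving this under relaxed conditions on eta, tau, and the activation functions.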