
Modified gradient-based learning for local coupled feedforward neural networks with Gaussian basis function


Indexed by:Journal article

Date of Publication:2013-05-01

Journal:NEURAL COMPUTING & APPLICATIONS

Included Journals:SCIE, EI, Scopus

Volume:22

Issue:SUPPL.1

Page Number:S379-S394

ISSN No.:0941-0643

Key Words:Neural networks; LCFNNs; Convergence; Constant learning rate; Gaussian basis function

Abstract:Local coupled feedforward neural networks (LCFNNs) structurally alleviate the slow convergence and heavy computational cost of multi-layer perceptrons. This paper presents a modified gradient-based learning algorithm intended to further enhance the capabilities of LCFNNs, allowing an LCFNN to achieve good generalisation with higher learning efficiency. A theoretical analysis of the algorithm's convergence is provided, showing that the gradient of the error function decreases monotonically and tends to zero, and that the weight sequence converges to a minimum of the error function as the number of learning iterations grows. Conditions on a constant learning rate that guarantee convergence are also specified. The work is verified with numerical experimental results.
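To illustrate the general setting the abstract describes, the sketch below trains a single-hidden-layer network with a Gaussian basis activation by batch gradient descent using a constant learning rate. It is a minimal illustration only, not the paper's modified LCFNN algorithm: the local coupling structure, the specific update rule, and the stated convergence conditions are not reproduced here, and all function and variable names are assumptions.

```python
import numpy as np

# Minimal sketch (assumed setup, not the paper's LCFNN formulation):
# one hidden layer with a Gaussian basis activation, trained by plain
# batch gradient descent with a constant learning rate eta.

def gaussian(z):
    return np.exp(-z ** 2)            # Gaussian basis function

def gaussian_grad(z):
    return -2.0 * z * np.exp(-z ** 2) # derivative of the Gaussian basis

def train(X, y, n_hidden=10, eta=0.05, n_iter=2000, seed=0):
    """Batch gradient descent on squared error with constant learning rate eta."""
    rng = np.random.default_rng(seed)
    n_samples, n_in = X.shape
    W = rng.normal(scale=0.1, size=(n_in, n_hidden))  # input-to-hidden weights
    v = rng.normal(scale=0.1, size=n_hidden)          # hidden-to-output weights

    for _ in range(n_iter):
        Z = X @ W                     # hidden pre-activations
        H = gaussian(Z)               # hidden outputs
        err = H @ v - y               # residuals of the linear output unit

        # Gradients of the cost 0.5 * mean(err**2)
        grad_v = H.T @ err / n_samples
        grad_W = X.T @ ((err[:, None] * v) * gaussian_grad(Z)) / n_samples

        # Constant-learning-rate updates
        v -= eta * grad_v
        W -= eta * grad_W
    return W, v

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.uniform(-1, 1, size=(200, 2))
    y = np.sin(np.pi * X[:, 0]) * X[:, 1]   # toy regression target
    W, v = train(X, y)
    mse = np.mean((gaussian(X @ W) @ v - y) ** 2)
    print(f"training MSE: {mse:.4f}")
```

In this kind of setup, the learning rate eta is held constant throughout training; the paper's contribution is to specify conditions under which such a constant rate still guarantees convergence of the gradient and of the weight sequence.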
