
Modified gradient-based learning for local coupled feedforward neural networks with Gaussian basis function

Release Time: 2019-03-09

Indexed by: Journal Article

Date of Publication: 2013-05-01

Journal: NEURAL COMPUTING & APPLICATIONS

Included Journals: Scopus, EI, SCIE

Volume: 22

Issue: Suppl. 1

Page Number: S379-S394

ISSN: 0941-0643

Key Words: Neural networks; LCFNNs; Convergence; Constant learning rate; Gaussian basis function

Abstract: Local coupled feedforward neural networks (LCFNNs) structurally alleviate the slow convergence and heavy computational cost associated with multi-layer perceptrons. This paper presents a modified gradient-based learning algorithm intended to further enhance the capabilities of LCFNNs. With this approach, an LCFNN can achieve good generalisation with higher learning efficiency. A theoretical analysis of the algorithm's convergence is provided, showing that the gradient of the error function decreases monotonically and tends to zero, and that the weight sequence converges to a minimum of the given error function as the number of learning iterations increases. Conditions under which a constant learning rate guarantees convergence are also specified. The work is verified with numerical experimental results.
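To give a concrete flavour of gradient-based training with a Gaussian basis activation and a constant learning rate, the sketch below (Python/NumPy) trains a plain single-hidden-layer network by batch gradient descent on a squared-error function. It is an illustrative simplification only, not the paper's LCFNN algorithm: the local-coupling structure and the modified gradient terms are omitted, and the function names and hyperparameters (gaussian, train, eta, n_hidden) are assumptions made here for the example.

import numpy as np

# Minimal sketch: single-hidden-layer network with Gaussian basis activations,
# trained by batch gradient descent with a constant learning rate eta.
# NOT the paper's exact LCFNN update rule; the local-coupling structure is omitted.

rng = np.random.default_rng(0)

def gaussian(z):
    # Gaussian basis function used as the hidden-layer activation.
    return np.exp(-z ** 2)

def gaussian_grad(z):
    # Derivative of the Gaussian basis function.
    return -2.0 * z * np.exp(-z ** 2)

def train(X, y, n_hidden=8, eta=0.05, n_iter=2000):
    # Batch gradient descent on E(W, v) = (1 / 2n) * sum of squared errors.
    # eta is the constant learning rate; the paper derives conditions on such
    # a rate under which the error gradient tends to zero.
    n_samples, n_in = X.shape
    W = rng.normal(scale=0.5, size=(n_hidden, n_in))   # input-to-hidden weights
    v = rng.normal(scale=0.5, size=n_hidden)           # hidden-to-output weights

    for _ in range(n_iter):
        Z = X @ W.T                  # hidden pre-activations, (n_samples, n_hidden)
        H = gaussian(Z)              # hidden outputs
        out = H @ v                  # network output
        err = out - y                # residuals

        # Gradients of E with respect to v and W.
        grad_v = H.T @ err / n_samples
        grad_W = ((err[:, None] * v[None, :] * gaussian_grad(Z)).T @ X) / n_samples

        v -= eta * grad_v
        W -= eta * grad_W
    return W, v

# Toy usage: fit a smooth 1-D target.
X = np.linspace(-2.0, 2.0, 100).reshape(-1, 1)
y = np.sin(np.pi * X[:, 0])
W, v = train(X, y)
pred = gaussian(X @ W.T) @ v
print("final MSE:", float(np.mean((pred - y) ** 2)))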

Prev One: A Modified Gradient-Based Neuro-Fuzzy Learning Algorithm for Pi-Sigma Network Based on First-Order Takagi-Sugeno System

Next One: Matlab-Based Visual Teaching of the Intelligent Computing Course