
Convergence of gradient method with momentum for back-propagation neural networks


Indexed by: Journal Article

Date of Publication: 2008-07-01

Journal: JOURNAL OF COMPUTATIONAL MATHEMATICS

Included Journals: SCIE

Volume: 26

Issue: 4

Page Number: 613-623

ISSN: 0254-9409

Key Words: back-propagation (BP) neural networks; gradient method; momentum; convergence

Abstract: In this work, a gradient method with momentum for BP neural networks is considered. The momentum coefficient is chosen in an adaptive manner to accelerate and stabilize the learning procedure of the network weights. Corresponding convergence results are proved.
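The abstract refers to the standard momentum form of the weight update, in which the new increment is the negative gradient step plus a momentum coefficient times the previous increment, i.e. roughly w^{k+1} = w^k - eta * grad E(w^k) + tau_k * (w^k - w^{k-1}). The sketch below illustrates this kind of training loop for a small one-hidden-layer BP network. It is a minimal illustration only: the network, data, and in particular the heuristic rule used here to adapt the momentum coefficient (grow it while the loss decreases, shrink it otherwise) are assumptions for demonstration and are not the adaptive scheme analyzed in the paper.

```python
import numpy as np

# Minimal sketch: gradient method with momentum for a one-hidden-layer
# BP network on a toy regression task. The adaptive momentum rule below
# is an illustrative assumption, not the paper's scheme.

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(100, 2))      # inputs
y = np.sin(X[:, :1]) + 0.5 * X[:, 1:]          # targets

# network parameters: 2 -> 8 -> 1, sigmoid hidden layer
W1 = rng.normal(scale=0.5, size=(2, 8))
W2 = rng.normal(scale=0.5, size=(8, 1))
V1 = np.zeros_like(W1)                         # previous weight increments
V2 = np.zeros_like(W2)

lr, mom = 0.1, 0.5                             # learning rate, momentum coefficient
prev_loss = np.inf

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(500):
    # forward pass
    H = sigmoid(X @ W1)
    out = H @ W2
    err = out - y
    loss = 0.5 * np.mean(err ** 2)

    # back-propagation of the squared-error gradient
    g_out = err / len(X)
    g_W2 = H.T @ g_out
    g_H = (g_out @ W2.T) * H * (1.0 - H)
    g_W1 = X.T @ g_H

    # momentum update: new increment = mom * old increment - lr * gradient
    V1 = mom * V1 - lr * g_W1
    V2 = mom * V2 - lr * g_W2
    W1 += V1
    W2 += V2

    # heuristic adaptation of the momentum coefficient (assumption)
    mom = min(0.9, mom * 1.05) if loss < prev_loss else max(0.0, mom * 0.5)
    prev_loss = loss

print(f"final loss: {loss:.5f}, final momentum coefficient: {mom:.3f}")
```

Adapting the momentum coefficient rather than fixing it is what the abstract means by choosing it "in an adaptive manner": the coefficient is tied to the progress of training so that the momentum term accelerates learning when updates are consistent and is damped when it would destabilize the iteration.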
