
Convergence of Batch Split-Complex Backpropagation Algorithm for Complex-Valued Neural Networks


Indexed by:Journal Article

Date of Publication:2009-01-01

Journal:DISCRETE DYNAMICS IN NATURE AND SOCIETY

Included Journals:SCIE, Scopus

Volume:2009

ISSN No.:1026-0226

Abstract:The batch split-complex backpropagation (BSCBP) algorithm for training complex-valued neural networks is considered. For a constant learning rate, it is proved that the error function of the BSCBP algorithm is monotonically decreasing during the training iteration process and that the gradient of the error function tends to zero. Under an additional mild condition, the weight sequence itself is also proved to be convergent. A numerical example is given to support the theoretical analysis. Copyright (C) 2009 Huisheng Zhang et al.
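For readers unfamiliar with the split-complex formulation, the sketch below shows one batch gradient step with a constant learning rate for a single complex-valued neuron. It is a minimal illustration, not the paper's exact network or notation: it assumes a split activation f(u + iv) = tanh(u) + i*tanh(v) and the squared-error cost, and the names eta, w, Z, and D are chosen here for illustration.

```python
# Minimal sketch (assumptions noted above): one batch split-complex BP update
# for a single complex-valued neuron with a tanh split activation.
import numpy as np

def bscbp_step(w, Z, D, eta=0.1):
    """One batch gradient step on E(w) = 1/2 * sum_j |f(w^T z_j) - d_j|^2.

    w : complex weight vector, shape (M,)
    Z : complex batch inputs, shape (J, M)
    D : complex targets, shape (J,)
    """
    U = Z @ w                                    # complex net inputs, shape (J,)
    O = np.tanh(U.real) + 1j * np.tanh(U.imag)   # split-complex activation
    err = O - D                                  # complex output errors
    # The real and imaginary channels are differentiated separately and then
    # recombined into gradients for the real and imaginary weight parts.
    dR = err.real * (1.0 - np.tanh(U.real) ** 2)   # dE/du_j (real channel)
    dI = err.imag * (1.0 - np.tanh(U.imag) ** 2)   # dE/dv_j (imaginary channel)
    grad_re = Z.real.T @ dR + Z.imag.T @ dI        # gradient w.r.t. Re(w)
    grad_im = Z.real.T @ dI - Z.imag.T @ dR        # gradient w.r.t. Im(w)
    E = 0.5 * np.sum(np.abs(err) ** 2)             # batch error before the step
    return w - eta * (grad_re + 1j * grad_im), E   # constant learning rate eta
```

In this setting, the paper's result corresponds to the returned error E being non-increasing over successive calls and the gradient (grad_re, grad_im) tending to zero as training proceeds.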

Pre One:Locating Mathematical Formulas in English Scientific Document Recognition

Next One:A comment on "Relaxed conditions for radial-basis function networks to be universal approximators"