Yang Jie (杨洁)

Personal Information

Associate Professor

Doctoral Supervisor

Master's Supervisor

Gender: Female

Alma Mater: Dalian University of Technology

Degree: Doctorate

Affiliation: School of Mathematical Sciences

Discipline: Computational Mathematics

Office: Room 505, School of Mathematical Sciences, Dalian University of Technology

Contact: 0411-84708351-8205

Email: yangjiee@dlut.edu.cn

Publications

Negative effects of sufficiently small initial weights on back-propagation neural networks

Paper Type: Journal Article

Publication Date: 2012-08-01

Journal: JOURNAL OF ZHEJIANG UNIVERSITY-SCIENCE C-COMPUTERS & ELECTRONICS

Indexed by: SCIE, EI

Volume: 13

Issue: 8

Pages: 585-592

ISSN: 1869-1951

Keywords: Neural networks; Back-propagation; Gradient learning method; Convergence

Abstract: In the training of feedforward neural networks, it is usually suggested that the initial weights should be small in magnitude in order to prevent premature saturation. The aim of this paper is to point out the other side of the story: in some cases, the gradient of the error function is zero not only for infinitely large weights but also for zero weights. Slow convergence at the beginning of the training procedure is often the result of sufficiently small initial weights. Therefore, we suggest that, in these cases, the initial values of the weights should be neither too large nor too small. For instance, a typical range of choices for the initial weights might be something like (-0.4, -0.1) ∪ (0.1, 0.4), rather than (-0.1, 0.1) as suggested by the usual strategy. Our theory that medium-sized weights should be used has also been extended to a few commonly used transfer functions and error functions. Numerical experiments are carried out to support our theoretical findings.
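The recommended range lends itself to a direct sampler. Below is a minimal NumPy sketch, not taken from the paper itself; the helper name init_medium_weights and the layer shape are illustrative assumptions. It draws each weight uniformly from (-0.4, -0.1) ∪ (0.1, 0.4) by sampling a magnitude in (0.1, 0.4) and attaching a random sign, so no weight starts near zero (which the abstract identifies as a cause of slow early convergence) and none starts large enough to risk premature saturation.

import numpy as np

def init_medium_weights(shape, low=0.1, high=0.4, rng=None):
    # Sample initial weights uniformly from (-high, -low) U (low, high),
    # the "medium-sized" range suggested by the abstract. Hypothetical helper.
    rng = np.random.default_rng() if rng is None else rng
    magnitude = rng.uniform(low, high, size=shape)        # |w| in (low, high)
    sign = rng.choice(np.array([-1.0, 1.0]), size=shape)  # random sign per weight
    return sign * magnitude

# Usage: initialize one layer of a back-propagation network
# with 10 inputs and 5 hidden units (sizes chosen for illustration).
W = init_medium_weights((5, 10))

Sampling the magnitude and sign separately is one simple way to realize a uniform distribution over the union of the two intervals; any scheme that keeps every initial weight's magnitude inside (0.1, 0.4) would serve the same purpose.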