吴微

Personal Information

Professor

Doctoral Supervisor

Master's Supervisor

Gender: Male

Alma Mater: Mathematical Institute, University of Oxford, UK

Degree: Ph.D.

Affiliation: School of Mathematical Sciences

Discipline: Computational Mathematics

Email: wuweiw@dlut.edu.cn


Publications


Convergence of batch gradient learning with smoothing regularization and adaptive momentum for neural networks


Paper Type: Journal Article

Date Published: 2016-03-08

Journal: SpringerPlus

Indexed In: SCIE, PubMed, Scopus

Volume: 5

Issue: 1

Page Range: 295

ISSN: 2193-1801

Keywords: Feedforward neural networks; Adaptive momentum; Smoothing L-1/2 regularization; Convergence

Abstract: This paper presents new theoretical results on the backpropagation algorithm with smoothing L-1/2 regularization and adaptive momentum for feedforward neural networks with a single hidden layer: we show that the gradient of the error function goes to zero and the weight sequence converges to a fixed point as the iteration step n tends to infinity. Our results are more general than earlier ones, since we do not require the error function to be quadratic or uniformly convex, and the conditions on the neuronal activation functions are relaxed. Moreover, compared with existing algorithms, the proposed algorithm yields a sparser network structure: it drives weights toward zero during training so that they can eventually be removed, which simplifies the network and reduces computation time. Finally, two numerical experiments are presented to illustrate the main results in detail.
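To illustrate the ingredients the abstract describes, here is a minimal sketch of batch gradient training on a single-hidden-layer network with a smoothed L-1/2 penalty and an adaptive momentum term. The smoothing polynomial and the momentum-adaptation rule below are illustrative assumptions, not the paper's exact constructions; the point is only that the |w|^(1/2) penalty is made differentiable near zero so batch gradient descent applies.

```python
import numpy as np

rng = np.random.default_rng(0)

def smooth_l_half(w, a=0.1):
    # |w|^(1/2) for |w| >= a; a quadratic patch on [-a, a] makes the
    # penalty differentiable at 0 (a hypothetical smoothing choice --
    # the paper's exact smoothing function may differ).
    aw = np.abs(w)
    c2 = 1.0 / (4.0 * a ** 1.5)          # match value and slope at |w| = a
    c0 = 0.75 * np.sqrt(a)
    return np.where(aw < a, c0 + c2 * aw ** 2, np.sqrt(np.maximum(aw, a)))

def smooth_l_half_grad(w, a=0.1):
    aw = np.abs(w)
    c2 = 1.0 / (4.0 * a ** 1.5)
    inner = 2.0 * c2 * w                  # derivative of the quadratic patch
    outer = np.sign(w) * 0.5 / np.sqrt(np.maximum(aw, a))
    return np.where(aw < a, inner, outer)

# Tiny single-hidden-layer network trained by batch gradient descent.
X = rng.normal(size=(20, 3))
y = np.sin(X.sum(axis=1, keepdims=True))
W1 = 0.5 * rng.normal(size=(3, 5))
W2 = 0.5 * rng.normal(size=(5, 1))
eta, lam, alpha = 0.05, 1e-3, 0.9        # step size, penalty weight, momentum

def loss(W1, W2):
    H = np.tanh(X @ W1)
    err = 0.5 * np.mean((H @ W2 - y) ** 2)
    return err + lam * (smooth_l_half(W1).sum() + smooth_l_half(W2).sum())

prev = [np.zeros_like(W1), np.zeros_like(W2)]
l0 = loss(W1, W2)
losses = []
for _ in range(300):
    H = np.tanh(X @ W1)
    d_out = (H @ W2 - y) / len(X)        # gradient of the mean-squared error
    gW2 = H.T @ d_out + lam * smooth_l_half_grad(W2)
    dH = (d_out @ W2.T) * (1.0 - H ** 2)
    gW1 = X.T @ dH + lam * smooth_l_half_grad(W1)
    updated = []
    for W, g, d in ((W1, gW1, prev[0]), (W2, gW2, prev[1])):
        mom = alpha * d
        # "Adaptive" momentum: keep the momentum term only when it does
        # not oppose the descent direction (a simple stand-in rule).
        if np.sum(mom * g) > 0:
            mom = np.zeros_like(mom)
        step = -eta * g + mom
        updated.append((W + step, step))
    (W1, prev[0]), (W2, prev[1]) = updated
    losses.append(loss(W1, W2))
```

Because the penalty grows steeply near zero, small weights are pushed toward zero during training, which is the mechanism behind the sparser network structure claimed in the abstract.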