李正学

Personal Information

Associate Professor

Master's Student Supervisor

Gender: Male

Alma Mater: Jilin University

Degree: Doctorate (Ph.D.)

Affiliation: School of Mathematical Sciences

Email: lizx@dlut.edu.cn


Publications


Convergence of Gradient Descent Algorithm for Diagonal Recurrent Neural Networks


Paper Type: Conference Paper

Date Published: 2007-09-14

Indexed by: EI, CPCI-S, Scopus

Pages: 29-31

Abstract: Recurrent neural networks have been used for the analysis and prediction of time series. This paper is concerned with the convergence of the gradient descent algorithm for training diagonal recurrent neural networks. Existing convergence results consider the online gradient training algorithm under the assumption that a very large number of (in theory, infinitely many) training samples of the time series are available; accordingly, stochastic process theory is used to establish convergence results of a probabilistic nature. In this paper, we consider the case where only a small number of training samples of the time series are available, so that a stochastic treatment of the problem is no longer appropriate. Instead, we use the offline gradient descent algorithm to train the diagonal recurrent neural network, and we prove corresponding convergence results of a deterministic nature. The monotonicity of the error function during the iteration is also guaranteed.
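The offline scheme the abstract describes can be sketched as follows: run the diagonal recurrent network over the whole (small) training series, accumulate the exact batch gradient, and update the weights once per sweep. This is a minimal illustration only, not the paper's implementation; the network size, tanh activation, toy series, initialization, and learning rate are all assumptions made for the sketch. Because the recurrence is diagonal, the sensitivity of each hidden state to its own recurrent weight can be propagated as a per-unit scalar.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting mirroring the paper's small-sample, offline regime:
# a short time series with one-step-ahead prediction targets (assumed data).
T = 20
x = np.sin(0.3 * np.arange(T))        # input sequence x(t)
d = np.sin(0.3 * (np.arange(T) + 1))  # desired outputs d(t)

H = 4                                  # number of hidden units (assumed)
v = 0.1 * rng.standard_normal(H)       # input weights
r = 0.1 * rng.standard_normal(H)       # diagonal self-recurrent weights
w = 0.1 * rng.standard_normal(H)       # output weights

def batch_error_and_grads(v, r, w):
    """One pass over the whole series: total squared error and the exact
    batch gradient via recursive sensitivities, which remain per-unit
    scalars because the recurrent weight matrix is diagonal."""
    s = np.zeros(H)        # hidden state s(t-1)
    ds_dr = np.zeros(H)    # d s_j(t-1) / d r_j
    ds_dv = np.zeros(H)    # d s_j(t-1) / d v_j
    gv = np.zeros(H); gr = np.zeros(H); gw = np.zeros(H)
    E = 0.0
    for t in range(T):
        s_new = np.tanh(r * s + v * x[t])
        fp = 1.0 - s_new ** 2            # tanh derivative
        ds_dr = fp * (s + r * ds_dr)     # sensitivity recursions
        ds_dv = fp * (x[t] + r * ds_dv)
        s = s_new
        e = w @ s - d[t]                 # output error at time t
        E += 0.5 * e * e
        gw += e * s
        gr += e * w * ds_dr
        gv += e * w * ds_dv
    return E, gv, gr, gw

# Offline gradient descent: one weight update per full sweep of the data.
eta = 0.02
errors = []
for _ in range(300):
    E, gv, gr, gw = batch_error_and_grads(v, r, w)
    errors.append(E)
    v -= eta * gv
    r -= eta * gr
    w -= eta * gw
```

With a sufficiently small learning rate, the batch error is expected to decrease across sweeps, which is the kind of deterministic, monotone behaviour the paper establishes for this offline setting.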