Personal Information
Associate Professor
Doctoral Supervisor
Master's Supervisor
Gender: Female
Alma Mater: Dalian University of Technology
Degree: Doctorate
Affiliation: School of Mathematical Sciences
Discipline: Computational Mathematics
Office: Room B1405, Innovation Park Building, Dalian University of Technology
Contact: 0411-84708351-8205
Email: yangjiee@dlut.edu.cn
L_(1/2) Regularization Methods for Weight Sparsification of Neural Networks
Date of Publication: 2015-01-01
Journal: Scientia Sinica Mathematica (中国科学: 数学)
Affiliated Unit: School of Mathematical Sciences
Volume: 45
Issue: 9
Pages: 1487-1504
ISSN: 1674-7216
Abstract: On the premise of appropriate learning accuracy, the number of
neurons in a neural network should be as small as possible
(constructional sparsification), so as to reduce the cost and to
improve the robustness and the generalization accuracy. We study the
constructional sparsification of feedforward neural networks by using
regularization methods. Apart from the traditional L1 regularization for
sparsification, we mainly use L_(1/2) regularization. To remove the
oscillations in the iteration process caused by the nonsmoothness of the
L_(1/2) regularizer, we propose to smooth it in a neighborhood of the
nonsmooth point, obtaining a smoothing L_(1/2) regularizer. By doing so, we
expect to improve the efficiency of the L_(1/2) regularizer so that it
surpasses the L1 regularizer. Some of our recent works in this respect are
summarized in this paper, including work on BP feedforward neural
networks, higher-order neural networks, double parallel neural networks,
and Takagi-Sugeno fuzzy models.
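To make the smoothing idea concrete, below is a minimal NumPy sketch of a smoothing L_(1/2) weight penalty. It assumes a C^2 piecewise-polynomial smoothing of |w| on a small interval (-a, a); the smoothing radius a, the regularization strength lam, and all function names are hypothetical illustrations rather than the exact construction used in the paper.

import numpy as np

def smoothed_abs(w, a=0.1):
    """Smooth |w| on (-a, a) with a C^2 quartic polynomial; equals |w| elsewhere."""
    w = np.asarray(w, dtype=float)
    inner = -w**4 / (8 * a**3) + 3 * w**2 / (4 * a) + 3 * a / 8
    return np.where(np.abs(w) >= a, np.abs(w), inner)

def smoothing_l_half_penalty(weights, a=0.1):
    """Smoothing L_(1/2) regularizer: sum of smoothed_abs(w)**0.5 over all weights."""
    return np.sum(smoothed_abs(weights, a) ** 0.5)

def smoothing_l_half_grad(weights, a=0.1):
    """Gradient of the penalty, to be added to the backpropagation gradient."""
    w = np.asarray(weights, dtype=float)
    f = smoothed_abs(w, a)
    # Derivative of smoothed_abs: sign(w) outside (-a, a), the polynomial's
    # derivative inside; the smoothed value f is strictly positive, so the
    # power f**(-0.5) is well defined at w = 0 (unlike the raw L_(1/2) term).
    df = np.where(np.abs(w) >= a, np.sign(w), -w**3 / (2 * a**3) + 3 * w / (2 * a))
    return 0.5 * f ** (-0.5) * df

# Usage sketch: add lam * penalty to the training loss and lam * grad to the
# weight-update rule; small weights are driven toward zero and can be pruned,
# which yields the constructional sparsification described in the abstract.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.5, size=20)   # hypothetical weight vector
    lam = 1e-3                           # hypothetical regularization strength
    print("penalty:", lam * smoothing_l_half_penalty(w))
    print("gradient norm:", np.linalg.norm(lam * smoothing_l_half_grad(w)))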