Wu Wei (吴微)

Personal Information

Professor

Doctoral Supervisor

Master's Supervisor

Gender: Male

Alma Mater: Mathematical Institute, University of Oxford, UK

Degree: Ph.D.

Affiliation: School of Mathematical Sciences

Discipline: Computational Mathematics

Email: wuweiw@dlut.edu.cn


Publications


Smooth group L-1/2 regularization for input layer of feedforward neural networks


Paper Type: Journal Article

Publication Date: 2018-11-07

Journal: NEUROCOMPUTING

Indexed In: SCIE, Scopus

Volume: 314

Pages: 109-119

ISSN: 0925-2312

Keywords: Feedforward neural network; Input layer compression; Feature selection; Smooth group L-1/2 regularization; Convergence

Abstract: A smooth group regularization method is proposed to identify and remove redundant input nodes of feedforward neural networks, or equivalently, redundant dimensions of the input data of a given data set. This is achieved by introducing a smooth group L-1/2 regularizer, defined with respect to the input nodes, into the error function so as to drive some weight vectors of the input nodes to zero. The main advantage of the method is that it removes not only the redundant nodes but also some redundant weights of the surviving nodes. By comparison, L-1 regularization (Lasso) is designed mainly to remove redundant weights and is not well suited to removing redundant nodes, while group Lasso can remove redundant nodes but cannot remove individual weights of the surviving nodes. A further advantage of the proposed method is that it replaces the non-smooth absolute value function in the common L-1/2 regularizer with a smooth function, which reduces the oscillation caused by the non-smoothness and makes it possible to prove convergence properties of the proposed training algorithm. Numerical simulations are performed to illustrate the efficiency of the algorithm.
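To make the penalty concrete, below is a minimal PyTorch sketch of a smooth group L-1/2 term applied to the input layer of a feedforward network. The particular piecewise-polynomial smoothing, the threshold a, the weight lam, and the helper names smooth_abs and smooth_group_l_half are illustrative assumptions rather than the paper's exact formulation; the sketch only shows the general idea of grouping each input node's outgoing weights and penalizing the square root of a smoothed group norm.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def smooth_abs(x, a=0.1):
    # A C^2 piecewise-polynomial approximation of |x| that equals |x|
    # for |x| >= a (an assumed, commonly used smoothing; the paper's
    # exact smoothing function may differ).
    poly = -x**4 / (8 * a**3) + 3 * x**2 / (4 * a) + 3 * a / 8
    return torch.where(x.abs() >= a, x.abs(), poly)

def smooth_group_l_half(weight, lam=1e-3, a=0.1):
    # weight has shape (hidden_dim, input_dim); column j collects all
    # outgoing weights of input node j, so driving an entire column to
    # zero prunes that input feature.
    group_norms = weight.norm(dim=0)  # ||w_j||_2 for each input node j
    return lam * torch.sqrt(smooth_abs(group_norms, a)).sum()

# Usage: add the penalty to the data-fitting error before backprop.
net = nn.Sequential(nn.Linear(20, 50), nn.Sigmoid(), nn.Linear(50, 1))
x, y = torch.randn(128, 20), torch.randn(128, 1)
loss = F.mse_loss(net(x), y) + smooth_group_l_half(net[0].weight)
loss.backward()
```

Because smooth_abs is bounded below by 3a/8 > 0 near the origin, the square root in the penalty stays differentiable everywhere, which is the property the abstract relies on to avoid oscillation and to establish convergence of the training algorithm.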