Xu Min (徐敏)

Personal Information

Associate Professor

Supervisor of Master's Students

Gender: Male

Alma Mater: Dalian University of Technology

Degree: Doctorate

Affiliation: School of Mathematical Sciences

Disciplines: Computational Mathematics; Applied Mathematics

E-mail: wolf_hsu@dlut.edu.cn

Publications

Faster convergence of a randomized coordinate descent method for linearly constrained optimization problems

Paper Type: Journal Article

Date of Publication: 2018-09-01

Journal: ANALYSIS AND APPLICATIONS

Indexed by: SCIE

Volume: 16

Issue: 5

Page Range: 741-755

ISSN: 0219-5305

Keywords: Large-scale optimization; randomized block-coordinate descent; linearly coupled constraints; convergence rate

Abstract: The problem of minimizing a separable convex function under linearly coupled constraints arises in various application domains, such as economic systems, distributed control, and network flow. The main challenge in solving this problem is that the data are so large that standard gradient-based methods become infeasible. Recently, Necoara, Nesterov and Glineur [Random block coordinate descent methods for linearly constrained optimization over networks, J. Optim. Theory Appl. 173(1) (2017) 227-254] proposed an efficient randomized coordinate descent method for this type of optimization problem and presented an appealing convergence analysis. In this paper, we develop new techniques for analyzing the convergence of such algorithms, which substantially improve the results of that work. The refined result is obtained by extending Nesterov's second technique [Efficiency of coordinate descent methods on huge-scale optimization problems, SIAM J. Optim. 22 (2012) 341-362] to general optimization problems with linearly coupled constraints. A novel ingredient of our analysis is the construction of basis vectors for the subspace determined by the linear constraints.
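As a rough illustration of the class of methods the abstract describes (not the paper's algorithm verbatim), consider the simplest linearly coupled instance, min Σ_i f_i(x_i) subject to Σ_i x_i = const. A randomized 2-coordinate descent step picks a coordinate pair (i, j) at random and moves along e_i − e_j, so every iterate stays feasible without a projection step. The Python sketch below is a minimal, hypothetical rendering of this idea; the function name, the uniform pair sampling, the pairwise step size, and the toy quadratic objective are illustrative assumptions:

```python
import numpy as np

def random_pair_coordinate_descent(grad, lip, x0, iters=20000, seed=0):
    """Hypothetical sketch: randomized 2-coordinate descent for
        min  sum_i f_i(x_i)   subject to   sum_i x_i = sum_i x0_i.
    grad(i, xi) returns f_i'(xi); lip[i] is a Lipschitz constant of f_i'.
    Moving along e_i - e_j keeps every iterate feasible."""
    x = np.asarray(x0, dtype=float).copy()
    rng = np.random.default_rng(seed)
    n = x.size
    for _ in range(iters):
        i, j = rng.choice(n, size=2, replace=False)  # random coordinate pair
        # gradient step restricted to the pair, as in Necoara-type methods
        step = (grad(i, x[i]) - grad(j, x[j])) / (lip[i] + lip[j])
        x[i] -= step  # the two changes cancel, so sum(x) is unchanged
        x[j] += step
    return x

# Toy check: min sum_i (x_i - c_i)^2 / 2  s.t.  sum_i x_i = 0.
# The minimizer is the projection of c onto the hyperplane, i.e. c - mean(c).
c = np.array([3.0, -1.0, 2.0, 0.0])
x_star = random_pair_coordinate_descent(lambda i, xi: xi - c[i],
                                        np.ones(4), np.zeros(4))
print(x_star)  # approximately [2., -2., 1., -1.]
```

Because each update direction lies in the null space of the coupling constraint, feasibility is maintained by construction; this structural property is what the convergence analyses of such randomized coordinate descent methods exploit.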