Personal Information
Professor
Doctoral supervisor
Master's supervisor
Gender: Male
Alma mater: Dalian University of Technology
Degree: Ph.D.
Affiliation: School of Mathematical Sciences
Office: Room 312, School of Mathematical Sciences
Phone: 0411-84708351-8312
Email: xtxiao@dlut.edu.cn
Course Information: Foundations of Nonlinear Optimization (2018)
Posted: 2020-10-15
Exam: May 29, 10:00-11:40. Closed-book for graduate students; open-book for undergraduates.
References
(1) Convex Optimization: Algorithms and Complexity
(2) Statistical machine learning and convex optimization
(3) Large-scale Machine Learning and Optimization
(4) Convex Optimization Algorithms
Grading: class attendance 20% + course report 30% + final exam 50%
Course Report Guidelines
Each group (1-2 people, self-organized) picks one paper from the list below and gives a 15-minute presentation on its content (slides must be made with LaTeX Beamer). Every group member must take part in reading the paper, preparing the slides, and presenting.
Please send your group members and paper choice to my email as soon as possible: xtxiao@dlut.edu.cn
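Since the slides are required to be made with LaTeX Beamer, a minimal slide skeleton may help groups get started. This is only a sketch: the theme, title, and frame contents are placeholders to be replaced with your paper's material.

```latex
\documentclass{beamer}
\usetheme{Madrid} % any standard Beamer theme works

\title{Paper Title Here} % replace with the chosen paper
\author{Group Member 1 \and Group Member 2}
\institute{School of Mathematical Sciences, DUT}
\date{\today}

\begin{document}

\begin{frame}
  \titlepage
\end{frame}

\begin{frame}{Problem Setup}
  % State the optimization problem the paper studies, e.g. a finite-sum model:
  \[
    \min_{x \in \mathbb{R}^n} \; f(x) = \frac{1}{m}\sum_{i=1}^{m} f_i(x)
  \]
\end{frame}

\begin{frame}{Main Results}
  \begin{itemize}
    \item Algorithm description
    \item Convergence rate and complexity guarantees
    \item Numerical experiments
  \end{itemize}
\end{frame}

\end{document}
```

Compile with `pdflatex`; a 15-minute talk typically fits in roughly 10-15 frames.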
Course Report Schedule
May 11 (Friday), periods 7-8: papers (1)-(5)
May 15 (Tuesday), periods 3-4: papers (6)-(10)
M. Pilanci, M. Wainwright. Newton sketch: A near linear-time optimization algorithm with linear-quadratic convergence (10)
M. Gurbuzbalaban, A. Ozdaglar, P. Parrilo. On the convergence rate of incremental aggregated gradient algorithms (1)
P. Richtarik, M. Takac. Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
S. Ghadimi, G. Lan. Accelerated gradient methods for nonconvex nonlinear and stochastic programming (2)
E. Hazan, S. Kale. Beyond the regret minimization barrier: Optimal algorithms for stochastic strongly-convex optimization
S. Shalev-Shwartz, T. Zhang. Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
Y. Arjevani, S. Shalev-Shwartz, O. Shamir. On lower and upper bounds in smooth and strongly convex optimization (3)
S. Shalev-Shwartz, T. Zhang. Stochastic dual coordinate ascent methods for regularized loss minimization
M. Schmidt, N. Le Roux, F. Bach. Minimizing finite sums with the stochastic average gradient (4)
O. Fercoq, P. Richtarik. Optimization in high dimensions via accelerated, parallel and proximal coordinate descent (5)
R. Gower, P. Richtarik. Randomized iterative methods for linear systems (7)
A. Mokhtari, A. Ribeiro. RES: Regularized stochastic BFGS algorithm (9)
M. Friedlander, M. Schmidt. Hybrid deterministic-stochastic methods for data fitting (6)
Y. Xu, W. Yin. Block stochastic gradient iteration for convex and nonconvex optimization (8)