
    Yi Xu

    • Professor     Doctoral Supervisor   Master's Supervisor
    • Gender: Male
    • Alma Mater: The University of Iowa
    • Degree: Ph.D.
    • Affiliation: School of Control Science and Engineering
    • Office: Haishan Building B1613
    • Email: yxu@dlut.edu.cn


    Biography

    Yi Xu, Ph.D., is a Professor and doctoral supervisor, and a recipient of a national-level young talent program. He received his Ph.D. in Computer Science from the Department of Computer Science at The University of Iowa, advised by Prof. Tianbao Yang and Prof. Qihang Lin. He received his master's degree from South Dakota State University and his bachelor's degree from the Department of Mathematics at Zhejiang University. Before joining Dalian University of Technology, he was an algorithm expert at Alibaba DAMO Academy.


    His main research interests include machine learning, computer vision, data-driven artificial intelligence, statistical learning theory, and optimization.


    The group recruits students year-round, and students with a strong interest in artificial intelligence are welcome to contact me (yxu AT dlut DOT edu DOT cn); the group is well funded and well resourced. The admission major for recommended-exemption and entrance-exam master's students is Artificial Intelligence (Faculty of Electronic Information and Electrical Engineering). Ph.D. students entering in Fall 2024 are also being recruited (Artificial Intelligence, Faculty of Electronic Information and Electrical Engineering). Undergraduate interns and postdoctoral researchers are recruited on an ongoing basis.


    Publications: (full list on Google Scholar: https://scholar.google.com/citations?user=D4jEMqEAAAAJ&hl=en)

    • Yang Yang, Yuxuan Zhang, Xin Song, Yi Xu. Not All Out-of-Distribution Data Are Harmful to Open-Set Active Learning. Accepted to NeurIPS 2023(CCF A)

    • Yixiu Mao, Hongchang Zhang, Chen Chen, Yi Xu, Xiangyang Ji. Supported Value Regularization for Offline Reinforcement Learning. Accepted to NeurIPS 2023(CCF A)

    • Yixiu Mao, Hongchang Zhang, Chen Chen, Yi Xu, Xiangyang Ji. Supported Trust Region Optimization for Offline Reinforcement Learning. ICML 2023(CCF A)

    • Ziquan Liu, Yi Xu#, Xiangyang Ji, Antoni Chan. TWINS: A Fine-Tuning Framework for Improved Transferability of Adversarial Robustness and Generalization. CVPR 2023(CCF A)

    • Hongchang Zhang, Yixiu Mao, Boyuan Wang, Shuncheng He, Yi Xu, Xiangyang Ji. In-sample Actor Critic for Offline Reinforcement Learning. ICLR 2023

    • Ziquan Liu, Yi Xu#, Yuanhong Xu, Qi Qian, Hao Li, Xiangyang Ji, Antoni Chan, Rong Jin. Improved Fine-Tuning by Better Leveraging Pre-Training Data. NeurIPS 2022(CCF A)

    • Zhiwu Qing, Shiwei Zhang, Ziyuan Huang, Yi Xu, Xiang Wang, Mingqian Tang, Changxin Gao, Rong Jin, Nong Sang. Learning from Untrimmed Videos: Self-Supervised Video Representation Learning with Hierarchical Consistency. CVPR 2022(CCF A)

    • Zejiang Hou, Minghai Qin, Fei Sun, Xiaolong Ma, Kun Yuan, Yi Xu, Yen-Kuang Chen, Rong Jin, Yuan Xie, Sun-Yuan Kung. CHEX: CHannel EXploration for CNN Model Compression. CVPR 2022(CCF A)

    • Xiaolong Ma, Minghai Qin, Fei Sun, Zejiang Hou, Kun Yuan, Yi Xu, Yanzhi Wang, Yen-Kuang Chen, Rong Jin, and Yuan Xie. Effective Model Sparsification by Scheduled Grow-and-Prune Methods. ICLR 2022.

    • Qi Qi, Zhishuai Guo, Yi Xu, Rong Jin, Tianbao Yang. An Online Method for A Class of Distributionally Robust Optimization with Non-convex Objectives. NeurIPS 2021(CCF A)

    • Yi Xu, Lei Shang, Jinxing Ye, Qi Qian, Yu-Feng Li, Baigui Sun, Hao Li, Rong Jin. Dash: Semi-Supervised Learning with Dynamic Thresholding. ICML 2021 (Long Talk, acceptance rate: 3%) (CCF A)

    • Zhuoning Yuan*, Zhishuai Guo*, Yi Xu, Yiming Ying, Tianbao Yang. (*equal contribution) Federated Deep AUC Maximization for Heterogeneous Data with a Constant Communication Complexity. ICML 2021(CCF A)

    • Yan Yan, Yi Xu, Qihang Lin, Wei Liu, Tianbao Yang. Optimal Epoch Stochastic Gradient Descent Ascent Methods for Min-Max Optimization. NeurIPS 2020(CCF A)

    • Yan Yan, Yi Xu, Lijun Zhang, Xiaoyu Wang, Tianbao Yang. Stochastic Optimization for Non-convex Inf-Projection Problems. ICML 2020(CCF A)

    • Yi Xu, Shenghuo Zhu, Sen Yang, Chi Zhang, Rong Jin, Tianbao Yang. Learning with Non-Convex Truncated Losses by SGD. UAI 2020(CCF B)

    • Yi Xu, Rong Jin, Tianbao Yang. Non-asymptotic Analysis of Stochastic Methods for Non-Smooth Non-Convex Regularized Problems. NeurIPS 2019 (CCF A)

    • Yi Xu, Zhuoning Yuan, Sen Yang, Rong Jin, Tianbao Yang. On the Convergence of (Stochastic) Gradient Descent with Extrapolation for Non-Convex Minimization. IJCAI 2019(CCF A)

    • Yi Xu, Qi Qi, Qihang Lin, Rong Jin, Tianbao Yang. Stochastic Optimization for DC Functions and Non-smooth Non-convex Regularizers with Nonasymptotic Convergence. ICML 2019(CCF A)

    • Zaiyi Chen, Yi Xu, Haoyuan Hu, Tianbao Yang. Katalyst: Boosting Convex Katayusha for Non-Convex Problems with a Large Condition Number. ICML 2019(CCF A)

    • Yi Xu, Rong Jin, Tianbao Yang. First-order Stochastic Algorithms for Escaping From Saddle Points in Almost Linear Time. NeurIPS 2018(CCF A)

    • Zaiyi Chen*, Yi Xu*, Enhong Chen, Tianbao Yang. (*equal contribution) SadaGrad: Strongly Adaptive Stochastic Gradient Methods. ICML 2018(CCF A)

    • Yi Xu, Qihang Lin, Tianbao Yang. Adaptive SVRG Methods under Error Bound Conditions with Unknown Growth Parameter. NeurIPS 2017(CCF A)

    • Yi Xu, Mingrui Liu, Qihang Lin, Tianbao Yang. ADMM without a Fixed Penalty Parameter: Faster Convergence with New Adaptive Penalization. NeurIPS 2017(CCF A)

    • Yi Xu, Qihang Lin, Tianbao Yang. Stochastic Convex Optimization: Faster Local Growth Implies Faster Global Convergence. ICML 2017(CCF A)

    • Yi Xu, Haiqing Yang, Lijun Zhang, Tianbao Yang. Efficient Non-oblivious Randomized Reduction for Risk Minimization with Improved Excess Risk Guarantee. AAAI 2017(CCF A)

    • Yi Xu*, Yan Yan*, Qihang Lin, Tianbao Yang. (*equal contribution) Homotopy Smoothing for Non-Smooth Problems with Lower Complexity than O(1/ϵ). NeurIPS 2016(CCF A)

    • Zhiwu Qing, Shiwei Zhang, Ziyuan Huang, Yi Xu, Xiang Wang, Mingqian Tang, Changxin Gao, Rong Jin, Nong Sang. Self-Supervised Learning from Untrimmed Videos via Hierarchical Consistency. Accepted to TPAMI 2023(CCF A)

    • Qi Qi, Yi Xu, Rong Jin, Wotao Yin, Tianbao Yang. Attentional Biased Stochastic Gradient Descent. Accepted to TMLR 2023.

    • Yi Xu, Qihang Lin, Tianbao Yang. Accelerate Stochastic Subgradient Method by Leveraging Local Growth Condition. Analysis and Applications. Vol. 17, No. 05, pp. 773-818, 2019