
Fast optimization algorithm on Riemannian manifolds and its application in low-rank learning

Indexed by:Journal article

Date of Publication:2018-05-24

Journal:NEUROCOMPUTING

Included Journals:SCIE, EI, Scopus

Volume:291

Page Number:59-70

ISSN No.:0925-2312

Key Words:Fast optimization algorithm; Riemannian manifolds; Low-rank matrix variety; Low-rank representation; Subspace pursuit; Augmented Lagrange method; Clustering

Abstract:This paper proposes a first-order fast optimization algorithm on Riemannian manifolds (FOA) to speed up optimization for a class of composite functions on Riemannian manifolds. The theoretical analysis shows that FOA achieves the optimal convergence rate for the sequence of function values. Experiments on the matrix completion task show that FOA outperforms existing first-order optimization methods on Riemannian manifolds. A subspace pursuit method (SP-RPRG(ALM)) based on FOA is also proposed to solve the low-rank representation model with the augmented Lagrange method (ALM) on the low-rank matrix variety. Experimental results on synthetic data and public databases demonstrate that both FOA and SP-RPRG(ALM) achieve superior performance in terms of faster convergence and higher accuracy. The experimental code is publicly available at https://github.com/Haoran2014. (c) 2018 Elsevier B.V. All rights reserved.
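
For illustration only, the sketch below shows the general flavor of an accelerated (Nesterov-style) first-order step on a Riemannian manifold, using the unit sphere with a simple renormalization retraction. This is an assumption-based analogue, not the paper's FOA or SP-RPRG(ALM); the test problem, function names, and step-size choice are hypothetical.

# Minimal sketch: accelerated first-order descent on the unit sphere.
# Minimizes f(x) = 0.5 * x^T A x over ||x|| = 1 (smallest eigenvector of A).
import numpy as np

def riemannian_grad(A, x):
    # Project the Euclidean gradient A x onto the tangent space at x.
    g = A @ x
    return g - (x @ g) * x

def retract(x, v):
    # Sphere retraction: step along v, then renormalize back onto the sphere.
    y = x + v
    return y / np.linalg.norm(y)

def accelerated_sphere_descent(A, x0, step, iters=200):
    # Nesterov-style momentum combined with retraction (illustrative only).
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        x_next = retract(y, -step * riemannian_grad(A, y))
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = retract(x_next, ((t - 1.0) / t_next) * (x_next - x))
        x, t = x_next, t_next
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((50, 50))
    A = M @ M.T                         # symmetric positive semidefinite
    x0 = rng.standard_normal(50)
    x0 /= np.linalg.norm(x0)
    step = 1.0 / np.linalg.norm(A, 2)   # roughly 1 / Lipschitz constant
    x = accelerated_sphere_descent(A, x0, step)
    print("x^T A x:", x @ A @ x)        # approaches the smallest eigenvalue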
