
Solving Partial Least Squares Regression via Manifold Optimization Approaches


Indexed by: Journal Article

Date of Publication: 2019-02-01

Journal: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS

Included Journals: SCIE, Scopus

Volume: 30

Issue: 2

Page Number: 588-600

ISSN No.: 2162-237X

Key Words: Classification; Grassmann manifolds; Partial least squares regression (PLSR); Riemannian conjugate gradient method; Riemannian manifolds; Stiefel manifolds

Abstract: Partial least squares regression (PLSR) is a popular technique for exploring the linear relationship between two data sets. However, existing approaches optimize the PLSR model in Euclidean space and use a successive strategy, calculating the factors one by one in order to keep them mutually orthogonal; as a result, they often yield a suboptimal solution. To overcome this shortcoming, this paper takes the statistically inspired modification of PLSR (SIMPLSR) as a representative PLSR formulation, proposes a novel approach that transforms SIMPLSR into optimization problems on Riemannian manifolds, and develops the corresponding optimization algorithms. These algorithms compute all the PLSR factors simultaneously, avoiding suboptimal solutions. Moreover, we propose a simple and intuitive sparse SIMPLSR on Riemannian manifolds. Experiments on classification problems demonstrate that the proposed models and algorithms achieve lower classification error rates than other linear regression methods in Euclidean space. The experimental code is publicly available at https://github.com/Haoran2014.
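For readers unfamiliar with the successive, factor-by-factor scheme the abstract contrasts against, the following is a minimal NumPy sketch of classic Euclidean SIMPLS (de Jong's successive-deflation algorithm), which is the baseline the paper improves upon; it is not the authors' manifold-based method, and the function name `simpls_factors` is illustrative only:

```python
import numpy as np

def simpls_factors(X, Y, n_components):
    """Illustrative successive SIMPLS: extract PLSR weight vectors one
    at a time from the deflated cross-covariance matrix. This is the
    Euclidean, one-factor-at-a-time baseline; the paper instead solves
    for all factors simultaneously on Stiefel/Grassmann manifolds."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    S = Xc.T @ Yc                              # cross-covariance matrix
    p_dim = Xc.shape[1]
    R = np.zeros((p_dim, n_components))        # X weight vectors
    P = np.zeros((p_dim, n_components))        # X loading vectors
    V = np.zeros((p_dim, n_components))        # orthonormal loading basis
    for a in range(n_components):
        # dominant left singular vector of the deflated cross-covariance
        u, _, _ = np.linalg.svd(S, full_matrices=False)
        r = u[:, 0]
        t = Xc @ r                             # score vector
        t /= np.linalg.norm(t)
        p = Xc.T @ t                           # loading vector
        # orthonormalize the loading against all previous loadings
        v = p - V[:, :a] @ (V[:, :a].T @ p)
        v /= np.linalg.norm(v)
        V[:, a] = v
        # deflate S so the next factor yields an orthogonal score
        S = S - np.outer(v, v @ S)
        R[:, a], P[:, a] = r, p
    return R, P
```

Because each factor is found greedily from the deflated cross-covariance, the resulting scores `Xc @ R` are mutually orthogonal by construction, but the factors are not jointly optimal, which is precisely the motivation for the simultaneous manifold formulation described above.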
