Vectorial Dimension Reduction for Tensors Based on Bayesian Inference

Indexed by: Journal article

Date of Publication:2018-10-01

Journal:IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS

Included Journals:SCIE

Volume:29

Issue:10

Page Number:4579-4592

ISSN No.:2162-237X

Key Words:Bayesian inference; dimension reduction; face recognition; principal component analysis (PCA); tensor decomposition

Abstract: Dimension reduction for high-order tensors is a challenging problem. Conventional approaches reduce a higher-order tensor via Tucker decomposition, which yields a lower-dimensional tensor. This paper introduces a probabilistic vectorial dimension-reduction model for tensorial data. The model represents a tensor as a linear combination of basis tensors of the same order, and thus offers a learning approach that directly reduces a tensor to a vector. Under this representation, the projection basis is built on the tensor CANDECOMP/PARAFAC (CP) decomposition, so the number of free parameters grows only linearly with the number of modes rather than exponentially. Bayesian inference is carried out via the variational expectation-maximization (EM) approach, and an empirical criterion is given for setting the model parameters (the factor number of the CP decomposition and the number of extracted features). On several publicly available databases, the model outperforms several existing principal component analysis (PCA)-based methods and CP decomposition in terms of classification and clustering accuracy.
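The core idea in the abstract can be sketched without the Bayesian machinery: represent each projection basis as a rank-R CP (sum of rank-one) tensor, reduce a data tensor to a vector by taking inner products with K such bases, and note that the CP-structured parameter count grows linearly in the number of modes while a dense basis grows exponentially. The sketch below is a toy illustration under assumed sizes (D modes, size I per mode, rank R, K features); it is not the paper's variational EM model.

```python
import itertools
import math
import random

random.seed(0)

D, I, R, K = 3, 4, 2, 5   # modes, size per mode, CP rank, feature dim (toy values)

def cp_full(factors):
    """Expand rank-R CP factors into a dense D-way tensor stored as a dict:
    T[i1,...,iD] = sum_r prod_d factors[d][i_d][r]."""
    shape = [len(f) for f in factors]
    rank = len(factors[0][0])
    return {
        idx: sum(math.prod(factors[d][idx[d]][r] for d in range(len(factors)))
                 for r in range(rank))
        for idx in itertools.product(*map(range, shape))
    }

def rand_factors():
    """Random D x I x R factor matrices for one CP-structured tensor."""
    return [[[random.gauss(0, 1) for _ in range(R)] for _ in range(I)]
            for _ in range(D)]

# K basis tensors, each with CP structure: K*D*I*R free parameters,
# versus K * I**D for unstructured dense basis tensors.
bases = [cp_full(rand_factors()) for _ in range(K)]
X = cp_full(rand_factors())          # a toy data tensor of the same order

# Vectorial reduction: y_k = <X, B_k> (elementwise tensor inner product)
y = [sum(X[idx] * B[idx] for idx in X) for B in bases]

print(len(y))                        # the tensor is reduced directly to a K-vector
print(K * D * I * R, "vs", K * I**D) # 120 vs 320: linear vs exponential growth
```

Even at these tiny sizes the CP-structured basis uses fewer parameters (120 vs 320); for realistic mode sizes and orders the gap is what makes the approach tractable.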
