
Nonparametric tensor dictionary learning with beta process priors


Indexed by:Journal article

Date of Publication:2016-12-19

Journal:NEUROCOMPUTING

Included Journals:SCIE, EI, Scopus

Volume:218

Page Number:120-130

ISSN No.:0925-2312

Key Words:Dictionary learning; Beta process; Bayesian inference; Gibbs-sampling

Abstract:Nonparametric Bayesian techniques have been applied to one-dimensional dictionary learning for sparse representation using the beta process. In the real world, however, signals are often high-dimensional tensors with structured features. In this paper, we extend the nonparametric Bayesian technique to structured tensor dictionary learning under a sparsity-favouring beta process prior. The hierarchical form of the tensor dictionary learning model is presented, and the inference process is derived via Gibbs sampling with analytic update equations. The tensor dictionary is learned directly from high-dimensional tensor data, so it makes full use of the spatial structure of the original samples. The employed nonparametric Bayesian technique allows the noise variance to be unknown or non-stationary, cases frequently seen in many applications. Finally, several experiments on video reconstruction and image denoising showcase the application of the learned tensor dictionaries. (C) 2016 Elsevier B.V. All rights reserved.
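The sparsity-favouring beta process prior described in the abstract is commonly handled via its standard finite approximation: each of K candidate dictionary atoms gets an inclusion probability from a Beta distribution, and each signal draws binary usage indicators from it. The sketch below illustrates only this generic prior, not the paper's specific model or sampler; the hyperparameter values a, b and the sizes K, N are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Finite approximation to the beta process prior:
# K candidate dictionary atoms, N signals.
K, N = 100, 50
a, b = 5.0, 1.0  # concentration hyperparameters (illustrative values)

# Atom inclusion probabilities: pi_k ~ Beta(a/K, b*(K-1)/K).
# As K grows, most pi_k are tiny, so each signal uses few atoms.
pi = rng.beta(a / K, b * (K - 1) / K, size=K)

# Binary usage indicators z[i, k] ~ Bernoulli(pi_k): which atoms
# signal i uses. All-zero columns correspond to unused atoms.
z = rng.random((N, K)) < pi

# Expected number of active atoms per signal is roughly a/b for large K,
# which is what makes the resulting representation sparse.
avg_atoms_per_signal = z.sum(axis=1).mean()
```

In a full model, a sparse coefficient vector would combine these indicators with Gaussian weights, and Gibbs sampling would update pi, z, the weights, and the dictionary atoms in turn using their conjugate conditional posteriors.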
