Li Peihua (李培华)

Personal Information

Professor

Doctoral Supervisor

Master's Supervisor

Gender: Male

Alma Mater: Harbin Institute of Technology

Degree: Ph.D.

Affiliation: School of Information and Communication Engineering

Contact: http://peihuali.org

Email: peihuali@dlut.edu.cn


Paper Publications


High-Order Local Pooling and Encoding Gaussians Over a Dictionary of Gaussians


Paper Type: Journal Article

First Author: Li, Peihua

Corresponding Author: Li, PH (reprint author), Dalian Univ Technol, Sch Informat & Commun Engn, Dalian 116024, Peoples R China.

Co-authors: Zeng, Hui; Wang, Qilong; Shiu, Simon C. K.; Zhang, Lei

Publication Date: 2017-07-01

Journal: IEEE TRANSACTIONS ON IMAGE PROCESSING

Indexed by: SCIE, EI, Scopus

Volume: 26

Issue: 7

Page Range: 3372-3384

ISSN: 1057-7149

Keywords: Image classification; high-order local pooling (HO-LP); manifold of Gaussians

Abstract: Local pooling (LP) in configuration (feature) space, proposed by Boureau et al., explicitly restricts aggregation to similar features, which preserves as much discriminative information as possible. When it was introduced, this method combined with sparse coding achieved competitive classification results with only a small dictionary. However, its performance lags far behind the state of the art because only zero-order information is exploited. Inspired by the success of high-order statistical information in existing advanced feature coding or pooling methods, we attempt to address this limitation of LP. To this end, we present a novel method called high-order LP (HO-LP) to leverage information of order higher than zero. Our idea is intuitively simple: we compute the first- and second-order statistics per configuration bin and model them as a Gaussian. Accordingly, we employ a collection of Gaussians as visual words to represent the universal probability distribution of features from all classes. Our problem is naturally formulated as encoding Gaussians over a dictionary of Gaussians as visual words. This problem, however, is challenging since the space of Gaussians is not a Euclidean space but forms a Riemannian manifold. We address this challenge by mapping Gaussians into a Euclidean space, which enables us to perform coding with common Euclidean operations rather than complex and often expensive Riemannian operations. Our HO-LP preserves the advantages of the original LP: pooling only similar features and using a small dictionary. Meanwhile, it achieves very promising performance on standard benchmarks, with either conventional, hand-engineered features or deep learning-based features.
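The two core steps described in the abstract, fitting a per-bin Gaussian from first- and second-order statistics and mapping that Gaussian into a Euclidean space, can be illustrated with a minimal sketch. The snippet below is not the authors' released code: the function names (bin_gaussian, embed_gaussian) are illustrative, and the particular mapping used, lifting N(mu, Sigma) to a symmetric positive-definite matrix and flattening its matrix logarithm (a log-Euclidean style embedding), is one common choice assumed here rather than the paper's confirmed construction.

```python
# Hedged sketch (assumptions noted above), NOT the paper's implementation.
import numpy as np

def bin_gaussian(features):
    """Fit a Gaussian to the local features pooled into one configuration bin.

    features: (n, d) array of d-dimensional local descriptors.
    Returns the mean (d,) and covariance (d, d); a small ridge keeps the
    covariance positive definite when n is small.
    """
    mu = features.mean(axis=0)
    diff = features - mu
    sigma = diff.T @ diff / max(len(features) - 1, 1)
    sigma += 1e-6 * np.eye(features.shape[1])
    return mu, sigma

def embed_gaussian(mu, sigma):
    """Map a Gaussian N(mu, sigma) to a Euclidean vector (assumed embedding).

    The Gaussian is lifted to the (d+1)x(d+1) SPD matrix
    [[sigma + mu mu^T, mu], [mu^T, 1]], its matrix logarithm is taken via an
    eigendecomposition, and the upper-triangular part is flattened so that
    distances between embeddings become plain Euclidean distances.
    """
    d = mu.shape[0]
    P = np.empty((d + 1, d + 1))
    P[:d, :d] = sigma + np.outer(mu, mu)
    P[:d, d] = mu
    P[d, :d] = mu
    P[d, d] = 1.0
    w, V = np.linalg.eigh(P)              # P is SPD, so all eigenvalues > 0
    L = (V * np.log(w)) @ V.T             # matrix logarithm of P
    iu = np.triu_indices(d + 1)
    return L[iu]                          # vectorized upper triangle

# Toy usage: 200 random 8-D descriptors assigned to one configuration bin.
rng = np.random.default_rng(0)
mu, sigma = bin_gaussian(rng.normal(size=(200, 8)))
vec = embed_gaussian(mu, sigma)
print(vec.shape)                          # (45,) = (d+1)(d+2)/2 with d = 8
```

With each per-bin Gaussian represented as such a Euclidean vector, encoding over a dictionary of Gaussian visual words can then reuse standard Euclidean coding machinery, which is the simplification the abstract points to.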