Peihua Li

Personal Information

Professor

Doctoral Supervisor

Master's Supervisor

Gender: Male

Alma Mater: Harbin Institute of Technology

Degree: Ph.D.

Affiliation: School of Information and Communication Engineering

Contact: http://peihuali.org

Email: peihuali@dlut.edu.cn

Publications

Evaluation of ground distances and features in EMD-based GMM matching for texture classification

Paper Type: Journal Article

Publication Date: 2016-09-01

Journal: PATTERN RECOGNITION

Indexed by: SCIE, EI, Scopus

Volume: 57

Pages: 152-163

ISSN: 0031-3203

Keywords: Texture classification; Earth Mover's Distance; Gaussian mixture models; Ground distances; Image features

Abstract: Recently, the Earth Mover's Distance (EMD) has demonstrated its superiority in Gaussian mixture model (GMM) based texture classification. The ground distances between the Gaussian components of GMMs strongly influence the performance of GMM matching, yet they have not been fully studied. Meanwhile, image features play a key role in image classification and often greatly impact classification performance. In this paper, we present a comprehensive study of ground distances and image features for texture classification. We divide existing ground distances into statistics-based ones and Riemannian-manifold-based ones, and theoretically analyze the differences and relationships among them. Inspired by the Gaussian embedding distance and the product of Lie Groups distance, we propose an improved Gaussian embedding distance for comparing Gaussians. We also evaluate, for the first time, image features for GMM matching, including handcrafted features such as Gabor filters, the Local Binary Pattern (LBP) descriptor, SIFT, and the covariance descriptor, as well as high-level features extracted by deep convolutional networks. Experiments are conducted on three texture databases: KTH-TIPS-2b, FMD, and UIUC. The experimental results show that exploiting geometrical structure and a balance strategy is critical for ground distances, and that GMM matching with the proposed ground distance achieves state-of-the-art performance when high-level features are exploited. (C) 2016 Elsevier Ltd. All rights reserved.
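
For readers unfamiliar with the EMD-based GMM matching pipeline the abstract describes, the following is a minimal Python sketch, not the paper's method. It assumes normalized component weights, uses the closed-form 2-Wasserstein distance between Gaussians as the ground distance (one of the Riemannian-manifold-flavored choices the paper surveys, not the improved Gaussian embedding distance proposed here), and solves the EMD transportation problem as a generic linear program; all function names are illustrative.

```python
import numpy as np
from scipy.linalg import sqrtm
from scipy.optimize import linprog

def gaussian_w2(mu1, cov1, mu2, cov2):
    """Closed-form 2-Wasserstein distance between N(mu1, cov1) and N(mu2, cov2).

    Only one possible ground distance; the paper compares several
    statistics-based and Riemannian-manifold-based alternatives.
    """
    s2 = sqrtm(cov2)
    cross = np.real(sqrtm(s2 @ cov1 @ s2))  # (cov2^{1/2} cov1 cov2^{1/2})^{1/2}
    w2_sq = np.sum((mu1 - mu2) ** 2) + np.trace(cov1 + cov2 - 2.0 * cross)
    return np.sqrt(max(w2_sq, 0.0))  # guard against tiny negative round-off

def emd(w1, w2, D):
    """EMD between two weighted component sets via a linear program.

    w1 (m,) and w2 (n,) are mixture weights summing to 1; D (m, n) holds the
    pairwise ground distances between Gaussian components.
    """
    m, n = D.shape
    A_eq = []
    for i in range(m):            # flow out of component i equals w1[i]
        row = np.zeros(m * n)
        row[i * n:(i + 1) * n] = 1.0
        A_eq.append(row)
    for j in range(n):            # flow into component j equals w2[j]
        col = np.zeros(m * n)
        col[j::n] = 1.0
        A_eq.append(col)
    res = linprog(D.ravel(), A_eq=np.array(A_eq),
                  b_eq=np.concatenate([w1, w2]),
                  bounds=(0, None), method="highs")
    return res.fun

# Illustrative usage with two random GMMs (weights, means, covariances).
rng = np.random.default_rng(0)

def random_gmm(k, d):
    mus = rng.normal(size=(k, d))
    covs = [(lambda A: A @ A.T + d * np.eye(d))(rng.normal(size=(d, d)))
            for _ in range(k)]
    return np.full(k, 1.0 / k), mus, covs

wa, mus_a, covs_a = random_gmm(3, 4)
wb, mus_b, covs_b = random_gmm(4, 4)
D = np.array([[gaussian_w2(ma, ca, mb, cb) for mb, cb in zip(mus_b, covs_b)]
              for ma, ca in zip(mus_a, covs_a)])
print("EMD(GMM_a, GMM_b) =", emd(wa, wb, D))
```

In the setting the abstract describes, each texture image would be represented by a GMM fit to its local features (e.g., SIFT or deep convolutional features), and the EMD between GMMs would serve as the image-to-image distance for classification.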