Hongwei Zhang (张宏伟)

Personal Information

Professor

Doctoral Supervisor

Master's Supervisor

Gender: Male

Alma Mater: Dalian University of Technology

Degree: Doctorate

Affiliation: School of Mathematical Sciences

E-mail: hwzhang@dlut.edu.cn

Publications

Multi-view metric learning based on KL-divergence for similarity measurement

Paper Type: Journal Article

Publication Date: 2017-05-17

Journal: Neurocomputing

Indexed by: SCIE, EI, Scopus

Volume: 238

Pages: 269-276

ISSN: 0925-2312

Keywords: Multi-view metric learning; KL-divergence; Distance metric learning; Multi-view features

Abstract: In the past decades, there has been a surge of interest in learning distance metrics for various image processing tasks. However, when faced with features from multiple views, most metric learning methods fail to integrate the compatible and complementary information contained in those views into a common distance metric. Because such single-view methods discard most of this information, their performance degrades severely. How to fully exploit information from multiple views to construct an optimal distance metric is therefore of vital importance, but challenging. To address this issue, this paper proposes a multi-view metric learning (MML) method that uses KL-divergence to integrate features from multiple views. Minimizing the KL-divergence between features from different views enforces consistency across views, which enables MML to exploit the information in all of them. Experiments on several benchmark multi-view datasets verify the strong performance of the method.
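The abstract only sketches the approach, so the following is a minimal, hypothetical NumPy illustration of the general idea: each view gets its own learned linear metric, and a KL-divergence term penalizes disagreement between the views' neighborhood distributions. The projection matrices L1/L2, the simple pairwise pull/push loss, and the trade-off weight lam are placeholders for exposition, not the paper's exact formulation.

```python
import numpy as np

def view_distances(X, L):
    """Squared distances d_ij = ||L x_i - L x_j||^2 under one view's linear metric."""
    Z = X @ L.T                                   # project view features into the shared metric space
    sq = np.sum(Z ** 2, axis=1)
    return sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T

def softmax_rows(D):
    """Turn each row of negative distances into a neighborhood probability distribution."""
    A = -D - (-D).max(axis=1, keepdims=True)      # shift for numerical stability
    E = np.exp(A)
    return E / E.sum(axis=1, keepdims=True)

def kl_consistency(P, Q, eps=1e-12):
    """Mean row-wise KL(P || Q): the cross-view consistency penalty."""
    return np.mean(np.sum(P * (np.log(P + eps) - np.log(Q + eps)), axis=1))

def supervised_metric_loss(D, y):
    """Toy per-view loss: pull same-class pairs together, push different-class pairs apart."""
    same = (y[:, None] == y[None, :]).astype(float)
    np.fill_diagonal(same, 0.0)
    diff = 1.0 - same
    np.fill_diagonal(diff, 0.0)
    return (same * D).sum() / max(same.sum(), 1) - (diff * D).sum() / max(diff.sum(), 1)

# Toy data: n samples observed in two views with different feature dimensions.
rng = np.random.default_rng(0)
n, d1, d2, k = 60, 20, 30, 5
y = rng.integers(0, 3, size=n)
X1 = rng.normal(size=(n, d1)) + y[:, None]        # view-1 features
X2 = rng.normal(size=(n, d2)) + 0.5 * y[:, None]  # view-2 features
L1 = rng.normal(scale=0.1, size=(k, d1))          # per-view linear metrics (to be learned)
L2 = rng.normal(scale=0.1, size=(k, d2))

D1, D2 = view_distances(X1, L1), view_distances(X2, L2)
P1, P2 = softmax_rows(D1), softmax_rows(D2)

lam = 1.0  # trade-off between per-view metric losses and cross-view KL consistency
objective = (supervised_metric_loss(D1, y) + supervised_metric_loss(D2, y)
             + lam * (kl_consistency(P1, P2) + kl_consistency(P2, P1)))
print("toy multi-view objective:", objective)
```

In this sketch, minimizing the objective jointly fits a metric per view while the symmetric KL term keeps the two views' neighborhood structures consistent, which is the role KL-divergence plays in the abstract's description.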