Peihua Li

Personal Information

Professor

Doctoral Supervisor

Master's Supervisor

Gender: Male

Alma Mater: Harbin Institute of Technology

Degree: Doctorate

Affiliation: School of Information and Communication Engineering

Homepage: http://peihuali.org

Email: peihuali@dlut.edu.cn


Publications


Weighted and Class-Specific Maximum Mean Discrepancy for Unsupervised Domain Adaptation


Paper Type: Journal Article

Publication Date: 2021-01-10

Journal: IEEE TRANSACTIONS ON MULTIMEDIA

Volume: 22

Issue: 9

Page Range: 2420-2433

ISSN: 1520-9210

Keywords: Measurement; Adaptation models; Airplanes; Gallium nitride; Task analysis; Generative adversarial networks; Degradation; Image recognition; unsupervised domain adaptation; convolutional neural network; expectation-maximization algorithms

Abstract: Although maximum mean discrepancy (MMD) has achieved great success in unsupervised domain adaptation (UDA), most existing UDA methods ignore the issue of class weight bias across domains, which is ubiquitous and evidently degrades UDA performance. In this work, we propose two improved MMD metrics, i.e., weighted MMD (WMMD) and class-specific MMD (CMMD), to alleviate the adverse effect caused by changes in the class prior distributions between the source and target domains. In WMMD, class-specific auxiliary weights are deployed to reweight the source samples. In CMMD, we calculate the MMD for each class of source and target samples. Since the class labels of target samples are unknown in the UDA problem, we present a classification expectation-maximization algorithm to estimate the pseudo-labels of target samples on the fly and update the model parameters using the estimated labels. The proposed metrics can be flexibly incorporated into deep convolutional neural networks to form WMMD- and CMMD-based domain adaptation networks, which we call WDAN and CDAN, respectively. By combining WMMD with CMMD, we present a CWMMD-based domain adaptation network (CWDAN) to further improve classification performance. Experiments show that both WMMD and CMMD benefit classification accuracy, and our CWDAN achieves compelling UDA performance in comparison with MMD and state-of-the-art UDA methods.
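As a rough illustration of the two metrics described in the abstract, the sketch below computes a squared MMD with per-sample source weights and an averaged per-class MMD using an RBF kernel. The function names, the kernel choice, and the uniform target weights are illustrative assumptions for this sketch, not the paper's actual implementation (which embeds these metrics in a deep network and estimates target pseudo-labels via classification EM).

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel: k(x, y) = exp(-gamma * ||x - y||^2)
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def weighted_mmd2(Xs, Xt, w, gamma=1.0):
    # Squared MMD between source Xs (weighted by w, normalized to sum to 1)
    # and target Xt (uniform weights) in the RBF-kernel RKHS.
    w = w / w.sum()
    v = np.full(len(Xt), 1.0 / len(Xt))
    return (w @ rbf_kernel(Xs, Xs, gamma) @ w
            - 2 * w @ rbf_kernel(Xs, Xt, gamma) @ v
            + v @ rbf_kernel(Xt, Xt, gamma) @ v)

def class_specific_mmd2(Xs, ys, Xt, yt_pseudo, gamma=1.0):
    # Average the squared MMD over classes present in both domains;
    # yt_pseudo stands in for the pseudo-labels of the target samples.
    classes = np.intersect1d(np.unique(ys), np.unique(yt_pseudo))
    vals = [weighted_mmd2(Xs[ys == c], Xt[yt_pseudo == c],
                          np.ones((ys == c).sum()), gamma)
            for c in classes]
    return float(np.mean(vals))
```

With uniform weights, weighted_mmd2 reduces to the standard (biased) squared-MMD estimator; class-specific weights or per-class pooling then counteract class weight bias between domains.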