王洪玉

Personal Information

Professor

Doctoral Supervisor

Master's Supervisor

Gender: Male

Alma Mater: Tianjin University

Degree: Doctorate

Affiliation: School of Information and Communication Engineering

Disciplines: Communication and Information Systems; Signal and Information Processing

Office Location: Room B510, Innovation Park Building, Dalian University of Technology

Contact: Email: whyu@dlut.edu.cn; Office Phone: 0411-84707675; Mobile: 13842827170

Publications

Deep Supervised and Contractive Neural Network for SAR Image Classification

Paper Type: Journal Article

Publication Date: 2017-04-01

Journal: IEEE Transactions on Geoscience and Remote Sensing

Indexed In: SCIE, EI, Scopus

Volume: 55

Issue: 4

Pages: 2442-2459

ISSN: 0196-2892

Keywords: Contractive autoencoder (AE); deep neural network (DNN); supervised classification; synthetic aperture radar (SAR) image

Abstract: The classification of a synthetic aperture radar (SAR) image is a significant yet challenging task, due to the presence of speckle noise and the absence of effective feature representation. Inspired by deep learning, a novel deep supervised and contractive neural network (DSCNN) for SAR image classification is proposed to overcome these problems. To extract spatial features, a multiscale patch-based feature extraction model consisting of gray level-gradient co-occurrence matrix, Gabor, and histogram of oriented gradient descriptors is developed to obtain primitive features from the SAR image. Then, to obtain a discriminative representation of these initial features, the DSCNN, which comprises four layers of supervised and contractive autoencoders, is proposed to optimize the features for classification. The supervised penalty of the DSCNN captures the relevant information between features and labels, and the contractive restriction enhances the local invariance and robustness of the encoded representation. Consequently, the DSCNN is able to produce effective representations of sample features and provide accurate predictions of the class labels. Moreover, to restrain the influence of speckle noise, a graph-cut-based spatial regularization is applied after classification to suppress misclassified pixels and smooth the results. Experiments on three SAR data sets demonstrate that the proposed method yields superior classification performance compared with related approaches.
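
To make the layer-wise objective concrete, below is a minimal sketch of one supervised, contractive autoencoder layer of the kind the abstract describes. This is not the authors' code: it assumes PyTorch, the class and function names (SupervisedContractiveAE, dscnn_layer_loss) are invented for illustration, and the penalty weights lam_c and lam_s are placeholder hyperparameters, not values from the paper. The contractive term is the squared Frobenius norm of the encoder Jacobian, which has a closed form for a sigmoid encoder.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SupervisedContractiveAE(nn.Module):
    """One layer of a supervised, contractive autoencoder (illustrative sketch)."""
    def __init__(self, in_dim, hid_dim, n_classes):
        super().__init__()
        self.enc = nn.Linear(in_dim, hid_dim)     # encoder: h = sigmoid(W x + b)
        self.dec = nn.Linear(hid_dim, in_dim)     # decoder reconstructs the input
        self.cls = nn.Linear(hid_dim, n_classes)  # supervised head on the code

    def forward(self, x):
        h = torch.sigmoid(self.enc(x))
        return h, self.dec(h), self.cls(h)

def dscnn_layer_loss(model, x, y, lam_c=1e-3, lam_s=1.0):
    h, x_rec, logits = model(x)
    recon = F.mse_loss(x_rec, x)                  # reconstruction term
    # Contractive penalty: squared Frobenius norm of the encoder Jacobian.
    # For a sigmoid encoder this reduces to
    #   sum_j (h_j * (1 - h_j))^2 * sum_i W_ji^2
    dh = (h * (1.0 - h)) ** 2                     # shape (batch, hid_dim)
    w_sq = (model.enc.weight ** 2).sum(dim=1)     # shape (hid_dim,)
    contractive = (dh * w_sq).sum(dim=1).mean()
    supervised = F.cross_entropy(logits, y)       # ties the code to the labels
    return recon + lam_c * contractive + lam_s * supervised

Per the abstract, the full DSCNN stacks four such layers on top of the handcrafted multiscale features; the graph-cut spatial regularization applied after classification is a separate post-processing step and is not shown here.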