Lu Huchuan (卢湖川)

Personal Information

Professor

Doctoral supervisor

Master's supervisor

Primary position: Executive Dean, School of Future Technology / School of Artificial Intelligence

Gender: Male

Alma mater: Dalian University of Technology

Degree: Ph.D.

Affiliation: School of Information and Communication Engineering

Discipline: Signal and Information Processing

Office: Room A426, Innovation Park Building, Dalian University of Technology

Contact: ****

Email: lhchuan@dlut.edu.cn


Publications


Deep gated attention networks for large-scale street-level scene segmentation


Paper type: Journal article

Publication date: 2019-04-01

Journal: PATTERN RECOGNITION

Indexed by: SCIE, Scopus

Volume: 88

Pages: 702-714

ISSN: 0031-3203

Keywords: Scene segmentation; Fully convolutional network; Spatial gated attention; Street-level image understanding

Abstract: Street-level scene segmentation aims to label each pixel of street-view images with a specific semantic category. It has attracted growing interest due to various real-world applications, especially in the area of autonomous driving. However, this pixel-wise labeling task is very challenging given the complexity of street-level scenes and the large number of object categories. Motivated by the scene layout of street-view images, in this work we propose a novel Spatial Gated Attention (SGA) module, which automatically highlights the attentive regions for pixel-wise labeling, resulting in effective street-level scene segmentation. The proposed module takes as input the multi-scale feature maps of a Fully Convolutional Network (FCN) backbone and produces a corresponding attention mask for each feature map. The learned attention masks neatly highlight the regions of interest while suppressing background clutter. Furthermore, we propose an efficient multi-scale feature interaction mechanism that adaptively aggregates the hierarchical features. Under this mechanism, the features of different levels are adaptively re-weighted according to the local spatial structure and the surrounding contextual information. Consequently, the proposed modules can boost standard FCN architectures and yield enhanced pixel-wise segmentation of street-level scene images. Extensive experiments on three publicly available street-level benchmarks demonstrate that the proposed Gated Attention Network (GANet) achieves consistently superior performance and outperforms recent state-of-the-art methods.
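The two ideas in the abstract (per-pixel gating of a feature map, and adaptive re-weighting across feature levels) can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the 1×1 channel projection `gate_weights`, the function names, and the softmax-over-levels aggregation are simplifying assumptions standing in for the learned gating and interaction branches described in the text.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def spatial_gated_attention(feature_map, gate_weights):
    """Gate a feature map with a per-pixel attention mask.

    feature_map: (C, H, W) activations from one FCN level.
    gate_weights: (C,) weights of a 1x1 projection that maps the C
        channels to a single attention logit per spatial location
        (a stand-in for the learned gating branch).
    """
    # Per-pixel attention logits via the 1x1 channel projection.
    logits = np.tensordot(gate_weights, feature_map, axes=([0], [0]))  # (H, W)
    mask = sigmoid(logits)  # values in (0, 1)
    # Broadcast the mask over channels: attentive regions pass through,
    # background clutter is suppressed.
    return feature_map * mask[None, :, :], mask

def aggregate_levels(features, level_logits):
    """Adaptively aggregate same-resolution features from L levels.

    features: list of L arrays, each (C, H, W).
    level_logits: (L, H, W) per-pixel scores for each level.
    """
    # Softmax over the level axis gives per-pixel mixing weights,
    # so each location picks its own blend of coarse and fine features.
    e = np.exp(level_logits - level_logits.max(axis=0, keepdims=True))
    w = e / e.sum(axis=0, keepdims=True)
    return sum(f * w[i][None, :, :] for i, f in enumerate(features))
```

In the full model these operations would be applied to convolutional feature maps and trained end-to-end; the sketch only shows the shape of the computation.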