
    Qi Jinqing (戚金清)

    • Associate Professor       Supervisor of Master's Students
    • Gender: Male
    • Alma mater: Tokyo Institute of Technology
    • Degree: Ph.D.
    • Affiliation: School of Information and Communication Engineering
    • Disciplines: Communication and Information Systems; Signal and Information Processing
    • Email: jinqing@dlut.edu.cn

    Multi-attention guided feature fusion network for salient object detection

    Paper type: Journal article

    Publication date: 2020-10-21

    Journal: NEUROCOMPUTING

    Indexed in: SCIE

    Volume: 411

    Pages: 416-427

    ISSN: 0925-2312

    Keywords: Salient object detection; Feature fusion; Channel-wise attention; Position attention

    Abstract: Although salient object detection methods have achieved increasingly better performance with the rapid development of deep learning, obtaining effective feature representations that predict more accurate saliency maps remains a pressing problem. To this end, most previous works rely on skip-based architectures that integrate hierarchical information across scales and layers. However, a simple concatenation of high-level and low-level features is not always effective, because cluttered and noisy information can cause negative consequences. Concerning this issue, we propose a Multi-Attention guided Feature-fusion network (MAF) that alleviates the problem from two aspects. First, a novel Channel-wise Attention Block (CAB) takes charge of message passing layer by layer from a global view, using the semantic cues in the higher convolutional block to instruct feature selection in the lower block. Second, a Position Attention Block (PAB) operates on the integrated features to model pixel relationships and capture rich contextual dependencies. Under the guidance of multi-attention, discriminative features are selected to build a new end-to-end, densely supervised encoder-decoder network that detects salient objects more uniformly and precisely. Experimental results on five benchmark datasets show that our method performs favorably against other state-of-the-art approaches. (c) 2020 Elsevier B.V. All rights reserved.
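
    The abstract names the two attention blocks but gives no implementation details. The following is a minimal PyTorch sketch of how such blocks are commonly realized: the class names follow the abstract, while the SE-style cross-level gating in CAB, the DANet-style spatial self-attention in PAB, and all channel sizes and reduction ratios are illustrative assumptions, not the authors' actual design.

        # Minimal sketch of the two attention blocks named in the abstract.
        # Assumptions: SE-style cross-level gating for CAB, DANet-style spatial
        # self-attention for PAB; channel sizes and reductions are illustrative.
        import torch
        import torch.nn as nn
        import torch.nn.functional as F

        class ChannelAttentionBlock(nn.Module):
            """Gates a lower block's channels using global semantic cues
            pooled from a higher convolutional block (assumed design)."""
            def __init__(self, high_channels, low_channels, reduction=4):
                super().__init__()
                self.fc = nn.Sequential(
                    nn.Linear(high_channels, low_channels // reduction),
                    nn.ReLU(inplace=True),
                    nn.Linear(low_channels // reduction, low_channels),
                    nn.Sigmoid(),
                )

            def forward(self, high_feat, low_feat):
                # Global context vector from the higher (more semantic) features.
                context = F.adaptive_avg_pool2d(high_feat, 1).flatten(1)
                weights = self.fc(context).unsqueeze(-1).unsqueeze(-1)
                return low_feat * weights  # channel-wise reweighting

        class PositionAttentionBlock(nn.Module):
            """Self-attention over spatial positions to model pixel
            relationships and contextual dependencies (assumed design)."""
            def __init__(self, channels, reduction=8):
                super().__init__()
                self.query = nn.Conv2d(channels, channels // reduction, 1)
                self.key = nn.Conv2d(channels, channels // reduction, 1)
                self.value = nn.Conv2d(channels, channels, 1)
                self.gamma = nn.Parameter(torch.zeros(1))  # learnable residual scale

            def forward(self, x):
                b, c, h, w = x.shape
                q = self.query(x).flatten(2).transpose(1, 2)  # (B, HW, C')
                k = self.key(x).flatten(2)                    # (B, C', HW)
                attn = torch.softmax(q @ k, dim=-1)           # (B, HW, HW) pixel affinities
                v = self.value(x).flatten(2)                  # (B, C, HW)
                out = (v @ attn.transpose(1, 2)).view(b, c, h, w)
                return self.gamma * out + x                   # residual connection

        if __name__ == "__main__":
            high = torch.randn(2, 512, 16, 16)  # deeper, more semantic block
            low = torch.randn(2, 256, 32, 32)   # shallower, higher-resolution block
            fused = ChannelAttentionBlock(512, 256)(high, low)
            fused = PositionAttentionBlock(256)(fused)
            print(fused.shape)  # torch.Size([2, 256, 32, 32])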