TechRxiv

GCA: A Graph-based Channel Attention Module for Convolutional Neural Networks

preprint
posted on 2023-06-28, 03:41, authored by Mingjun Yin and Zhiyong Chang

Convolutional neural networks (CNNs) have achieved great success in many computer vision tasks, and attention mechanisms have recently proven critical to lifting their performance. In this work, we investigate effective attention mechanisms and propose a novel network unit, the “Graph Channel Attention” (GCA) block, which dynamically encourages communication across channels by explicitly modelling cross-channel interaction. The unit applies a graph channel attention mechanism in two steps: first, a position prior module captures position information from different spatial locations; second, a message exchange module lets channels exchange information. The proposed GCA block is efficient and can be easily plugged into existing deep neural networks. Extensive experiments show that it attains superior performance compared with other lightweight attention mechanisms on image classification, object detection, instance segmentation, and related tasks. In particular, it achieves a 4.12% top-1 accuracy improvement on ImageNet classification over a ResNet50 backbone. Detailed analyses further show that the proposed method significantly reduces redundancy in features and learns more diverse feature representations.
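The two-step mechanism described above can be sketched in NumPy. This is a minimal, hypothetical illustration based only on the abstract: the actual position prior and message exchange modules in the paper are learned layers, whereas here the position prior is approximated by a spatial softmax pooling, the channel graph by pairwise channel similarity, and the output by a sigmoid channel gate. All function and variable names are assumptions, not the authors' implementation.

```python
import numpy as np

def gca_block(x):
    """Hypothetical sketch of a Graph Channel Attention (GCA) block.

    x: feature map of shape (C, H, W).
    Step 1 (position prior, assumed form): weight spatial locations
    with a softmax over activations to pool each channel into one
    descriptor. Step 2 (message exchange, assumed form): let channels
    exchange information over a similarity graph, then gate the input.
    """
    C, H, W = x.shape
    flat = x.reshape(C, H * W)

    # Step 1: position prior -- per-channel softmax over spatial
    # positions, used as weights for attentive pooling.
    w = np.exp(flat - flat.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    desc = (flat * w).sum(axis=1)            # channel descriptors, (C,)

    # Step 2: message exchange -- build a channel graph from pairwise
    # similarity and propagate descriptors one round over it.
    sim = flat @ flat.T                      # (C, C) channel affinities
    adj = np.exp(sim - sim.max(axis=1, keepdims=True))
    adj /= adj.sum(axis=1, keepdims=True)    # row-normalised adjacency
    msg = adj @ desc                         # aggregated messages, (C,)

    # Gate the original features channel-wise with a sigmoid, so the
    # block is a drop-in residual-friendly reweighting of x.
    gate = 1.0 / (1.0 + np.exp(-msg))        # values in (0, 1)
    return x * gate[:, None, None]

out = gca_block(np.random.rand(8, 4, 4))
```

Because the block only rescales channels, its output has the same shape as its input, which is what makes this family of attention units easy to plug into existing backbones such as ResNet50.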

History

Email Address of Submitting Author

mingjuny1@student.unimelb.edu.au

Submitting Author's Institution

The University of Melbourne

Submitting Author's Country

  • China
