ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks

7 Apr 2020 | Qilong Wang, Banggu Wu, Pengfei Zhu, Peihua Li, Wangmeng Zuo, Qinghua Hu
This paper proposes an Efficient Channel Attention (ECA) module for deep convolutional neural networks (CNNs) that achieves significant performance improvements with minimal added model complexity. The ECA module avoids dimensionality reduction and captures local cross-channel interaction through a fast 1D convolution. It is far lighter than existing channel attention modules such as SENet: applied to ResNet-50, it adds only 80 parameters and 4.7e-4 GFLOPs against the backbone's 24.37M parameters and 3.86 GFLOPs, while improving Top-1 accuracy on ImageNet by 2.28%. The 1D convolution uses an adaptive kernel size determined from the channel dimension, so local cross-channel interactions are captured without manual tuning (a sketch of the mechanism follows). Evaluated on image classification, object detection, and instance segmentation, the module outperforms existing channel attention methods.
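A minimal PyTorch sketch of the mechanism described above, assuming the kernel-size rule k = |log2(C)/γ + b/γ| rounded to the nearest odd number with γ = 2, b = 1 as in the paper; class and variable names are illustrative, not the authors' reference code:

```python
import math
import torch
import torch.nn as nn

class ECA(nn.Module):
    """Efficient Channel Attention: global average pooling, a 1D convolution
    across channels, and a sigmoid gate that rescales each channel."""

    def __init__(self, channels: int, gamma: int = 2, b: int = 1):
        super().__init__()
        # Adaptive kernel size from the channel dimension, forced to be odd
        # so the 1D convolution can be symmetrically padded.
        t = int(abs(math.log2(channels) / gamma + b / gamma))
        k = t if t % 2 == 1 else t + 1
        self.avg_pool = nn.AdaptiveAvgPool2d(1)
        self.conv = nn.Conv1d(1, 1, kernel_size=k, padding=k // 2, bias=False)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, C, H, W)
        y = self.avg_pool(x)                   # (N, C, 1, 1): per-channel descriptor
        y = y.squeeze(-1).transpose(-1, -2)    # (N, 1, C): channels as a 1D sequence
        y = self.conv(y)                       # local cross-channel interaction
        y = y.transpose(-1, -2).unsqueeze(-1)  # back to (N, C, 1, 1)
        return x * self.sigmoid(y)             # channel-wise rescaling
```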
Because it reduces model complexity while maintaining high performance, the ECA module suits both deep CNNs and lightweight architectures such as MobileNetV2. It is a plug-and-play block that integrates into existing CNN architectures, and it yields consistent gains across multiple backbones, including ResNet-50, ResNet-101, ResNet-152, and MobileNetV2. Tests on object detection and instance segmentation confirm its effectiveness across applications: the module is both efficient and effective, achieving competitive performance at low complexity, which makes it especially attractive when computational resources are limited.
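To illustrate the plug-and-play claim, a hedged usage sketch using the ECA class above; the tensor shape mimics a ResNet-50 stage output and is chosen for illustration, not taken from the paper:

```python
# Gate the output of any convolutional stage with ECA; shapes are preserved.
feats = torch.randn(8, 256, 56, 56)  # e.g., a ResNet-50 stage output
eca = ECA(channels=256)              # adaptive rule resolves to k = 5 for C = 256
out = eca(feats)                     # same shape, channels rescaled by attention
assert out.shape == feats.shape
```

Because the module only rescales channels and keeps the tensor shape, it can be inserted after any convolutional block (for example, inside a residual branch before the skip addition) without changing the rest of the network.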