BAM: Bottleneck Attention Module

2018 | Jongchan Park, Sanghyun Woo, Joon-Young Lee, In So Kweon
The paper introduces the Bottleneck Attention Module (BAM), a lightweight and effective attention mechanism designed to enhance the representational power of deep neural networks. BAM can be inserted into any feed-forward convolutional neural network (CNN) at the bottlenecks where feature maps are downsampled. It infers an attention map along two separate pathways, channel and spatial, allowing the network to focus on the important elements of a feature map. The module is trained end-to-end with the host network and adds only minimal parameter and computational overhead. Extensive experiments on CIFAR-100, ImageNet-1K, VOC 2007, and MS COCO demonstrate consistent improvements in classification and detection performance. Ablation studies validate the design of BAM, and comparisons with other attention mechanisms show that it outperforms them with fewer parameters. The paper concludes by highlighting the hierarchical reasoning process observed in BAM, which parallels human perception, and suggests that BAM can be a valuable tool for improving performance across a range of vision tasks.
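
To make the two-branch design concrete, below is a minimal PyTorch sketch of the module as described: a channel branch (global average pooling followed by a bottleneck MLP with reduction ratio r) and a spatial branch (dilated 3x3 convolutions with dilation d) are summed, squashed with a sigmoid, and applied to the input as a residual refinement, F' = F + F * sigmoid(Mc(F) + Ms(F)). The hyperparameter defaults (r = 16, d = 4) follow the paper, but the class and layer layout here are an illustrative reconstruction, not the authors' reference code.

```python
import torch
import torch.nn as nn

class BAM(nn.Module):
    """Illustrative sketch of the Bottleneck Attention Module.

    Channel and spatial attention are computed in parallel, summed,
    passed through a sigmoid, and used to residually refine the input:
        F' = F + F * sigmoid(Mc(F) + Ms(F))
    """
    def __init__(self, channels, reduction=16, dilation=4):
        super().__init__()
        # Channel branch: global average pool -> bottleneck MLP,
        # producing a per-channel attention vector of shape (N, C, 1, 1).
        self.channel_branch = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.BatchNorm2d(channels // reduction),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
        )
        # Spatial branch: 1x1 reduction, two dilated 3x3 convs for a
        # large receptive field, then 1x1 to a single map (N, 1, H, W).
        self.spatial_branch = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.BatchNorm2d(channels // reduction),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels // reduction,
                      kernel_size=3, padding=dilation, dilation=dilation),
            nn.BatchNorm2d(channels // reduction),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels // reduction,
                      kernel_size=3, padding=dilation, dilation=dilation),
            nn.BatchNorm2d(channels // reduction),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, 1, kernel_size=1),
        )

    def forward(self, x):
        # The two branch outputs broadcast to (N, C, H, W) when summed.
        attn = torch.sigmoid(self.channel_branch(x) + self.spatial_branch(x))
        # Residual multiply-add: refine features without blocking gradients.
        return x + x * attn
```

In use, one such module would be placed at each downsampling bottleneck of the backbone (e.g., between the stages of a ResNet), so the network builds up its attention hierarchically as the paper observes.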