ResNeSt: Split-Attention Networks

30 Dec 2020 | Hang Zhang, Chongruo Wu, Zhongyue Zhang, Yi Zhu, Haibin Lin, Zhi Zhang, Yue Sun, Tong He, Jonas Mueller, R. Manmatha, Mu Li, Alexander Smola
The paper introduces ResNeSt, a modularized architecture that combines channel-wise attention with multi-path representation to capture cross-feature-map interactions and learn more diverse representations. The design yields a simple, unified computation unit, the Split-Attention block, which is parameterized by only a few variables: the radix (the number of splits within a cardinal group) and the cardinality (the number of feature-map groups). ResNeSt achieves better accuracy-latency trade-offs than EfficientNet on image classification and transfers well to downstream tasks, improving results on object detection, instance segmentation, and semantic segmentation benchmarks. The source code and pre-trained models are publicly available.

The paper also surveys related work on CNN architectures, multi-path and feature-map attention, and neural architecture search; details the Split-Attention block's implementation and compares it with existing attention methods such as SE-Net and SK-Net; and describes the training strategies and ablation studies, presenting results on ImageNet and several transfer-learning benchmarks that demonstrate the effectiveness of ResNeSt.