Decoupling Representation and Classifier for Long-Tailed Recognition


19 Feb 2020 | Bingyi Kang, Saining Xie, Marcus Rohrbach, Zhicheng Yan, Albert Gordo, Jiashi Feng, Yannis Kalantidis
This paper presents a novel approach to long-tailed recognition by decoupling representation learning from classifier learning. Long-tailed distributions in real-world data pose significant challenges for deep learning models, as class imbalance can lead to poor performance on rare classes. Existing methods typically rely on class-balancing strategies such as loss re-weighting, data re-sampling, or transfer learning from head to tail classes, but these approaches learn representations and classifiers jointly. In this work, we decouple the learning procedure into representation learning and classification, and systematically explore how different balancing strategies affect each stage of long-tailed recognition. Our findings reveal that data imbalance may not be a significant issue in learning high-quality representations, and that simple instance-balanced sampling can yield strong long-tailed recognition performance when combined with appropriate classifier adjustments. Extensive experiments on common long-tailed benchmarks (ImageNet-LT, Places-LT, and iNaturalist) show that our approach outperforms carefully designed losses, sampling strategies, and complex modules with memory. Our code is available at https://github.com/facebookresearch/classifier-balancing.

The paper also examines various sampling strategies and classifier learning methods and evaluates their effectiveness for long-tailed recognition. The results show that decoupling representation and classification yields significant improvements, particularly on rare classes, and offers a simple, effective solution that requires neither complex memory modules nor carefully designed losses.
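For concreteness, the decoupled recipe amounts to a two-stage training loop: first learn the representation with plain instance-balanced sampling, then freeze it and re-train only the linear classifier with class-balanced sampling (the paper's classifier re-training, "cRT"). Below is a minimal PyTorch sketch of that idea, not the authors' implementation: the dataset handle `train_set` (any torchvision-style dataset exposing a `.targets` list, e.g., ImageFolder), the ResNet-50 backbone, and all hyperparameters are illustrative assumptions.

```python
# Minimal two-stage sketch of the decoupled recipe (classifier re-training, "cRT").
# Dataset, model, and hyperparameters are placeholders, not the paper's exact setup.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, WeightedRandomSampler
import torchvision

def class_balanced_sampler(targets, num_classes):
    # Weight each sample inversely to its class frequency so that every
    # class is drawn with (roughly) equal probability.
    targets = torch.as_tensor(targets)
    counts = torch.bincount(targets, minlength=num_classes).float()
    weights = 1.0 / counts[targets]
    return WeightedRandomSampler(weights, num_samples=len(targets), replacement=True)

def train(model, loader, params, epochs, lr):
    # Plain SGD + cross-entropy loop; for simplicity this sketch ignores
    # schedules and batch-norm running-stat handling in stage 2.
    opt = torch.optim.SGD(params, lr=lr, momentum=0.9)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()

num_classes = 1000  # e.g., ImageNet-LT
model = torchvision.models.resnet50(num_classes=num_classes)

# Stage 1: representation learning with instance-balanced sampling
# (every image equally likely), training backbone and classifier jointly.
stage1_loader = DataLoader(train_set, batch_size=256, shuffle=True)
train(model, stage1_loader, model.parameters(), epochs=90, lr=0.1)

# Stage 2: freeze the representation, re-initialize the linear classifier,
# and re-train it alone with class-balanced sampling.
for p in model.parameters():
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, num_classes)  # fresh classifier
stage2_loader = DataLoader(
    train_set, batch_size=256,
    sampler=class_balanced_sampler(train_set.targets, num_classes))
train(model, stage2_loader, model.fc.parameters(), epochs=10, lr=0.1)
```

The key design choice is that only the sampling strategy changes between the two stages: instance-balanced sampling for the representation, class-balanced sampling for the classifier.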
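Among the classifier adjustments studied in the paper is τ-normalization, which rescales each class's weight vector w_i to w_i / ||w_i||^τ after standard joint training, with τ chosen on a validation set. A minimal sketch follows; the τ value here is a placeholder, and `model.fc` stands for the linear classifier head of the jointly trained model from the previous sketch.

```python
import torch

def tau_normalize(weight: torch.Tensor, tau: float) -> torch.Tensor:
    """Rescale each class weight vector w_i to w_i / ||w_i||^tau.

    tau = 0 leaves the classifier unchanged; tau = 1 normalizes every
    class weight vector to unit norm.
    """
    norms = weight.norm(p=2, dim=1, keepdim=True)
    return weight / norms.pow(tau)

# Applied post hoc to a jointly trained model, with no extra training:
with torch.no_grad():
    model.fc.weight.copy_(tau_normalize(model.fc.weight, tau=0.7))
```

Because this adjustment needs no re-training at all, it illustrates the paper's central point: with a good instance-balanced representation, rebalancing the classifier alone can recover strong tail-class performance.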