METACoCo: A NEW FEW-SHOT CLASSIFICATION BENCHMARK WITH SPURIOUS CORRELATION

30 Apr 2024 | Min Zhang, Haoxuan Li, Fei Wu, Kun Kuang
MetaCoCo is a new benchmark for few-shot classification (FSC) that focuses on spurious-correlation shifts, which occur when novel classes differ from base classes in ways that are not captured by the training data. The benchmark addresses the limitations of existing benchmarks, such as Meta-Dataset and BSCD-FSL, which primarily focus on cross-domain shifts. MetaCoCo comprises 175,637 images spanning 100 classes and 155 contexts, and it introduces spurious correlations by labeling each image with both its main concept and its specific context. The benchmark is designed to evaluate a broad range of FSC methods, including fine-tuning-based, metric-based, and meta-based approaches, as well as self-supervised learning methods. Experiments show that existing methods struggle with spurious-correlation shifts, highlighting the need for more robust models. The benchmark and all associated code are open-sourced to facilitate future research in this area.
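To make the dual-annotation idea concrete, the sketch below shows how images labeled with both a concept and a context can be assembled into a few-shot episode exhibiting a spurious-correlation shift: each class's support examples share one context while its query examples are drawn from held-out contexts. The data records, function names, and sampling logic here are illustrative assumptions, not MetaCoCo's official API.

```python
import random

# Hypothetical records: each image carries a concept (class) label and a
# context label, mirroring MetaCoCo's dual annotation scheme.
images = [
    {"id": f"img{i}", "concept": c, "context": ctx}
    for i, (c, ctx) in enumerate(
        (c, ctx)
        for c in ["cat", "dog", "bird"]
        for ctx in ["grass", "snow", "water"]
        for _ in range(5)  # 5 images per (concept, context) pair
    )
]

def sample_episode(images, n_way=2, k_shot=1, n_query=2, seed=0):
    """Sample an N-way episode whose support and query sets use disjoint
    contexts per class, inducing a spurious-correlation shift at test time."""
    rng = random.Random(seed)
    concepts = rng.sample(sorted({im["concept"] for im in images}), n_way)
    support, query = [], []
    for c in concepts:
        pool = [im for im in images if im["concept"] == c]
        contexts = sorted({im["context"] for im in pool})
        sup_ctx = rng.choice(contexts)  # support set: one fixed context
        support += rng.sample(
            [im for im in pool if im["context"] == sup_ctx], k_shot
        )
        # query set: only images from the remaining, unseen contexts
        query += rng.sample(
            [im for im in pool if im["context"] != sup_ctx], n_query
        )
    return support, query

support, query = sample_episode(images)
```

A model that latches onto the support context (e.g., "snow") rather than the concept ("dog") will misclassify queries drawn from other contexts, which is precisely the failure mode such a benchmark is built to expose.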