2 Mar 2018 | Mengye Ren, Eleni Triantafillou, Sachin Ravi, Jake Snell, Kevin Swersky, Joshua B. Tenenbaum, Hugo Larochelle, & Richard S. Zemel
This paper addresses the challenge of few-shot classification, particularly in semi-supervised settings where unlabeled examples are available. The authors propose extensions to Prototypical Networks (Snell et al., 2017) to incorporate these unlabeled examples, which can be either from the same classes as the labeled examples or from distractor classes. The goal is to improve the model's ability to leverage unlabeled data for better generalization. The paper evaluates these methods on the Omniglot and miniImageNet benchmarks, adapted to include unlabeled examples. Additionally, a new dataset, tieredImageNet, is introduced, which has a hierarchical structure and more realistic few-shot learning scenarios. The experiments show that the proposed models outperform baseline methods, demonstrating the effectiveness of semi-supervised meta-learning in few-shot classification.
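To make the core idea concrete, the following is a minimal sketch of how a Prototypical Network's class prototypes (the per-class means of labeled support embeddings) can be refined with unlabeled embeddings via one soft k-means step, as in the semi-supervised extension described above. The function name and array-based interface are illustrative assumptions, not the authors' code.

```python
import numpy as np

def refine_prototypes(support, support_labels, unlabeled, num_classes):
    """Refine class prototypes with unlabeled embeddings (one soft k-means step).

    support: (N, D) labeled support-set embeddings
    support_labels: (N,) integer class labels in [0, num_classes)
    unlabeled: (U, D) unlabeled embeddings
    Returns refined prototypes of shape (num_classes, D).
    """
    # Initial prototypes: per-class mean of labeled support embeddings.
    prototypes = np.stack([
        support[support_labels == c].mean(axis=0) for c in range(num_classes)
    ])
    # Soft-assign each unlabeled point to prototypes by softmax over
    # negative squared Euclidean distance.
    d = ((unlabeled[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)  # (U, C)
    w = np.exp(-d)
    w /= w.sum(axis=1, keepdims=True)
    # Refined prototype: weighted mean over labeled points (weight 1 each)
    # plus soft-assigned unlabeled points.
    counts = np.bincount(support_labels, minlength=num_classes)[:, None].astype(float)
    num = prototypes * counts + w.T @ unlabeled
    den = counts + w.sum(axis=0)[:, None]
    return num / den
```

In the distractor setting described above, the paper's variants additionally down-weight unlabeled points that do not belong to any episode class; the sketch omits that masking step for brevity.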