Meta-Transfer Learning for Few-Shot Learning

9 Apr 2019 | Qianru Sun, Yaoyao Liu, Tat-Seng Chua, Bernt Schiele
This paper introduces a novel few-shot learning method called Meta-Transfer Learning (MTL), which leverages deep neural networks (DNNs) for few-shot tasks. Because training a DNN from scratch on few samples overfits, prior meta-learning methods typically resort to shallow networks; MTL instead keeps a pre-trained DNN's weights frozen and meta-learns lightweight per-task scaling and shifting operations on those weights. MTL is trained with a Hard Task (HT) meta-batch scheme, which dynamically resamples harder tasks from past failure cases to improve learning efficiency. Extensive experiments on the miniImageNet and Fewshot-CIFAR100 datasets demonstrate that MTL combined with HT meta-batch achieves state-of-the-art few-shot learning performance. Ablation studies confirm that both MTL and HT meta-batch contribute to fast convergence and high accuracy.
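The two core ideas, scaling/shifting frozen weights and resampling hard tasks, can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the function names, the plain-list weight representation, and the accuracy-dictionary interface for task selection are assumptions made here for clarity.

```python
def scale_and_shift(frozen_weight, scale, shift):
    """MTL-style adaptation sketch: W' = W * scale + shift.

    The pre-trained weight matrix stays frozen; only the small
    per-output-channel `scale` and `shift` vectors are meta-learned.
    `frozen_weight` is a list of rows (one row per output channel).
    """
    return [
        [w * scale[i] + shift[i] for w in row]
        for i, row in enumerate(frozen_weight)
    ]


def pick_hard_tasks(task_accuracies, k):
    """HT meta-batch sketch: choose the k tasks (here keyed by class
    name) with the lowest recorded accuracy from past meta-batches,
    so subsequent meta-batches focus on failure cases.
    """
    ranked = sorted(task_accuracies.items(), key=lambda kv: kv[1])
    return [task for task, _ in ranked[:k]]
```

For example, `scale_and_shift([[1.0, 2.0], [3.0, 4.0]], [2.0, 3.0], [1.0, -1.0])` applies per-row affine adaptation without modifying the frozen matrix, and `pick_hard_tasks({"cat": 0.9, "dog": 0.4, "bird": 0.6}, 2)` selects the two lowest-accuracy classes for resampling.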