META-LEARNING WITH LATENT EMBEDDING OPTIMIZATION

26 Mar 2019 | Andrei A. Rusu, Dushyant Rao, Jakub Sygnowski, Oriol Vinyals, Razvan Pascanu, Simon Osindero & Raia Hadsell
The paper introduces Latent Embedding Optimization (LEO), a meta-learning technique that learns a low-dimensional latent embedding of model parameters and performs gradient-based meta-learning in this space. This approach decouples the gradient-based adaptation procedure from the high-dimensional space of model parameters, addressing the practical difficulties of existing meta-learning techniques in high-dimensional parameter spaces with limited data. LEO is evaluated on the competitive *miniImageNet* and *tieredImageNet* few-shot classification tasks, achieving state-of-the-art performance. The method captures uncertainty in the data and performs adaptation more effectively by optimizing in the latent space. The paper also includes an ablation study and analysis showing that both conditional parameter generation and optimization in latent space are critical for the success of LEO.
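To make the core idea concrete, below is a minimal PyTorch sketch of inner-loop adaptation in a latent space: a support set is encoded into a low-dimensional code `z`, gradient steps are taken on `z` rather than on the classifier weights, and a decoder maps the adapted code back into parameters. This is an illustrative simplification, not the paper's architecture; the dimensions, the simple linear encoder/decoder, and the helper `adapt_in_latent_space` are assumptions for the example (LEO itself uses a relation network, a stochastic encoder, and per-class parameter generation).

```python
import torch
import torch.nn.functional as F

# Hypothetical dimensions for the example (not taken from the paper).
FEAT_DIM, LATENT_DIM, N_CLASSES = 64, 16, 5

# Encoder: maps pooled support features to a per-class latent code z.
encoder = torch.nn.Linear(FEAT_DIM, LATENT_DIM)
# Decoder: maps a latent code to the classifier weights for one class.
decoder = torch.nn.Linear(LATENT_DIM, FEAT_DIM)

def adapt_in_latent_space(support_x, support_y, inner_steps=5, inner_lr=1.0):
    """Inner loop: take gradient steps on z (low-dimensional), not on the weights."""
    # Initialize z by encoding the mean support feature of each class.
    class_means = torch.stack([support_x[support_y == c].mean(0) for c in range(N_CLASSES)])
    z = encoder(class_means)                       # (N_CLASSES, LATENT_DIM)
    for _ in range(inner_steps):
        w = decoder(z)                             # decode z into classifier weights
        logits = support_x @ w.t()                 # linear classifier on the support set
        loss = F.cross_entropy(logits, support_y)
        (grad_z,) = torch.autograd.grad(loss, z, create_graph=True)
        z = z - inner_lr * grad_z                  # gradient step in latent space
    return decoder(z)                              # task-specific weights after adaptation

# Usage on random tensors standing in for one 5-way, 5-shot task.
support_x = torch.randn(25, FEAT_DIM)
support_y = torch.arange(N_CLASSES).repeat_interleave(5)
query_x = torch.randn(75, FEAT_DIM)
query_y = torch.arange(N_CLASSES).repeat_interleave(15)

task_weights = adapt_in_latent_space(support_x, support_y)
meta_loss = F.cross_entropy(query_x @ task_weights.t(), query_y)
meta_loss.backward()   # outer-loop gradients flow back into the encoder and decoder
```

The point of the sketch is the decoupling the summary describes: the inner loop only ever differentiates with respect to the low-dimensional code `z`, while the high-dimensional classifier weights are produced by the decoder and updated only indirectly through the outer-loop (meta) gradient.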
Understanding Meta-Learning with Latent Embedding Optimization