Regularized Evolution for Image Classifier Architecture Search

16 Feb 2019 | Esteban Real†‡, Alok Aggarwal†, Yanping Huang†, and Quoc V. Le
This paper presents a novel approach to discovering image classifier architectures using evolutionary algorithms. The authors introduce a modified tournament-selection evolutionary algorithm called "aging evolution," which adds an age property to the population so that younger genotypes are favored. This method is applied to the NASNet search space, a space of image classifiers with a fixed outer structure of Inception-like modules. The evolved architecture, named AmoebaNet-A, surpasses hand-designed models in accuracy and sets a new state-of-the-art accuracy of 83.9% top-1 and 96.6% top-5 on the ImageNet dataset. The paper also compares the evolutionary approach with reinforcement learning (RL) and random search, showing that evolution can achieve similar results with faster convergence, especially in resource-constrained scenarios.
The authors further explore the benefits of aging evolution, suggesting that it helps navigate training noise and introduces regularization effects. The paper concludes by discussing future directions, including the analysis of architecture search experiments and the discovery of broader neural network design patterns.
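The aging-evolution loop described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the population is a queue, each cycle samples a tournament, mutates the most accurate sampled model, and, crucially, discards the *oldest* member rather than the worst one. The helper callables `random_architecture`, `mutate`, and `evaluate` are hypothetical placeholders for a real search space and training pipeline.

```python
import collections
import random

def aging_evolution(random_architecture, mutate, evaluate,
                    population_size=20, sample_size=5, cycles=100):
    """Sketch of aging (regularized) evolution.

    Models are (architecture, accuracy) pairs. Removing the oldest
    model each cycle, instead of the least accurate, is what gives
    the algorithm its regularization effect.
    """
    population = collections.deque()
    history = []

    # Seed the population with randomly generated architectures.
    while len(population) < population_size:
        arch = random_architecture()
        model = (arch, evaluate(arch))
        population.append(model)
        history.append(model)

    # Evolution: tournament select, mutate the winner, age out the oldest.
    while len(history) < cycles:
        sample = random.sample(list(population), sample_size)
        parent = max(sample, key=lambda m: m[1])   # highest accuracy wins
        child_arch = mutate(parent[0])
        child = (child_arch, evaluate(child_arch))
        population.append(child)
        history.append(child)
        population.popleft()                       # remove the oldest model

    return max(history, key=lambda m: m[1])
```

In a real NAS experiment `evaluate` would train a child network and report validation accuracy; here any scalar fitness works, which makes the selection-versus-aging mechanics easy to test on toy problems.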