MnasNet: Platform-Aware Neural Architecture Search for Mobile

29 May 2019 | Mingxing Tan, Bo Chen, Ruoming Pang, Vijay Vasudevan, Mark Sandler, Andrew Howard, Quoc V. Le
The paper "MnasNet: Platform-Aware Neural Architecture Search for Mobile" addresses the challenge of designing convolutional neural networks (CNNs) that are small, fast, and accurate for mobile devices. It proposes an automated neural architecture search (NAS) approach, MnasNet, which explicitly incorporates model latency into the main objective to balance accuracy and latency. Unlike previous methods that use FLOPS as a proxy for latency, MnasNet directly measures real-world inference latency by executing models on mobile phones. To enhance layer diversity and maintain a balanced search space, the paper introduces a novel factorized hierarchical search space. Experimental results show that MnasNet outperforms state-of-the-art mobile CNN models on multiple vision tasks, achieving 75.2% top-1 accuracy with 78ms latency on a Pixel phone, which is 1.8× faster than MobileNetV2 and 2.3× faster than NASNet while maintaining higher accuracy. MnasNet also improves mAP quality in COCO object detection. The paper's main contributions include a multi-objective NAS approach, a novel search space, and state-of-the-art performance on ImageNet and COCO datasets.The paper "MnasNet: Platform-Aware Neural Architecture Search for Mobile" addresses the challenge of designing convolutional neural networks (CNNs) that are small, fast, and accurate for mobile devices. It proposes an automated neural architecture search (NAS) approach, MnasNet, which explicitly incorporates model latency into the main objective to balance accuracy and latency. Unlike previous methods that use FLOPS as a proxy for latency, MnasNet directly measures real-world inference latency by executing models on mobile phones. To enhance layer diversity and maintain a balanced search space, the paper introduces a novel factorized hierarchical search space. Experimental results show that MnasNet outperforms state-of-the-art mobile CNN models on multiple vision tasks, achieving 75.2% top-1 accuracy with 78ms latency on a Pixel phone, which is 1.8× faster than MobileNetV2 and 2.3× faster than NASNet while maintaining higher accuracy. MnasNet also improves mAP quality in COCO object detection. The paper's main contributions include a multi-objective NAS approach, a novel search space, and state-of-the-art performance on ImageNet and COCO datasets.