This paper revisits the one-shot Neural Architecture Search (NAS) paradigm and addresses the challenges of supernet training and search effectiveness on large datasets such as ImageNet. The authors propose a Single Path One-Shot model that constructs a simplified supernet of single-path architectures, alleviating the weight co-adaptation problem of earlier one-shot approaches. The supernet is trained with uniform path sampling: at each step, a single path is sampled uniformly at random and only its weights are updated, so that all candidate architectures (and their weights) are trained fully and equally. The approach is flexible and efficient, supporting complex search spaces and different constraints such as FLOPs or latency budgets. Comprehensive experiments on ImageNet demonstrate that the proposed method achieves state-of-the-art results in accuracy, memory consumption, training time, and architecture search efficiency. The paper also introduces novel choice blocks for channel number search and mixed-precision quantization search, and applies an evolutionary algorithm to search the trained supernet for architectures that satisfy the given constraints. The method is shown to be superior to previous approaches in simplicity, efficiency, and flexibility.
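The supernet training procedure the summary describes is simple enough to sketch. Below is a minimal PyTorch illustration, not the authors' code: the names ChoiceBlock, SinglePathSupernet, and train_step are hypothetical, and the blocks are left abstract. It shows the single-path idea (exactly one candidate operation per block is active in a forward pass) and uniform path sampling (each training step samples one architecture uniformly at random and updates only the weights on that path).

```python
import random
import torch.nn as nn

class ChoiceBlock(nn.Module):
    """Supernet block holding several candidate operations; exactly one
    candidate is active per forward pass (single path). Hypothetical sketch."""
    def __init__(self, choices):
        super().__init__()
        self.choices = nn.ModuleList(choices)

    def forward(self, x, idx):
        # Only the chosen candidate runs; the others receive no gradient.
        return self.choices[idx](x)

class SinglePathSupernet(nn.Module):
    """Stack of choice blocks; an architecture is a list of choice indices."""
    def __init__(self, blocks):
        super().__init__()
        self.blocks = nn.ModuleList(blocks)

    def sample_architecture(self):
        # Uniform path sampling: every candidate in every block is equally likely.
        return [random.randrange(len(b.choices)) for b in self.blocks]

    def forward(self, x, arch):
        for block, idx in zip(self.blocks, arch):
            x = block(x, idx)
        return x

def train_step(supernet, optimizer, criterion, images, labels):
    # One supernet update: sample one path uniformly, update only its weights.
    arch = supernet.sample_architecture()
    optimizer.zero_grad()
    loss = criterion(supernet(images, arch), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because only one path is instantiated per step, memory cost stays close to that of training a single network, which is consistent with the memory-consumption advantage claimed in the paper.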
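The second stage, evolutionary search over the trained supernet, can be sketched in the same spirit. The helper names (evaluate_acc, meets_constraint) and the specific mutation/crossover scheme below are assumptions for illustration; what follows the paper is that each candidate inherits its weights from the supernet and is evaluated without any retraining, and that hard constraints (e.g., a FLOPs or latency budget) are enforced by rejecting violating candidates.

```python
import random

def evolutionary_search(supernet, evaluate_acc, meets_constraint,
                        population_size=50, generations=20,
                        num_mutation=25, num_crossover=25, top_k=10):
    """Hypothetical sketch of evolutionary architecture search.
    evaluate_acc(arch): scores an architecture using weights inherited
    from the trained supernet (e.g., validation accuracy).
    meets_constraint(arch): checks a budget such as FLOPs or latency."""
    def random_valid():
        while True:
            arch = supernet.sample_architecture()
            if meets_constraint(arch):
                return arch

    def mutate(arch, prob=0.1):
        # Resample each choice independently with small probability.
        return [random.randrange(len(b.choices)) if random.random() < prob else g
                for g, b in zip(arch, supernet.blocks)]

    def crossover(a, b):
        # Pick each block's choice from one of the two parents.
        return [random.choice(pair) for pair in zip(a, b)]

    population = [random_valid() for _ in range(population_size)]
    for _ in range(generations):
        # Keep the best candidates as parents; no retraining is needed
        # because weights come from the supernet.
        population.sort(key=evaluate_acc, reverse=True)
        parents = population[:top_k]
        children = []
        while len(children) < num_mutation:
            child = mutate(random.choice(parents))
            if meets_constraint(child):
                children.append(child)
        while len(children) < num_mutation + num_crossover:
            child = crossover(random.choice(parents), random.choice(parents))
            if meets_constraint(child):
                children.append(child)
        population = parents + children
    return max(population, key=evaluate_acc)
```

Decoupling supernet training from this search step is what makes it cheap to re-run the search under a different constraint without retraining anything, which is the flexibility the summary highlights.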