This paper proposes a Single Path One-Shot Neural Architecture Search (NAS) method that addresses the training and search inefficiencies of existing one-shot NAS approaches. The key idea is to construct a simplified supernet in which every candidate architecture is a single path, which mitigates the weight co-adaptation problem of conventional supernets. The supernet is trained by uniform path sampling: each optimization step draws one path uniformly at random and updates only its weights, so all architectures and their weights are trained fully and equally. After supernet training, an evolutionary algorithm performs the architecture search using the inherited weights, which makes the search efficient and flexible and allows multiple architectures to be found under different real-world constraints, such as latency budgets and mixed-precision quantization.

Comprehensive experiments show that the method achieves state-of-the-art performance on ImageNet, combining high accuracy with low memory consumption and fast training and search. Compared with existing NAS approaches, it is superior in accuracy, memory usage, and search efficiency, demonstrating its effectiveness across a range of NAS tasks.
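The training phase is straightforward to sketch. Below is a minimal, hypothetical PyTorch illustration of uniform single-path sampling: each supernet layer holds several candidate operations, one choice per layer is drawn uniformly at random at every step, and only that path runs in the forward and backward pass, keeping memory at single-model level. The choice-block design (convolutions with different kernel sizes), network depth, and all hyperparameters are illustrative assumptions, not the paper's actual ImageNet search space or training setup.

```python
# Sketch of single-path one-shot supernet training with uniform path
# sampling. Search space and hyperparameters are illustrative only.
import random
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChoiceBlock(nn.Module):
    """One supernet layer holding several candidate operations."""
    def __init__(self, channels, kernel_sizes=(3, 5, 7)):
        super().__init__()
        self.ops = nn.ModuleList(
            nn.Sequential(
                nn.Conv2d(channels, channels, k, padding=k // 2, bias=False),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
            )
            for k in kernel_sizes
        )

    def forward(self, x, choice):
        # Only the sampled candidate runs; the others do no compute
        # and receive no gradient this step.
        return self.ops[choice](x)

class Supernet(nn.Module):
    def __init__(self, num_blocks=4, channels=16, num_classes=10):
        super().__init__()
        self.stem = nn.Conv2d(3, channels, 3, padding=1)
        self.blocks = nn.ModuleList(ChoiceBlock(channels) for _ in range(num_blocks))
        self.head = nn.Linear(channels, num_classes)

    def forward(self, x, path):
        x = self.stem(x)
        for block, choice in zip(self.blocks, path):
            x = block(x, choice)
        x = x.mean(dim=(2, 3))  # global average pooling
        return self.head(x)

def sample_path(net):
    # Uniform sampling: every candidate in every block is equally likely.
    return [random.randrange(len(b.ops)) for b in net.blocks]

net = Supernet()
opt = torch.optim.SGD(net.parameters(), lr=0.1, momentum=0.9)
for step in range(100):                  # toy loop on random data
    x = torch.randn(8, 3, 32, 32)
    y = torch.randint(0, 10, (8,))
    path = sample_path(net)              # one single-path architecture
    loss = F.cross_entropy(net(x, path), y)
    opt.zero_grad()
    loss.backward()   # gradient flows only through stem, head, and sampled ops
    opt.step()
```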
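The search phase can be sketched similarly. The following hypothetical Python sketch shows a constraint-aware evolutionary search over paths, assuming an `evaluate` function that scores a path by validation accuracy using the inherited supernet weights, and a `cost` model (e.g. FLOPs or measured latency) against a budget; candidates that violate the budget are rejected and resampled. The population size, mutation rate, and selection scheme are illustrative assumptions, not the paper's exact settings.

```python
# Sketch of constraint-aware evolutionary architecture search over
# single paths. `evaluate` and `cost` are assumed callbacks.
import random

def evolutionary_search(num_blocks, num_choices, evaluate, cost, budget,
                        pop_size=48, generations=20, mutate_prob=0.1):
    def random_path():
        # Rejection sampling enforces the hard constraint cost(p) <= budget.
        while True:
            p = [random.randrange(num_choices) for _ in range(num_blocks)]
            if cost(p) <= budget:
                return p

    def mutate(p):
        child = [random.randrange(num_choices) if random.random() < mutate_prob
                 else g for g in p]
        return child if cost(child) <= budget else random_path()

    def crossover(a, b):
        child = [random.choice(pair) for pair in zip(a, b)]
        return child if cost(child) <= budget else random_path()

    # Fitness is validation accuracy with inherited supernet weights:
    # no retraining per candidate, which is what makes repeated searches
    # under new budgets cheap.
    population = [random_path() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=evaluate, reverse=True)
        parents = population[:pop_size // 2]
        children = [mutate(random.choice(parents)) for _ in range(pop_size // 4)]
        children += [crossover(random.choice(parents), random.choice(parents))
                     for _ in range(pop_size // 4)]
        population = parents + children
    return max(population, key=evaluate)

# Toy usage with stand-in scoring and cost functions (placeholders for
# supernet validation accuracy and a latency/FLOPs model):
best = evolutionary_search(
    num_blocks=4, num_choices=3,
    evaluate=lambda p: -sum(p),
    cost=lambda p: sum(p), budget=6)
```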