A novel lightweight real-time traffic sign detection method based on an embedded device and YOLOv8

26 January 2024 | Yuechen Luo, Yusheng Ci, Shixin Jiang, Xiaoli Wei
This paper proposes a novel lightweight real-time traffic sign detection method based on an embedded device and YOLOv8. The method integrates the Ghost module and the Efficient Multi-Scale Attention (EMA) module into YOLOv8 to increase computational speed while maintaining accuracy. The Raspberry Pi 4B is chosen as the detection device for its light weight and low power consumption. The model is trained on the CCTSDB dataset and tested on the Raspberry Pi 4B. The improved algorithm achieves a mAP of 93.5% on a poor-weather test set and 82.9% on the original test set, with an inference delay of 0.79 s. Compared with the original model, it improves accuracy by 6.4% and 3.8%, respectively, and reduces detection time by 0.12 s. This research supports the advancement of assisted and autonomous driving technologies.

Traffic sign recognition is a critical component of intelligent driving, enhancing safety by detecting traffic signs and providing real-time information about them. The paper reviews the evolution of traffic sign detection methods, from classical computer vision techniques to deep learning approaches, focusing in particular on YOLOv8. It highlights the challenges of traditional methods, such as environmental factors and internal conditions, and the benefits of deep learning in improving detection accuracy. It also discusses the importance of edge computing and lightweight networks for real-time applications, emphasizing the need for efficient and accurate traffic sign detection in autonomous driving systems.

The proposed method, YOLOv8-Ghost-EMA, is designed to detect small objects with high accuracy and speed. The research methodology, including the selection of the dataset and network structure, is detailed in Sect. 2, followed by experimental results and discussions in Sect. 3 and Sect. 4.
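To illustrate the kind of lightweight building block the paper relies on, here is a minimal PyTorch sketch of a Ghost convolution in the style of GhostNet: a standard convolution produces half of the output channels, and a cheap depthwise convolution derives the remaining "ghost" feature maps from them. This is a generic sketch of the Ghost-module idea, not the paper's exact layer configuration; the kernel sizes and the SiLU activation are assumptions chosen to match common YOLOv8-style blocks.

```python
import torch
import torch.nn as nn


class GhostConv(nn.Module):
    """Ghost convolution sketch: half the output channels come from a
    regular conv, the other half from a cheap depthwise conv applied to
    the first half, then both are concatenated along the channel dim."""

    def __init__(self, in_ch: int, out_ch: int, k: int = 1, s: int = 1):
        super().__init__()
        hidden = out_ch // 2  # primary branch produces half the channels
        self.primary = nn.Sequential(
            nn.Conv2d(in_ch, hidden, k, s, k // 2, bias=False),
            nn.BatchNorm2d(hidden),
            nn.SiLU(),  # activation is an assumption (YOLOv8 default)
        )
        # Depthwise 5x5 conv: cheap "ghost" feature maps, one per channel.
        self.cheap = nn.Sequential(
            nn.Conv2d(hidden, hidden, 5, 1, 2, groups=hidden, bias=False),
            nn.BatchNorm2d(hidden),
            nn.SiLU(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = self.primary(x)
        return torch.cat([y, self.cheap(y)], dim=1)


# A 1x1 GhostConv from 64 to 128 channels keeps spatial size unchanged.
x = torch.randn(1, 64, 80, 80)
print(GhostConv(64, 128)(x).shape)  # torch.Size([1, 128, 80, 80])
```

The savings come from the depthwise branch: it uses one 5x5 filter per channel instead of a full dense convolution, which is why replacing standard convolutions with Ghost modules reduces parameters and FLOPs on devices like the Raspberry Pi 4B.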