04 January 2024 | Chenghao Lu, Emmanuel Nnadziozie, Moritz Paul Camenzind, Yuncai Hu and Kang Yu
This study presents a practical workflow for detecting maize plants in UAV-based RGB imagery with YOLOv5. To reduce labeling effort, UAV images are annotated with a semi-automatic labeling approach built on the Segment Anything Model (SAM). The trained YOLOv5 model achieved a mean average precision (mAP@0.5) of 0.828 and 0.863 at the 3-leaf and 7-leaf stages, respectively, and performed well under challenging conditions such as overgrown weeds, leaf occlusion, and blurry images. Introducing image-rotation augmentation and low-noise weights improved accuracy, raising mAP@0.5 by 0.024 and 0.016, respectively. A comparison of models trained with rotation augmentation and with low-noise datasets showed that rotation-based data augmentation significantly improved performance, particularly at the 3-leaf stage. The study highlights the potential of YOLOv5 for plant detection under real-world conditions, discusses the labeling challenge that the SAM-based semi-automatic method addresses, and notes that further work is needed to improve model accuracy and efficiency. The results demonstrate that YOLOv5-based maize detection models can be deployed on UAVs and other IoT devices for real-time plant monitoring.
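As a rough illustration of the semi-automatic labeling step, the sketch below converts binary plant masks, such as those found in the "segmentation" field returned by SAM's automatic mask generator, into YOLO-format bounding-box label lines. The function name, class index, and normalization details are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def masks_to_yolo_labels(masks, img_w, img_h, class_id=0):
    """Convert binary plant masks (H x W boolean arrays) into YOLO-format
    label lines: 'class x_center y_center width height', all normalized to [0, 1].
    This is a sketch of one possible SAM-to-YOLO conversion, not the paper's code."""
    lines = []
    for mask in masks:
        ys, xs = np.nonzero(mask)
        if xs.size == 0:  # skip empty masks
            continue
        x_min, x_max = xs.min(), xs.max()
        y_min, y_max = ys.min(), ys.max()
        x_c = (x_min + x_max) / 2.0 / img_w      # normalized box center x
        y_c = (y_min + y_max) / 2.0 / img_h      # normalized box center y
        w = (x_max - x_min + 1) / img_w          # normalized box width
        h = (y_max - y_min + 1) / img_h          # normalized box height
        lines.append(f"{class_id} {x_c:.6f} {y_c:.6f} {w:.6f} {h:.6f}")
    return lines

# Hypothetical usage: write one label file per UAV image tile (paths are placeholders).
# with open("tile_0001.txt", "w") as f:
#     f.write("\n".join(masks_to_yolo_labels(masks, img_w=640, img_h=640)))
```

Boxes produced this way can then be reviewed and corrected manually before YOLOv5 training, which is the sense in which the labeling is "semi-automatic".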