Chenghao Lu , Klaus Gehring , Stefan Kopfinger , Heinz Bernhardt , Michael Beck , Simon Walther , Thomas Ebertseder , Mirjana Minceva , Yuncai Hu , Kang Yu
{"title":"基于深度学习的无人机正正交图像杂草实例分割","authors":"Chenghao Lu , Klaus Gehring , Stefan Kopfinger , Heinz Bernhardt , Michael Beck , Simon Walther , Thomas Ebertseder , Mirjana Minceva , Yuncai Hu , Kang Yu","doi":"10.1016/j.atech.2025.100966","DOIUrl":null,"url":null,"abstract":"<div><div>Weeds significantly impact agricultural production, and traditional weed control methods often harm soil health and environment. This study aimed to develop deep learning-based segmentation models in identifying weeds in potato fields captured by Unmanned Aerial Vehicle (UAV<em>)</em> orthophotos and to explore the effects of weeds on potato yield. Previous studies predominantly employed U-Net for weed segmentation, but its performance often declines under complex field environments and low-image resolution conditions. Some studies attempted to overcome this limitation by reducing flight altitude or using high-cost cameras, but these approaches are not always practical. To address these challenges, this study uniquely integrated Real-ESRGAN Super-Resolution (SR) for UAV image enhancement and the Segment Anything Model (SAM) for semi-automatic annotation. Subsequently, we trained the YOLOv8 and Mask R-CNN models for segmentation. Results showed that the detection accuracy mAP50 scores were 0.902 and 0.920 for YOLOv8 and Mask R-CNN, respectively. Real-ESRGAN reconstruction slightly improved accuracy. When multiple weed types were present, accuracy generally decreased. The YOLOv8 model characterized plant and weed coverage areas could explained 41.2 % of potato yield variations (R<sup>2</sup> = 0.412, p-value = 0.01), underscoring the practical utility of UAV-based segmentation for yield estimation. Both YOLOv8 and Mask R-CNN achieved high accuracy, with YOLOv8 converging faster. 
While different nitrogen fertilizer treatments had no significant effect on yield, weed control treatments significantly impacted yield, highlighting the importance of precise weed mapping for spot-specific weed management. This study provides insights into weed segmentation using Deep Leaning and contributes to environmentally friendly precision weed control.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"11 ","pages":"Article 100966"},"PeriodicalIF":6.3000,"publicationDate":"2025-04-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Weed instance segmentation from UAV Orthomosaic Images based on Deep Learning\",\"authors\":\"Chenghao Lu , Klaus Gehring , Stefan Kopfinger , Heinz Bernhardt , Michael Beck , Simon Walther , Thomas Ebertseder , Mirjana Minceva , Yuncai Hu , Kang Yu\",\"doi\":\"10.1016/j.atech.2025.100966\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Weeds significantly impact agricultural production, and traditional weed control methods often harm soil health and environment. This study aimed to develop deep learning-based segmentation models in identifying weeds in potato fields captured by Unmanned Aerial Vehicle (UAV<em>)</em> orthophotos and to explore the effects of weeds on potato yield. Previous studies predominantly employed U-Net for weed segmentation, but its performance often declines under complex field environments and low-image resolution conditions. Some studies attempted to overcome this limitation by reducing flight altitude or using high-cost cameras, but these approaches are not always practical. To address these challenges, this study uniquely integrated Real-ESRGAN Super-Resolution (SR) for UAV image enhancement and the Segment Anything Model (SAM) for semi-automatic annotation. Subsequently, we trained the YOLOv8 and Mask R-CNN models for segmentation. 
Results showed that the detection accuracy mAP50 scores were 0.902 and 0.920 for YOLOv8 and Mask R-CNN, respectively. Real-ESRGAN reconstruction slightly improved accuracy. When multiple weed types were present, accuracy generally decreased. The YOLOv8 model characterized plant and weed coverage areas could explained 41.2 % of potato yield variations (R<sup>2</sup> = 0.412, p-value = 0.01), underscoring the practical utility of UAV-based segmentation for yield estimation. Both YOLOv8 and Mask R-CNN achieved high accuracy, with YOLOv8 converging faster. While different nitrogen fertilizer treatments had no significant effect on yield, weed control treatments significantly impacted yield, highlighting the importance of precise weed mapping for spot-specific weed management. This study provides insights into weed segmentation using Deep Leaning and contributes to environmentally friendly precision weed control.</div></div>\",\"PeriodicalId\":74813,\"journal\":{\"name\":\"Smart agricultural technology\",\"volume\":\"11 \",\"pages\":\"Article 100966\"},\"PeriodicalIF\":6.3000,\"publicationDate\":\"2025-04-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Smart agricultural technology\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S2772375525001996\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"AGRICULTURAL ENGINEERING\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Smart agricultural 
technology","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2772375525001996","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AGRICULTURAL ENGINEERING","Score":null,"Total":0}
Weed instance segmentation from UAV Orthomosaic Images based on Deep Learning
Weeds significantly impact agricultural production, and traditional weed control methods often harm soil health and the environment. This study aimed to develop deep learning-based segmentation models for identifying weeds in potato fields from Unmanned Aerial Vehicle (UAV) orthophotos and to explore the effects of weeds on potato yield. Previous studies predominantly employed U-Net for weed segmentation, but its performance often declines in complex field environments and under low image-resolution conditions. Some studies attempted to overcome this limitation by reducing flight altitude or using high-cost cameras, but these approaches are not always practical. To address these challenges, this study uniquely integrated Real-ESRGAN Super-Resolution (SR) for UAV image enhancement and the Segment Anything Model (SAM) for semi-automatic annotation. Subsequently, we trained YOLOv8 and Mask R-CNN models for segmentation. Results showed that the detection accuracy (mAP50) was 0.902 for YOLOv8 and 0.920 for Mask R-CNN. Real-ESRGAN reconstruction slightly improved accuracy, while accuracy generally decreased when multiple weed types were present. Plant and weed coverage areas derived from the YOLOv8 model explained 41.2 % of the variation in potato yield (R² = 0.412, p = 0.01), underscoring the practical utility of UAV-based segmentation for yield estimation. Both YOLOv8 and Mask R-CNN achieved high accuracy, with YOLOv8 converging faster. While different nitrogen fertilizer treatments had no significant effect on yield, weed control treatments significantly impacted yield, highlighting the importance of precise weed mapping for spot-specific weed management. This study provides insights into weed segmentation using deep learning and contributes to environmentally friendly precision weed control.
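The reported yield relationship (R² = 0.412) comes from regressing potato yield on plant and weed coverage areas derived from the segmentation masks. A minimal sketch of such a coverage-vs-yield regression is below; the per-plot coverage and yield values are illustrative placeholders, not the study's data, and the ordinary-least-squares approach is an assumption about how the R² was obtained:

```python
import numpy as np

# Hypothetical per-plot predictors: crop coverage and weed coverage (m^2),
# as could be derived from instance-segmentation masks. Illustrative values only.
X = np.array([
    [12.1, 0.8],
    [11.4, 1.9],
    [13.0, 0.5],
    [10.2, 2.7],
    [12.6, 1.1],
    [ 9.8, 3.2],
])
y = np.array([41.0, 36.5, 43.2, 31.8, 40.1, 29.5])  # potato yield, t/ha (made up)

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Coefficient of determination R^2 from the residuals.
resid = y - A @ coef
r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
print(f"R^2 = {r2:.3f}")
```

With real data, a p-value for the fit would typically come from an F-test (e.g. via `statsmodels`); the sketch above only reproduces the R² computation itself.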