Weed instance segmentation from UAV Orthomosaic Images based on Deep Learning

IF 6.3 Q1 AGRICULTURAL ENGINEERING
Chenghao Lu , Klaus Gehring , Stefan Kopfinger , Heinz Bernhardt , Michael Beck , Simon Walther , Thomas Ebertseder , Mirjana Minceva , Yuncai Hu , Kang Yu
{"title":"基于深度学习的无人机正正交图像杂草实例分割","authors":"Chenghao Lu ,&nbsp;Klaus Gehring ,&nbsp;Stefan Kopfinger ,&nbsp;Heinz Bernhardt ,&nbsp;Michael Beck ,&nbsp;Simon Walther ,&nbsp;Thomas Ebertseder ,&nbsp;Mirjana Minceva ,&nbsp;Yuncai Hu ,&nbsp;Kang Yu","doi":"10.1016/j.atech.2025.100966","DOIUrl":null,"url":null,"abstract":"<div><div>Weeds significantly impact agricultural production, and traditional weed control methods often harm soil health and environment. This study aimed to develop deep learning-based segmentation models in identifying weeds in potato fields captured by Unmanned Aerial Vehicle (UAV<em>)</em> orthophotos and to explore the effects of weeds on potato yield. Previous studies predominantly employed U-Net for weed segmentation, but its performance often declines under complex field environments and low-image resolution conditions. Some studies attempted to overcome this limitation by reducing flight altitude or using high-cost cameras, but these approaches are not always practical. To address these challenges, this study uniquely integrated Real-ESRGAN Super-Resolution (SR) for UAV image enhancement and the Segment Anything Model (SAM) for semi-automatic annotation. Subsequently, we trained the YOLOv8 and Mask R-CNN models for segmentation. Results showed that the detection accuracy mAP50 scores were 0.902 and 0.920 for YOLOv8 and Mask R-CNN, respectively. Real-ESRGAN reconstruction slightly improved accuracy. When multiple weed types were present, accuracy generally decreased. The YOLOv8 model characterized plant and weed coverage areas could explained 41.2 % of potato yield variations (R<sup>2</sup> = 0.412, p-value = 0.01), underscoring the practical utility of UAV-based segmentation for yield estimation. Both YOLOv8 and Mask R-CNN achieved high accuracy, with YOLOv8 converging faster. While different nitrogen fertilizer treatments had no significant effect on yield, weed control treatments significantly impacted yield, highlighting the importance of precise weed mapping for spot-specific weed management. This study provides insights into weed segmentation using Deep Leaning and contributes to environmentally friendly precision weed control.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"11 ","pages":"Article 100966"},"PeriodicalIF":6.3000,"publicationDate":"2025-04-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Weed instance segmentation from UAV Orthomosaic Images based on Deep Learning\",\"authors\":\"Chenghao Lu ,&nbsp;Klaus Gehring ,&nbsp;Stefan Kopfinger ,&nbsp;Heinz Bernhardt ,&nbsp;Michael Beck ,&nbsp;Simon Walther ,&nbsp;Thomas Ebertseder ,&nbsp;Mirjana Minceva ,&nbsp;Yuncai Hu ,&nbsp;Kang Yu\",\"doi\":\"10.1016/j.atech.2025.100966\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Weeds significantly impact agricultural production, and traditional weed control methods often harm soil health and environment. This study aimed to develop deep learning-based segmentation models in identifying weeds in potato fields captured by Unmanned Aerial Vehicle (UAV<em>)</em> orthophotos and to explore the effects of weeds on potato yield. Previous studies predominantly employed U-Net for weed segmentation, but its performance often declines under complex field environments and low-image resolution conditions. 
Some studies attempted to overcome this limitation by reducing flight altitude or using high-cost cameras, but these approaches are not always practical. To address these challenges, this study uniquely integrated Real-ESRGAN Super-Resolution (SR) for UAV image enhancement and the Segment Anything Model (SAM) for semi-automatic annotation. Subsequently, we trained the YOLOv8 and Mask R-CNN models for segmentation. Results showed that the detection accuracy mAP50 scores were 0.902 and 0.920 for YOLOv8 and Mask R-CNN, respectively. Real-ESRGAN reconstruction slightly improved accuracy. When multiple weed types were present, accuracy generally decreased. The YOLOv8 model characterized plant and weed coverage areas could explained 41.2 % of potato yield variations (R<sup>2</sup> = 0.412, p-value = 0.01), underscoring the practical utility of UAV-based segmentation for yield estimation. Both YOLOv8 and Mask R-CNN achieved high accuracy, with YOLOv8 converging faster. While different nitrogen fertilizer treatments had no significant effect on yield, weed control treatments significantly impacted yield, highlighting the importance of precise weed mapping for spot-specific weed management. This study provides insights into weed segmentation using Deep Leaning and contributes to environmentally friendly precision weed control.</div></div>\",\"PeriodicalId\":74813,\"journal\":{\"name\":\"Smart agricultural technology\",\"volume\":\"11 \",\"pages\":\"Article 100966\"},\"PeriodicalIF\":6.3000,\"publicationDate\":\"2025-04-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Smart agricultural technology\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S2772375525001996\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"AGRICULTURAL ENGINEERING\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Smart agricultural technology","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2772375525001996","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AGRICULTURAL ENGINEERING","Score":null,"Total":0}
Citations: 0

Abstract

Weeds significantly impact agricultural production, and traditional weed control methods often harm soil health and the environment. This study aimed to develop deep learning-based segmentation models for identifying weeds in potato fields captured in Unmanned Aerial Vehicle (UAV) orthophotos and to explore the effects of weeds on potato yield. Previous studies predominantly employed U-Net for weed segmentation, but its performance often declines in complex field environments and under low image-resolution conditions. Some studies attempted to overcome this limitation by reducing flight altitude or using high-cost cameras, but these approaches are not always practical. To address these challenges, this study uniquely integrated Real-ESRGAN Super-Resolution (SR) for UAV image enhancement and the Segment Anything Model (SAM) for semi-automatic annotation. Subsequently, we trained YOLOv8 and Mask R-CNN models for segmentation. Results showed that detection accuracy (mAP50) was 0.902 for YOLOv8 and 0.920 for Mask R-CNN. Real-ESRGAN reconstruction slightly improved accuracy. When multiple weed types were present, accuracy generally decreased. Plant and weed coverage areas characterized by the YOLOv8 model explained 41.2 % of potato yield variation (R² = 0.412, p-value = 0.01), underscoring the practical utility of UAV-based segmentation for yield estimation. Both YOLOv8 and Mask R-CNN achieved high accuracy, with YOLOv8 converging faster. While different nitrogen fertilizer treatments had no significant effect on yield, weed control treatments significantly impacted yield, highlighting the importance of precise weed mapping for spot-specific weed management. This study provides insights into weed segmentation using deep learning and contributes to environmentally friendly precision weed control.
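The abstract mentions SAM being used for semi-automatic annotation. Below is a minimal sketch of how such point-prompted mask generation could be wired up with Meta's segment-anything package; it is not the authors' exact pipeline, and the checkpoint name, image file, and click coordinates are illustrative placeholders.

```python
# Hedged sketch: point-prompted SAM mask for semi-automatic weed annotation.
# Assumes the segment-anything package and a downloaded ViT-H checkpoint.
import cv2
import numpy as np
from segment_anything import SamPredictor, sam_model_registry

image = cv2.cvtColor(cv2.imread("orthomosaic_tile.png"), cv2.COLOR_BGR2RGB)  # placeholder tile

sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")  # placeholder checkpoint
predictor = SamPredictor(sam)
predictor.set_image(image)

# One positive click on a weed; SAM returns candidate masks for that prompt.
masks, scores, _ = predictor.predict(
    point_coords=np.array([[420, 310]]),   # illustrative click location
    point_labels=np.array([1]),            # 1 = foreground point
    multimask_output=True,
)
best = masks[np.argmax(scores)].astype(np.uint8)

# Convert the binary mask to a polygon so it can be stored as an
# instance-segmentation label (e.g. YOLO-seg txt or COCO JSON).
contours, _ = cv2.findContours(best, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
polygon = max(contours, key=cv2.contourArea).squeeze(1)  # (N, 2) pixel coordinates
print(polygon.shape)
```

In practice an annotator would review and correct each SAM proposal before it enters the training set, which is what makes the workflow semi-automatic rather than fully automatic.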

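The yield result (R² = 0.412) comes from relating model-derived coverage to measured plot yield. The sketch below shows one way such coverage features and a simple regression could be computed with the Ultralytics API and SciPy; the weights file, class ids, and per-plot numbers are hypothetical, not the study's data.

```python
# Hedged sketch: per-class coverage from a YOLOv8-seg prediction, then a
# simple linear regression of plot yield against weed coverage.
import numpy as np
from scipy import stats
from ultralytics import YOLO

model = YOLO("yolov8_weed_seg.pt")                 # placeholder trained weights
result = model.predict("plot_01_ortho.png", verbose=False)[0]

if result.masks is None:                            # no detections in this tile
    masks = np.zeros((0, 1, 1), dtype=np.uint8)
    classes = np.zeros((0,))
else:
    masks = result.masks.data.cpu().numpy()         # (n_instances, H, W) binary masks
    classes = result.boxes.cls.cpu().numpy()        # predicted class id per instance

def coverage(class_id: int) -> float:
    """Fraction of the tile covered by all instances of one class."""
    sel = masks[classes == class_id]
    if sel.size == 0:
        return 0.0
    union = sel.max(axis=0)                         # merge overlapping instance masks
    return float(union.sum() / union.size)

potato_cov, weed_cov = coverage(0), coverage(1)     # assumed class ids

# Per-plot regression of measured yield against weed coverage (illustrative data).
weed_cov_per_plot = np.array([0.02, 0.05, 0.11, 0.18, 0.25])
yield_t_ha = np.array([42.1, 40.3, 37.8, 35.2, 31.9])
reg = stats.linregress(weed_cov_per_plot, yield_t_ha)
print(f"R^2 = {reg.rvalue**2:.3f}, p = {reg.pvalue:.3f}")
```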