FG‐UNet: fine‐grained feature‐guided UNet for segmentation of weeds and crops in UAV images
Impact Factor: 3.8 · CAS Region 1 (Agricultural and Forestry Sciences) · JCR Q1 (Agronomy)
Jianwu Lin, Xin Zhang, Yongbin Qin, Shengxian Yang, Xingtian Wen, Tomislav Cernava, Xiaoyulong Chen
{"title":"FG‐UNet: fine‐grained feature‐guided UNet for segmentation of weeds and crops in UAV images","authors":"Jianwu Lin, Xin Zhang, Yongbin Qin, Shengxian Yang, Xingtian Wen, Tomislav Cernava, Xiaoyulong Chen","doi":"10.1002/ps.8489","DOIUrl":null,"url":null,"abstract":"BACKGROUNDSemantic segmentation of weed and crop images is a key component and prerequisite for automated weed management. For weeds in unmanned aerial vehicle (UAV) images, which are usually characterized by small size and easily confused with crops at early growth stages, existing semantic segmentation models have difficulties to extract sufficiently fine features. This leads to their limited performance in weed and crop segmentation of UAV images.RESULTSWe proposed a fine‐grained feature‐guided UNet, named FG‐UNet, for weed and crop segmentation in UAV images. Specifically, there are two branches in FG‐UNet, namely the fine‐grained feature branch and the UNet branch. In the fine‐grained feature branch, a fine feature‐aware (FFA) module was designed to mine fine features in order to enhance the model's ability to segment small objects. In the UNet branch, we used an encoder–decoder structure to realize high‐level semantic feature extraction in images. In addition, a contextual feature fusion (CFF) module was designed for the fusion of the fine features and high‐level semantic features, thus enhancing the feature discrimination capability of the model. The experimental results showed that our proposed FG‐UNet, achieved state‐of‐the‐art performance compared to other semantic segmentation models, with mean intersection over union (MIOU) and mean pixel accuracy (MPA) of 88.06% and 92.37%, respectively.CONCLUSIONThe proposed method in this study lays a solid foundation for accurate detection and intelligent management of weeds. It will have a positive impact on the development of smart agriculture. © 2024 Society of Chemical Industry.","PeriodicalId":218,"journal":{"name":"Pest Management Science","volume":"67 1","pages":""},"PeriodicalIF":3.8000,"publicationDate":"2024-10-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Pest Management Science","FirstCategoryId":"97","ListUrlMain":"https://doi.org/10.1002/ps.8489","RegionNum":1,"RegionCategory":"农林科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AGRONOMY","Score":null,"Total":0}
Abstract
BACKGROUND: Semantic segmentation of weed and crop images is a key component of, and prerequisite for, automated weed management. Weeds in unmanned aerial vehicle (UAV) images are usually small and easily confused with crops at early growth stages, so existing semantic segmentation models have difficulty extracting sufficiently fine features, which limits their performance in weed and crop segmentation of UAV images.

RESULTS: We proposed a fine-grained feature-guided UNet, named FG-UNet, for weed and crop segmentation in UAV images. FG-UNet consists of two branches: a fine-grained feature branch and a UNet branch. In the fine-grained feature branch, a fine feature-aware (FFA) module was designed to mine fine features and thereby enhance the model's ability to segment small objects. In the UNet branch, an encoder-decoder structure was used to extract high-level semantic features from the images. In addition, a contextual feature fusion (CFF) module was designed to fuse the fine features with the high-level semantic features, enhancing the feature discrimination capability of the model. Experimental results showed that the proposed FG-UNet achieved state-of-the-art performance compared with other semantic segmentation models, with a mean intersection over union (MIoU) of 88.06% and a mean pixel accuracy (MPA) of 92.37%.

CONCLUSION: The proposed method lays a solid foundation for accurate detection and intelligent management of weeds and will have a positive impact on the development of smart agriculture. © 2024 Society of Chemical Industry.
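The abstract names the two branches and the FFA and CFF modules but does not give their internals, so the following PyTorch sketch is purely illustrative: the class names (TwoBranchSegNet, FineFeatureAware, ContextualFeatureFusion), layer choices, channel sizes, and fusion by concatenation are assumptions, not the authors' implementation. It only shows how a full-resolution fine-grained branch, a UNet-style encoder-decoder branch, and a fusion module could be wired together.

```python
# Minimal sketch of the two-branch layout described in the abstract.
# All module internals below are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FineFeatureAware(nn.Module):
    """Stand-in for the fine feature-aware (FFA) module (assumed design)."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        # Keeps full spatial resolution to preserve small-object detail.
        return self.conv(x)


class ContextualFeatureFusion(nn.Module):
    """Stand-in for the contextual feature fusion (CFF) module (assumed design)."""
    def __init__(self, fine_ch, sem_ch, out_ch):
        super().__init__()
        self.proj = nn.Conv2d(fine_ch + sem_ch, out_ch, 1)

    def forward(self, fine, semantic):
        # Upsample the semantic features to the fine-branch resolution, then fuse.
        semantic = F.interpolate(semantic, size=fine.shape[-2:], mode="bilinear", align_corners=False)
        return self.proj(torch.cat([fine, semantic], dim=1))


class TwoBranchSegNet(nn.Module):
    """Fine-grained branch + UNet-style encoder-decoder branch + fusion head."""
    def __init__(self, num_classes=3):
        super().__init__()
        self.ffa = FineFeatureAware(3, 32)
        # Tiny stand-in for the UNet branch; a real UNet has skip connections at every scale.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        self.decoder = nn.Sequential(nn.Conv2d(128, 64, 3, padding=1), nn.ReLU(inplace=True))
        self.cff = ContextualFeatureFusion(32, 64, 64)
        self.head = nn.Conv2d(64, num_classes, 1)

    def forward(self, x):
        fine = self.ffa(x)                         # fine-grained feature branch
        semantic = self.decoder(self.encoder(x))   # UNet-branch stand-in (encoder-decoder)
        fused = self.cff(fine, semantic)           # contextual fusion of the two streams
        return self.head(fused)


if __name__ == "__main__":
    logits = TwoBranchSegNet()(torch.randn(1, 3, 256, 256))
    print(logits.shape)  # torch.Size([1, 3, 256, 256])
```

The reported metrics, mean intersection over union (MIoU) and mean pixel accuracy (MPA), follow their standard definitions. A small NumPy helper that computes both from a pixel-level confusion matrix (rows = ground truth, columns = prediction) might look like this; the example matrix is made up for illustration:

```python
import numpy as np

def miou_mpa(conf: np.ndarray):
    """Mean IoU and mean per-class pixel accuracy from a C x C confusion matrix."""
    tp = np.diag(conf).astype(float)   # correctly classified pixels per class
    fp = conf.sum(axis=0) - tp         # pixels wrongly assigned to the class
    fn = conf.sum(axis=1) - tp         # class pixels assigned to other classes
    iou = tp / (tp + fp + fn)          # per-class intersection over union
    pa = tp / (tp + fn)                # per-class pixel accuracy (recall)
    return iou.mean(), pa.mean()

# Example with 3 classes (e.g. background, crop, weed) and a made-up confusion matrix.
conf = np.array([[900, 20, 10],
                 [ 15, 850, 35],
                 [ 12, 40, 760]])
print(miou_mpa(conf))
```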