Morphology-based weed type recognition using Siamese network

IF 4.5 | CAS Tier 1 (Agricultural and Forestry Sciences) | JCR Q1 (Agronomy)
A.S.M. Mahmudul Hasan, Dean Diepeveen, Hamid Laga, Michael G.K. Jones, A.A.M. Muzahid, Ferdous Sohel
{"title":"Morphology-based weed type recognition using Siamese network","authors":"A.S.M. Mahmudul Hasan ,&nbsp;Dean Diepeveen ,&nbsp;Hamid Laga ,&nbsp;Michael G.K. Jones ,&nbsp;A.A.M. Muzahid ,&nbsp;Ferdous Sohel","doi":"10.1016/j.eja.2024.127439","DOIUrl":null,"url":null,"abstract":"<div><div>Automatic weed detection and classification can significantly reduce weed management costs and improve crop yields and quality. Weed detection in crops from imagery is inherently a challenging problem. Because both weeds and crops are of similar colour (green on green), their growth and texture are somewhat similar; weeds also vary based on crops, geographical locations, seasons and even weather patterns. This study proposes a novel approach utilising object detection and meta-learning techniques for generalised weed detection, transcending the limitations of varying field contexts. Instead of classifying weeds by species, this study classified them based on their morphological families aligned with farming practices. An object detector, e.g., a YOLO (You Only Look Once) model is employed for plant detection, while a Siamese network, leveraging state-of-the-art deep learning models as its backbone, is used for weed classification. This study repurposed and used three publicly available datasets, namely, Weed25, Cotton weed and Corn weed data. Each dataset contained multiple species of weeds, whereas this study grouped those into three classes based on the weed morphology. YOLOv7 achieved the best result as a plant detector, and the VGG16 model as the feature extractor for the Siamese network. Moreover, the models were trained on one dataset (Weed25) and applied to other datasets (Cotton weed and Corn weed) without further training. The study also observed that the classification accuracy of the Siamese network was improved using the cosine similarity function for calculating contrastive loss. The YOLOv7 models obtained the mAP of 91.03 % on the Weed25 dataset, which was used for training the model. The mAPs for the unseen datasets were 84.65 % and 81.16 %. As mentioned earlier, the classification accuracies with the best combination were 97.59 %, 93.67 % and 93.35 % for the Weed25, Cotton weed and Corn weed datasets, respectively. This study also compared the classification performance of our proposed technique with the state-of-the-art Convolutional Neural Network models. The proposed approach advances weed classification accuracy and presents a viable solution for dataset independent, i.e., site-independent weed detection, fostering sustainable agricultural practices.</div></div>","PeriodicalId":51045,"journal":{"name":"European Journal of Agronomy","volume":"163 ","pages":"Article 127439"},"PeriodicalIF":4.5000,"publicationDate":"2024-11-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"European Journal of Agronomy","FirstCategoryId":"97","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1161030124003605","RegionNum":1,"RegionCategory":"农林科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AGRONOMY","Score":null,"Total":0}
引用次数: 0

Abstract

Automatic weed detection and classification can significantly reduce weed management costs and improve crop yields and quality. Detecting weeds in crops from imagery is inherently challenging: weeds and crops are of similar colour (green on green), their growth habits and textures are somewhat similar, and weed populations vary with crop, geographical location, season and even weather patterns. This study proposes a novel approach that combines object detection and meta-learning techniques for generalised weed detection, transcending the limitations of varying field contexts. Instead of classifying weeds by species, this study classified them into morphological families aligned with farming practices. An object detector, e.g., a YOLO (You Only Look Once) model, is employed for plant detection, while a Siamese network, leveraging state-of-the-art deep learning models as its backbone, is used for weed classification. This study repurposed three publicly available datasets, namely Weed25, Cotton weed and Corn weed. Each dataset contains multiple weed species, which this study grouped into three classes based on weed morphology. YOLOv7 achieved the best results as the plant detector, and VGG16 performed best as the feature extractor for the Siamese network. Moreover, the models were trained on one dataset (Weed25) and applied to the other datasets (Cotton weed and Corn weed) without further training. The study also observed that the classification accuracy of the Siamese network improved when the cosine similarity function was used to calculate the contrastive loss. The YOLOv7 model obtained a mAP of 91.03% on the Weed25 dataset, which was used for training; the mAPs on the unseen datasets were 84.65% and 81.16%. With the best combination of components, the classification accuracies were 97.59%, 93.67% and 93.35% for the Weed25, Cotton weed and Corn weed datasets, respectively. This study also compared the classification performance of the proposed technique with state-of-the-art Convolutional Neural Network models. The proposed approach advances weed classification accuracy and presents a viable solution for dataset-independent, i.e., site-independent, weed detection, fostering sustainable agricultural practices.
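
The paper does not include code, but the classification stage it describes (a weight-sharing Siamese network with a VGG16 feature extractor, trained with a contrastive loss driven by cosine similarity) can be illustrated with a short sketch. The PyTorch snippet below is a minimal, hypothetical example only: the class and function names (SiameseVGG16, cosine_contrastive_loss), the embedding size of 256 and the margin of 0.5 are illustrative assumptions, not values taken from the paper, and the YOLOv7 detection stage that would produce the plant crops is not shown.

```python
# Minimal sketch (assumption, not the authors' code) of a Siamese classifier
# with a VGG16 backbone and a cosine-similarity-based contrastive loss.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models


class SiameseVGG16(nn.Module):
    """Twin VGG16 feature extractor producing one embedding per plant crop."""

    def __init__(self, embedding_dim: int = 256):
        super().__init__()
        backbone = models.vgg16(weights=None)  # in practice, ImageNet-pretrained weights would be loaded
        self.features = backbone.features      # convolutional trunk (outputs 512 channels)
        self.pool = nn.AdaptiveAvgPool2d((7, 7))
        self.embed = nn.Sequential(
            nn.Flatten(),
            nn.Linear(512 * 7 * 7, embedding_dim),  # project to a compact embedding
        )

    def embed_one(self, x: torch.Tensor) -> torch.Tensor:
        return self.embed(self.pool(self.features(x)))

    def forward(self, x1: torch.Tensor, x2: torch.Tensor):
        # Both branches share the same weights (the "Siamese" property).
        return self.embed_one(x1), self.embed_one(x2)


def cosine_contrastive_loss(z1, z2, same_class, margin: float = 0.5):
    """Contrastive loss based on cosine similarity: same-class pairs are pulled
    towards similarity 1, different-class pairs are pushed below the margin."""
    sim = F.cosine_similarity(z1, z2)                          # values in [-1, 1]
    pos = same_class * (1.0 - sim) ** 2                        # penalise low similarity for same-class pairs
    neg = (1.0 - same_class) * torch.clamp(sim - margin, min=0.0) ** 2
    return (pos + neg).mean()


# Toy usage on a batch of image-crop pairs (224x224 RGB) with pair labels
# (1 = same morphological class, 0 = different).
model = SiameseVGG16()
x1, x2 = torch.randn(4, 3, 224, 224), torch.randn(4, 3, 224, 224)
labels = torch.tensor([1.0, 0.0, 1.0, 0.0])
z1, z2 = model(x1, x2)
loss = cosine_contrastive_loss(z1, z2, labels)
loss.backward()
```

At inference time, a crop detected by the object detector would be embedded once and compared, via cosine similarity, against reference embeddings of the three morphological classes; the closest class is taken as the prediction. This design choice is what allows the classifier to be applied to unseen datasets without retraining.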
Source journal
European Journal of Agronomy (Agricultural Sciences – Agronomy)
CiteScore: 8.30
Self-citation rate: 7.70%
Articles per year: 187
Review time: 4.5 months
Journal description: The European Journal of Agronomy, the official journal of the European Society for Agronomy, publishes original research papers reporting experimental and theoretical contributions to field-based agronomy and crop science. The journal will consider research at the field level for agricultural, horticultural and tree crops that uses comprehensive and explanatory approaches. The EJA covers the following topics:
- crop physiology
- crop production and management, including irrigation, fertilization and soil management
- agroclimatology and modelling
- plant-soil relationships
- crop quality and post-harvest physiology
- farming and cropping systems
- agroecosystems and the environment
- crop-weed interactions and management
- organic farming
- horticultural crops
- papers from the European Society for Agronomy bi-annual meetings
In determining the suitability of submitted articles for publication, particular scrutiny is placed on the degree of novelty and significance of the research and the extent to which it adds to existing knowledge in agronomy.