Estimation of flea beetle damage in the field using a multistage deep learning-based solution

IF 8.2 Q1 AGRICULTURE, MULTIDISCIPLINARY
Arantza Bereciartua-Pérez, María Monzón, Daniel Múgica, Greta De Both, Jeroen Baert, Brittany Hedges, Nicole Fox, Jone Echazarra, Ramón Navarra-Mestre
{"title":"使用基于多级深度学习的解决方案估算田间跳甲危害情况","authors":"Arantza Bereciartua-Pérez ,&nbsp;María Monzón ,&nbsp;Daniel Múgica ,&nbsp;Greta De Both ,&nbsp;Jeroen Baert ,&nbsp;Brittany Hedges ,&nbsp;Nicole Fox ,&nbsp;Jone Echazarra ,&nbsp;Ramón Navarra-Mestre","doi":"10.1016/j.aiia.2024.06.001","DOIUrl":null,"url":null,"abstract":"<div><p>Estimation of damage in plants is a key issue for crop protection. Currently, experts in the field manually assess the plots. This is a time-consuming task that can be automated thanks to the latest technology in computer vision (CV). The use of image-based systems and recently deep learning-based systems have provided good results in several agricultural applications. These image-based applications outperform expert evaluation in controlled environments, and now they are being progressively included in non-controlled field applications.</p><p>A novel solution based on deep learning techniques in combination with image processing methods is proposed to tackle the estimate of plant damage in the field. The proposed solution is a two-stage algorithm. In a first stage, the single plants in the plots are detected by an object detection YOLO based model. Then a regression model is applied to estimate the damage of each individual plant. The solution has been developed and validated in oilseed rape plants to estimate the damage caused by flea beetle.</p><p>The crop detection model achieves a mean precision average of 91% with a [email protected] of 0.99 and a [email protected] of 0.91 for oilseed rape specifically. The regression model to estimate up to 60% of damage degree in single plants achieves a MAE of 7.11, and R2 of 0.46 in comparison with manual evaluations done plant by plant by experts. Models are deployed in a docker, and with a REST API communication protocol they can be inferred directly for images acquired in the field from a mobile device.</p></div>","PeriodicalId":52814,"journal":{"name":"Artificial Intelligence in Agriculture","volume":null,"pages":null},"PeriodicalIF":8.2000,"publicationDate":"2024-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2589721724000199/pdfft?md5=6734d348bce39475c37cb2c23f24a354&pid=1-s2.0-S2589721724000199-main.pdf","citationCount":"0","resultStr":"{\"title\":\"Estimation of flea beetle damage in the field using a multistage deep learning-based solution\",\"authors\":\"Arantza Bereciartua-Pérez ,&nbsp;María Monzón ,&nbsp;Daniel Múgica ,&nbsp;Greta De Both ,&nbsp;Jeroen Baert ,&nbsp;Brittany Hedges ,&nbsp;Nicole Fox ,&nbsp;Jone Echazarra ,&nbsp;Ramón Navarra-Mestre\",\"doi\":\"10.1016/j.aiia.2024.06.001\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>Estimation of damage in plants is a key issue for crop protection. Currently, experts in the field manually assess the plots. This is a time-consuming task that can be automated thanks to the latest technology in computer vision (CV). The use of image-based systems and recently deep learning-based systems have provided good results in several agricultural applications. These image-based applications outperform expert evaluation in controlled environments, and now they are being progressively included in non-controlled field applications.</p><p>A novel solution based on deep learning techniques in combination with image processing methods is proposed to tackle the estimate of plant damage in the field. The proposed solution is a two-stage algorithm. 
In a first stage, the single plants in the plots are detected by an object detection YOLO based model. Then a regression model is applied to estimate the damage of each individual plant. The solution has been developed and validated in oilseed rape plants to estimate the damage caused by flea beetle.</p><p>The crop detection model achieves a mean precision average of 91% with a [email protected] of 0.99 and a [email protected] of 0.91 for oilseed rape specifically. The regression model to estimate up to 60% of damage degree in single plants achieves a MAE of 7.11, and R2 of 0.46 in comparison with manual evaluations done plant by plant by experts. Models are deployed in a docker, and with a REST API communication protocol they can be inferred directly for images acquired in the field from a mobile device.</p></div>\",\"PeriodicalId\":52814,\"journal\":{\"name\":\"Artificial Intelligence in Agriculture\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":8.2000,\"publicationDate\":\"2024-06-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.sciencedirect.com/science/article/pii/S2589721724000199/pdfft?md5=6734d348bce39475c37cb2c23f24a354&pid=1-s2.0-S2589721724000199-main.pdf\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Artificial Intelligence in Agriculture\",\"FirstCategoryId\":\"1087\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S2589721724000199\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"AGRICULTURE, MULTIDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Artificial Intelligence in Agriculture","FirstCategoryId":"1087","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2589721724000199","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AGRICULTURE, MULTIDISCIPLINARY","Score":null,"Total":0}
Citations: 0

Abstract

Estimation of damage in plants is a key issue for crop protection. Currently, experts manually assess the plots in the field. This is a time-consuming task that can be automated thanks to the latest technology in computer vision (CV). The use of image-based systems and, more recently, deep learning-based systems has provided good results in several agricultural applications. These image-based applications outperform expert evaluation in controlled environments, and they are now being progressively extended to non-controlled field applications.

A novel solution based on deep learning techniques combined with image processing methods is proposed to tackle the estimation of plant damage in the field. The proposed solution is a two-stage algorithm. In the first stage, the single plants in the plots are detected by a YOLO-based object detection model. Then a regression model is applied to estimate the damage of each individual plant. The solution has been developed and validated on oilseed rape plants to estimate the damage caused by flea beetles.
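The abstract does not include code, but the two-stage design (plot-level plant detection followed by per-plant damage regression) can be illustrated with a minimal sketch. The snippet below assumes the Ultralytics YOLO package for stage one and a TorchScript regression model for stage two; the weight files, input size, and function names are hypothetical illustrations, not the authors' released artifacts.

    # Minimal sketch of the two-stage pipeline: detect single plants, then
    # regress a damage percentage per plant. Weight files are hypothetical.
    import torch
    from PIL import Image
    from torchvision import transforms
    from ultralytics import YOLO

    detector = YOLO("plant_detector.pt")                  # stage 1: YOLO plant detector (assumed weights)
    damage_model = torch.jit.load("damage_regressor.pt")  # stage 2: per-plant damage regressor (assumed weights)
    damage_model.eval()

    preprocess = transforms.Compose([
        transforms.Resize((224, 224)),   # assumed input size for the regressor
        transforms.ToTensor(),
    ])

    def estimate_plot_damage(image_path: str) -> list[dict]:
        """Detect each plant in a plot image and estimate its damage degree (%)."""
        image = Image.open(image_path).convert("RGB")
        detections = detector(image)[0]                   # one Results object per input image
        estimates = []
        for x1, y1, x2, y2 in detections.boxes.xyxy.tolist():
            crop = image.crop((int(x1), int(y1), int(x2), int(y2)))
            batch = preprocess(crop).unsqueeze(0)         # shape (1, 3, 224, 224)
            with torch.no_grad():
                damage_pct = float(damage_model(batch).squeeze())
            estimates.append({"bbox": [x1, y1, x2, y2], "damage_pct": damage_pct})
        return estimates

Cropping each detected plant before regression mirrors the per-plant evaluation that experts otherwise perform manually plot by plot.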

The crop detection model achieves a mean average precision of 91%, with a mAP@0.5 of 0.99 and a mAP@0.5:0.95 of 0.91 for oilseed rape specifically. The regression model, which estimates damage degrees of up to 60% in single plants, achieves an MAE of 7.11 and an R² of 0.46 in comparison with manual plant-by-plant evaluations by experts. The models are deployed in a Docker container and, through a REST API communication protocol, inference can be run directly on images acquired in the field from a mobile device.
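Since the models are described as running in a Docker container behind a REST API, a field device would presumably upload a photo and receive per-plant estimates in return. The client sketch below is illustrative only: the endpoint path, port, and response schema are assumptions, as they are not documented in the abstract.

    # Hypothetical client call to the dockerized inference service; the endpoint
    # URL and JSON response schema are assumed for illustration.
    import requests

    SERVICE_URL = "http://inference-host:8000/estimate-damage"   # assumed endpoint

    def request_damage_estimate(image_path: str) -> dict:
        """Upload a field photo from a mobile device and return per-plant damage estimates."""
        with open(image_path, "rb") as image_file:
            response = requests.post(SERVICE_URL, files={"image": image_file}, timeout=60)
        response.raise_for_status()
        # Expected (assumed) shape: {"plants": [{"bbox": [...], "damage_pct": 12.5}, ...]}
        return response.json()

    if __name__ == "__main__":
        print(request_damage_estimate("plot_photo.jpg"))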

Source journal
Artificial Intelligence in Agriculture (Engineering, miscellaneous)
CiteScore: 21.60
Self-citation rate: 0.00%
Articles published: 18
Review time: 12 weeks