Aerial-Based Weed Detection Using Low-Cost and Lightweight Deep Learning Models on an Edge Platform

IF 1.2 · Tier 4 (Agricultural and Forestry Sciences) · Q3 (Agricultural Engineering)
Nitin Rai, Xin Sun, C. Igathinathane, Kirk Howatt, Michael Ostlie
{"title":"基于边缘平台的低成本轻量级深度学习模型的空中杂草检测","authors":"Nitin Rai, Xin Sun, C. Igathinathane, Kirk Howatt, Michael Ostlie","doi":"10.13031/ja.15413","DOIUrl":null,"url":null,"abstract":"Highlights Lightweight deep learning models were trained on an edge device to identify weeds in aerial images. A customized configuration file was setup to train the models. These models were deployed to detect weeds in aerial images and videos (near real-time). CSPMobileNet-v2 and YOLOv4-lite are recommended models for weed detection using edge platform. Abstract. Deep learning (DL) techniques have proven to be a successful approach in detecting weeds for site-specific weed management (SSWM). In the past, most of the research work has trained and deployed pre-trained DL models on high-end systems coupled with expensive graphical processing units (GPUs). However, only a limited number of research studies have used DL models on an edge system for aerial-based weed detection. Therefore, while focusing on hardware cost minimization, eight DL models were trained and deployed on an edge device to detect weeds in aerial-image context and videos in this study. Four large models, namely CSPDarkNet-53, DarkNet-53, DenseNet-201, and ResNet-50, along with four lightweight models, CSPMobileNet-v2, YOLOv4-lite, EfficientNet-B0, and DarkNet-Ref, were considered for training a customized DL architecture. Along with trained model performance scores (average precision score, mean average precision (mAP), intersection over union, precision, and recall), other model metrics to assess edge system performance such as billion floating-point operations/s (BFLOPS), frame rates/s (FPS), and GPU memory usage were also estimated. The lightweight CSPMobileNet-v2 and YOLOv4-lite models outperformed others in detecting weeds in aerial image context. These models were able to achieve a mAP score of 83.2% and 82.2%, delivering an FPS of 60.9 and 61.1 during near real-time weed detection in aerial videos, respectively. The popular ResNet-50 model achieved a mAP of 79.6%, which was the highest amongst all the large models deployed for weed detection tasks. Based on the results, the two lightweight models, namely, CSPMobileNet-v2 and YOLOv4-lite, are recommended, and they can be used on a low-cost edge system to detect weeds in aerial image context with significant accuracy. Keywords: Aerial image, Deep learning, Edge device, Precision agriculture, Weed detection.","PeriodicalId":29714,"journal":{"name":"Journal of the ASABE","volume":null,"pages":null},"PeriodicalIF":1.2000,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Aerial-Based Weed Detection Using Low-Cost and Lightweight Deep Learning Models on an Edge Platform\",\"authors\":\"Nitin Rai, Xin Sun, C. Igathinathane, Kirk Howatt, Michael Ostlie\",\"doi\":\"10.13031/ja.15413\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Highlights Lightweight deep learning models were trained on an edge device to identify weeds in aerial images. A customized configuration file was setup to train the models. These models were deployed to detect weeds in aerial images and videos (near real-time). CSPMobileNet-v2 and YOLOv4-lite are recommended models for weed detection using edge platform. Abstract. Deep learning (DL) techniques have proven to be a successful approach in detecting weeds for site-specific weed management (SSWM). 
In the past, most of the research work has trained and deployed pre-trained DL models on high-end systems coupled with expensive graphical processing units (GPUs). However, only a limited number of research studies have used DL models on an edge system for aerial-based weed detection. Therefore, while focusing on hardware cost minimization, eight DL models were trained and deployed on an edge device to detect weeds in aerial-image context and videos in this study. Four large models, namely CSPDarkNet-53, DarkNet-53, DenseNet-201, and ResNet-50, along with four lightweight models, CSPMobileNet-v2, YOLOv4-lite, EfficientNet-B0, and DarkNet-Ref, were considered for training a customized DL architecture. Along with trained model performance scores (average precision score, mean average precision (mAP), intersection over union, precision, and recall), other model metrics to assess edge system performance such as billion floating-point operations/s (BFLOPS), frame rates/s (FPS), and GPU memory usage were also estimated. The lightweight CSPMobileNet-v2 and YOLOv4-lite models outperformed others in detecting weeds in aerial image context. These models were able to achieve a mAP score of 83.2% and 82.2%, delivering an FPS of 60.9 and 61.1 during near real-time weed detection in aerial videos, respectively. The popular ResNet-50 model achieved a mAP of 79.6%, which was the highest amongst all the large models deployed for weed detection tasks. Based on the results, the two lightweight models, namely, CSPMobileNet-v2 and YOLOv4-lite, are recommended, and they can be used on a low-cost edge system to detect weeds in aerial image context with significant accuracy. Keywords: Aerial image, Deep learning, Edge device, Precision agriculture, Weed detection.\",\"PeriodicalId\":29714,\"journal\":{\"name\":\"Journal of the ASABE\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":1.2000,\"publicationDate\":\"2023-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of the ASABE\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.13031/ja.15413\",\"RegionNum\":4,\"RegionCategory\":\"农林科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"AGRICULTURAL ENGINEERING\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of the ASABE","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.13031/ja.15413","RegionNum":4,"RegionCategory":"农林科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"AGRICULTURAL ENGINEERING","Score":null,"Total":0}
Citations: 1

Highlights
Lightweight deep learning models were trained on an edge device to identify weeds in aerial images. A customized configuration file was set up to train the models. The models were deployed to detect weeds in aerial images and videos (near real-time). CSPMobileNet-v2 and YOLOv4-lite are the recommended models for weed detection on an edge platform.

Abstract
Deep learning (DL) techniques have proven to be a successful approach to detecting weeds for site-specific weed management (SSWM). In the past, most research has trained and deployed pre-trained DL models on high-end systems coupled with expensive graphics processing units (GPUs), and only a limited number of studies have used DL models on an edge system for aerial-based weed detection. Therefore, while focusing on hardware cost minimization, this study trained eight DL models and deployed them on an edge device to detect weeds in aerial images and videos. Four large models, namely CSPDarkNet-53, DarkNet-53, DenseNet-201, and ResNet-50, along with four lightweight models, CSPMobileNet-v2, YOLOv4-lite, EfficientNet-B0, and DarkNet-Ref, were considered for training a customized DL architecture. Along with the trained models' performance scores (average precision, mean average precision (mAP), intersection over union, precision, and recall), metrics for assessing edge-system performance, such as billion floating-point operations per second (BFLOPS), frame rate (frames per second, FPS), and GPU memory usage, were also estimated. The lightweight CSPMobileNet-v2 and YOLOv4-lite models outperformed the others in detecting weeds in aerial images, achieving mAP scores of 83.2% and 82.2% and delivering 60.9 and 61.1 FPS, respectively, during near real-time weed detection in aerial videos. The popular ResNet-50 model achieved a mAP of 79.6%, the highest among all the large models deployed for the weed detection task. Based on these results, the two lightweight models, CSPMobileNet-v2 and YOLOv4-lite, are recommended; they can be used on a low-cost edge system to detect weeds in aerial images with significant accuracy.

Keywords: Aerial image, Deep learning, Edge device, Precision agriculture, Weed detection.
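As a rough illustration of the detection metrics listed above (intersection over union, precision, and recall, from which average precision and mAP are derived), the following plain-Python sketch shows one common way these quantities are computed from predicted and ground-truth boxes. The function names and the greedy, confidence-ordered matching scheme are illustrative choices for this summary, not the evaluation code used in the study.

```python
def iou(box_a, box_b):
    """Intersection over union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter + 1e-9)


def precision_recall(pred_boxes, gt_boxes, iou_thresh=0.5):
    """Precision and recall for one image at a fixed IoU threshold.

    pred_boxes: list of (confidence, (x1, y1, x2, y2)) tuples.
    gt_boxes:   list of (x1, y1, x2, y2) ground-truth boxes.
    Predictions are matched greedily, highest confidence first; each
    ground-truth box can be matched at most once.
    """
    matched = set()
    tp = 0
    for _, box in sorted(pred_boxes, key=lambda p: -p[0]):
        best_j, best_iou = -1, 0.0
        for j, gt in enumerate(gt_boxes):
            if j in matched:
                continue
            overlap = iou(box, gt)
            if overlap > best_iou:
                best_j, best_iou = j, overlap
        if best_iou >= iou_thresh:
            matched.add(best_j)
            tp += 1
    fp = len(pred_boxes) - tp
    fn = len(gt_boxes) - tp
    return tp / (tp + fp + 1e-9), tp / (tp + fn + 1e-9)


# Example: one true positive and one false positive against a single weed box.
preds = [(0.9, (10, 10, 50, 50)), (0.4, (60, 60, 90, 90))]
gts = [(12, 8, 48, 52)]
print(precision_recall(preds, gts))  # -> approximately (0.5, 1.0)
```

Average precision integrates precision over recall as the confidence threshold is swept, and mAP averages AP over classes; the specific threshold settings used in the paper are not restated here.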
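The near real-time throughput figures (about 61 FPS for the lightweight models) refer to per-frame inference on aerial video. The sketch below shows one way such a frame-rate measurement could be set up with a lightweight Darknet-format detector using OpenCV's DNN module. It is a minimal sketch under stated assumptions: the config, weights, and video file names are placeholders (the study's trained models are not assumed to be public), the 416x416 input size and the confidence/NMS thresholds are common YOLO defaults rather than values taken from the paper, and the authors' own deployment pipeline and backend may differ.

```python
import time

import cv2  # OpenCV built with the DNN module

# Placeholder artifacts: illustrative file names only, not the authors' releases.
CFG = "yolov4-tiny-weeds.cfg"
WEIGHTS = "yolov4-tiny-weeds.weights"
VIDEO = "aerial_field.mp4"

net = cv2.dnn.readNetFromDarknet(CFG, WEIGHTS)
# Plain CPU execution; a CUDA-enabled OpenCV build on a GPU edge board could
# switch these to the CUDA backend/target for higher throughput.
net.setPreferableBackend(cv2.dnn.DNN_BACKEND_OPENCV)
net.setPreferableTarget(cv2.dnn.DNN_TARGET_CPU)

model = cv2.dnn_DetectionModel(net)
model.setInputParams(size=(416, 416), scale=1 / 255.0, swapRB=True)

cap = cv2.VideoCapture(VIDEO)
frames, start = 0, time.time()
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Each call returns class ids, confidences, and boxes for the current frame.
    class_ids, scores, boxes = model.detect(frame, confThreshold=0.25, nmsThreshold=0.4)
    frames += 1
cap.release()

elapsed = time.time() - start
print(f"Processed {frames} frames at {frames / elapsed:.1f} FPS")
```

Timing the whole read-plus-inference loop, as done here, gives an end-to-end frame rate; timing only the detect call would isolate inference throughput, which is closer to what per-model FPS comparisons usually report.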