Cover damage detection in round cotton modules using convolutional neural networks (CNNs)

IF 8.9 · CAS Region 1 (Agricultural & Forestry Sciences) · Q1 AGRICULTURE, MULTIDISCIPLINARY
Md Zafar Iqbal, Robert G. Hardin, Joshua Peeples, Edward M. Barnes
DOI: 10.1016/j.compag.2025.111023
Journal: Computers and Electronics in Agriculture, Vol. 239, Article 111023
Published: 2025-09-25 (Journal Article)
URL: https://www.sciencedirect.com/science/article/pii/S0168169925011299
Citations: 0

Abstract

Plastic contamination in cotton threatens the economic viability and global reputation of US cotton. In the US, most contaminants likely originate from damaged plastic covers on round cotton modules, as loose pieces of cover can be torn and entangled in cotton by handling equipment. This study aimed to develop a robust convolutional neural network (CNN)-based detection model to identify cover damage on modules during handling, enabling necessary interventions to mitigate contamination. To achieve this objective, several models, including two-stage, one-stage, and detection transformer architectures, were trained using images of modules with damaged covers. Following evaluation, the most effective model (YOLOv8l) was further optimized through pruning and fine-tuning, resulting in the proposed YOLOv8-wd model. This model achieved a mean average precision (mAP) of 92 % for detecting module cover damage, with an inference speed of 6.20 ms per image using a sparse-aware inference engine. The proposed model demonstrated accuracy comparable to YOLOv8l while being 62.71 % lighter and 50.40 % faster. Model testing was conducted using images collected by a system installed on a module truck and a loader used for module handling at a fully operational gin and in the field. The loader handled 1,801 modules, capturing 6,935 images, while the truck handled 2,094 modules, yielding 32,584 images. From these images, YOLOv8-wd identified cover damage in 4.72 % of loader-handled and 3.92 % of truck-handled modules, though actual rates may be higher. Furthermore, using the model, the system provided clear status indicators (cover-damaged or undamaged) and unique IDs for each module. The findings of this study could be used to reduce economic losses resulting from damaged covers of round cotton modules.
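The abstract reports that YOLOv8l was pruned and fine-tuned into YOLOv8-wd, yielding a model 62.71 % lighter with comparable accuracy. The paper's exact pruning procedure is not given here; the following is a minimal, hypothetical sketch of the general idea behind unstructured magnitude pruning (zeroing the smallest-magnitude weights), not the authors' implementation:

```python
def prune_by_magnitude(weights, amount):
    """Zero out the `amount` fraction of weights with smallest |w|.

    Illustrative sketch only; real CNN pruning operates per layer on
    tensors and is followed by fine-tuning to recover accuracy.
    """
    n_prune = int(len(weights) * amount)
    # Indices sorted by ascending magnitude; the first n_prune get zeroed.
    order = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
    keep = set(order[n_prune:])
    return [w if i in keep else 0.0 for i, w in enumerate(weights)]

w = [0.9, -0.05, 0.4, 0.01, -0.7, 0.2]
pruned = prune_by_magnitude(w, 0.5)
# Half the weights are now zero; a sparse-aware engine can exploit
# this sparsity for faster inference, as the abstract describes.
sparsity = sum(1 for x in pruned if x == 0.0) / len(pruned)
```

In practice, frameworks such as PyTorch expose comparable functionality (e.g., `torch.nn.utils.prune`), and sparse-aware runtimes accelerate the resulting zero-heavy weight tensors.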
Source journal: Computers and Electronics in Agriculture (Engineering & Technology, Computer Science: Interdisciplinary Applications)
CiteScore: 15.30
Self-citation rate: 14.50%
Annual publications: 800
Review time: 62 days
Journal description: Computers and Electronics in Agriculture provides international coverage of advancements in computer hardware, software, electronic instrumentation, and control systems applied to agricultural challenges. Encompassing agronomy, horticulture, forestry, aquaculture, and animal farming, the journal publishes original papers, reviews, and application notes. It explores the use of computers and electronics in plant or animal agricultural production, covering topics such as agricultural soils, water, pests, controlled environments, and waste. The scope extends to on-farm post-harvest operations and relevant technologies, including artificial intelligence, sensors, machine vision, robotics, networking, and simulation modeling. Its companion journal, Smart Agricultural Technology, continues the focus on smart applications in production agriculture.