Md Zafar Iqbal, Robert G. Hardin, Joshua Peeples, Edward M. Barnes

Computers and Electronics in Agriculture, Volume 239, Article 111023. Published 2025-09-25. DOI: 10.1016/j.compag.2025.111023
Cover damage detection in round cotton modules using convolutional neural networks (CNNs)
Plastic contamination in cotton threatens the economic viability and global reputation of US cotton. In the US, most contaminants likely originate from damaged plastic covers on round cotton modules, as loose pieces of cover can be torn off and entangled in the cotton by handling equipment. This study aimed to develop a robust convolutional neural network (CNN)-based detection model to identify cover damage on modules during handling, enabling timely interventions to mitigate contamination. To achieve this objective, several models, including two-stage and one-stage detectors and detection transformers, were trained on images of modules with damaged covers. Following evaluation, the most effective model (YOLOv8l) was further optimized through pruning and fine-tuning, resulting in the proposed YOLOv8-wd model. This model achieved a mean average precision (mAP) of 92 % for detecting module cover damage, with an inference speed of 6.20 ms per image using a sparse-aware inference engine. The proposed model matched the accuracy of YOLOv8l while being 62.71 % lighter and 50.40 % faster. Model testing was conducted on images collected by a system installed on a module truck and on a loader used for module handling at a fully operational gin and in the field. The loader handled 1,801 modules, capturing 6,935 images, while the truck handled 2,094 modules, yielding 32,584 images. From these images, YOLOv8-wd identified cover damage in 4.72 % of loader-handled and 3.92 % of truck-handled modules, though actual rates may be higher. Furthermore, using the model, the system provided a clear status indicator (cover damaged or undamaged) and a unique ID for each module. The findings of this study could be used to reduce economic losses resulting from damaged covers of round cotton modules.
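The abstract reports detection quality as mean average precision (mAP). As background, the sketch below computes per-class average precision from a precision-recall curve using all-point interpolation and averages the result across classes. This is a generic illustration of the metric, not the authors' evaluation code; the class name and precision-recall values are hypothetical toy data.

```python
def average_precision(recalls, precisions):
    """Area under the precision-recall curve (all-point interpolation).

    `recalls` must be sorted in increasing order, with `precisions[i]`
    the precision observed at recall `recalls[i]`.
    """
    # Add sentinel endpoints so the curve spans recall 0..1.
    r = [0.0] + list(recalls) + [1.0]
    p = [0.0] + list(precisions) + [0.0]
    # Make precision monotonically non-increasing, right to left
    # (the standard interpolation used in detection benchmarks).
    for i in range(len(p) - 2, -1, -1):
        p[i] = max(p[i], p[i + 1])
    # Sum rectangle areas wherever recall increases.
    return sum((r[i] - r[i - 1]) * p[i] for i in range(1, len(r)))


def mean_average_precision(per_class_pr):
    """mAP: mean of per-class APs, given {class: (recalls, precisions)}."""
    aps = [average_precision(rec, prec) for rec, prec in per_class_pr.values()]
    return sum(aps) / len(aps)


# Hypothetical toy curve for a single "cover_damage" class.
toy = {"cover_damage": ([0.5, 1.0], [1.0, 0.5])}
print(mean_average_precision(toy))  # 0.75 for this toy curve
```

In practice the recall and precision arrays come from ranking detections by confidence and matching them to ground truth at a chosen IoU threshold; mAP then averages the per-class APs (and often also averages over several IoU thresholds).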
Journal Introduction:
Computers and Electronics in Agriculture provides international coverage of advancements in computer hardware, software, electronic instrumentation, and control systems applied to agricultural challenges. Encompassing agronomy, horticulture, forestry, aquaculture, and animal farming, the journal publishes original papers, reviews, and application notes. It explores the use of computers and electronics in plant and animal agricultural production, covering topics such as agricultural soils, water, pests, controlled environments, and waste. The scope extends to on-farm post-harvest operations and relevant technologies, including artificial intelligence, sensors, machine vision, robotics, networking, and simulation modeling. Its companion journal, Smart Agricultural Technology, continues the focus on smart applications in production agriculture.