Improved lightweight YOLOv5n-based network for bruise detection and length classification of asparagus
Xia Chuang, Chen Qiang, Shi Yinyan, Wang Xiaochan, Zhang Xiaolei, Wu Yao, Wang Yiran
DOI: 10.1016/j.compag.2025.110194
Computers and Electronics in Agriculture, Volume 233, Article 110194, March 2025
https://www.sciencedirect.com/science/article/pii/S016816992500300X
Citations: 0
Abstract
Efficient and accurate online quality recognition is crucial for asparagus production. To address the slow speed and low efficiency of manual inspection in asparagus grading, as well as the limitations of traditional single-label classification algorithms in identifying bruise locations and classifying lengths, this study proposes an asparagus length classification and bruise detection method based on the You-Only-Look-Once (YOLOv5n) convolutional neural network. Two lightweight improved models, YOLOv5n with a spatial grouping strategy (YOLOv5n-SGS) and YOLOv5n with a global enhancement strategy (YOLOv5n-GES), were connected in series to perform length classification and bruise detection of asparagus. The YOLOv5n-SGS model incorporated the ShuffleNet backbone, ghost spatial convolution (GSConv), and VoVNet ghost spatial cross-stage partial (VoV-GSCSP) modules, reducing computational complexity while improving length-classification efficiency. The YOLOv5n-GES model integrated the GhostNet backbone, GhostConv, and C3Ghost modules to increase detection speed. A simple parameter-free attention module (SimAM) was added to improve semantic feature extraction, and an efficient intersection over union (EIoU) loss function was employed to improve convergence and recognition accuracy. Test results showed that the YOLOv5n-SGS model achieved a mean average precision (mAP) of 96.5 %, with computational complexity reduced to 10 % of that of YOLOv5n; the YOLOv5n-GES model attained an mAP of 97.3 %, with complexity reduced to 59 %. Both models outperformed comparable approaches. When the two models were connected through a cropping layer, overall classification and detection accuracy exceeded 95 % with a total of only 1.221 M parameters. The proposed method enables high-precision length classification and bruise detection of asparagus, advancing industrial automation and enhancing the economic value of asparagus production.
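The abstract names two published, parameter-free components, the SimAM attention module and the EIoU bounding-box loss. The sketch below is a minimal PyTorch rendering of both as they appear in their original formulations, not the authors' implementation; the e_lambda default, tensor layouts, and function names are assumptions for illustration.

    import torch
    import torch.nn as nn

    class SimAM(nn.Module):
        """Parameter-free attention: each activation is reweighted by an
        energy score derived from its deviation from the channel mean."""
        def __init__(self, e_lambda=1e-4):   # stability constant; default assumed here
            super().__init__()
            self.e_lambda = e_lambda

        def forward(self, x):                # x: (batch, channels, height, width)
            n = x.shape[2] * x.shape[3] - 1
            d = (x - x.mean(dim=(2, 3), keepdim=True)).pow(2)   # squared deviation per position
            v = d.sum(dim=(2, 3), keepdim=True) / n             # per-channel variance estimate
            e_inv = d / (4 * (v + self.e_lambda)) + 0.5         # inverse neuron energy
            return x * torch.sigmoid(e_inv)                     # attention-weighted features

    def eiou_loss(pred, target, eps=1e-7):
        """EIoU loss for boxes in (x1, y1, x2, y2) format: IoU term plus
        centre-distance, width, and height penalties."""
        px1, py1, px2, py2 = pred.unbind(-1)
        tx1, ty1, tx2, ty2 = target.unbind(-1)
        pw, ph = px2 - px1, py2 - py1
        tw, th = tx2 - tx1, ty2 - ty1
        inter = (torch.min(px2, tx2) - torch.max(px1, tx1)).clamp(min=0) * \
                (torch.min(py2, ty2) - torch.max(py1, ty1)).clamp(min=0)
        iou = inter / (pw * ph + tw * th - inter + eps)
        cw = torch.max(px2, tx2) - torch.min(px1, tx1)          # enclosing-box width
        ch = torch.max(py2, ty2) - torch.min(py1, ty1)          # enclosing-box height
        rho2 = ((px1 + px2 - tx1 - tx2).pow(2) + (py1 + py2 - ty1 - ty2).pow(2)) / 4
        return (1 - iou
                + rho2 / (cw.pow(2) + ch.pow(2) + eps)          # centre-distance penalty
                + (pw - tw).pow(2) / (cw.pow(2) + eps)          # width penalty
                + (ph - th).pow(2) / (ch.pow(2) + eps))         # height penalty

The series arrangement described in the abstract, where a cropping layer links the length-classification model to the bruise-detection model, could look roughly like the following at inference time. The weight file names, the torch.hub loading route, and the output structure are illustrative assumptions, not details from the paper.

    # Serial two-stage inference: the length model locates whole spears,
    # then each cropped spear is passed to the bruise model.
    length_model = torch.hub.load('ultralytics/yolov5', 'custom', path='yolov5n_sgs.pt')  # hypothetical weights
    bruise_model = torch.hub.load('ultralytics/yolov5', 'custom', path='yolov5n_ges.pt')  # hypothetical weights

    def grade_asparagus(image):                       # image: HxWx3 numpy array
        spears = length_model(image).xyxy[0]          # rows of (x1, y1, x2, y2, conf, cls)
        graded = []
        for x1, y1, x2, y2, conf, cls in spears.tolist():
            crop = image[int(y1):int(y2), int(x1):int(x2)]   # stand-in for the cropping layer
            bruises = bruise_model(crop).xyxy[0]
            graded.append({'length_class': int(cls), 'bruise_count': len(bruises)})
        return graded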
About the journal:
Computers and Electronics in Agriculture provides international coverage of advancements in computer hardware, software, electronic instrumentation, and control systems applied to agricultural challenges. Encompassing agronomy, horticulture, forestry, aquaculture, and animal farming, the journal publishes original papers, reviews, and application notes. It explores the use of computers and electronics in plant or animal agricultural production, covering topics like agricultural soils, water, pests, controlled environments, and waste. The scope extends to on-farm post-harvest operations and relevant technologies, including artificial intelligence, sensors, machine vision, robotics, networking, and simulation modeling. Its companion journal, Smart Agricultural Technology, continues the focus on smart applications in production agriculture.