Enhanced machine vision system for field-based detection of pickable strawberries: Integrating an advanced two-step deep learning model merging improved YOLOv8 and YOLOv5-cls
{"title":"Enhanced machine vision system for field-based detection of pickable strawberries: Integrating an advanced two-step deep learning model merging improved YOLOv8 and YOLOv5-cls","authors":"Zixuan He, Manoj Karkee, Qin Zhang","doi":"10.1016/j.compag.2025.110173","DOIUrl":null,"url":null,"abstract":"<div><div>In order to successfully deploy robotic harvesting in open field conditions, the development of an effective machine vision system becomes crucial. In this research, we proposed a novel two-step deep learning model consisting of a modified YOLOv8s and a YOLOv5s-cls to accomplish strawberry detection and pickability classification (whether a mature fruit is pickable by a robot). Firstly, the YOLOv8s was enhanced by incorporating C3x modules and an additional head network structure, specifically tailored for accurate strawberry detection. To further improve training performance, the <span><math><mi>α</mi></math></span>-IOU (intersection over union) technique was integrated. Subsequently, the YOLOv5s-cls was utilized to determine suitability of the detected mature strawberries. Through evaluations, Model D (+C3x+head+<span><math><mi>α</mi></math></span>IoU), which was a model based on modifying YOLOv8 using the new modules and techniques mentioned above, was found to perform the best among the tested models achieving the highest AP scores of 84.2% in Stage I (immature), 77.8% in Stage II (nearly mature), and 87.8% in Stage III (mature), along with the highest mAP of 83.2%. Overall, this modified model achieved a 2.5% improvement in mAP compared to the same achieved by original YOLOv8s model. Despite a slightly slower inference speed of 8.4 ms per image, Model D maintains real-time capabilities, making it an optimal choice for strawberry detection. 
Additionally, YOLOv5s-cls was identified as the preferred model for classifying mature strawberries into pickable and unpickable groups, offering a good inference speed of 2.8 ms per image and comparable accuracy with other compared models including YOLOv8s-cls, ResNet 18, EfficientNet-b0, and EfficientNet-b1. Finally, the combined two-step model developed in this study was evaluated in 10 different field scenarios from a completely different strawberry field that was not used in model training and initial testing. In this validation test the machine vision system achieved an AP of 89.0%, 82.0%, and 90.0% in detecting strawberries from Stage I, II, and III while the classification accuracy was 100.0% in unpickable group and 95.0% in pickable group. The results showed that the developed two-step machine vision system has a potential to improve the overall robotic harvesting system for strawberries grown in open-field conditions.</div></div>","PeriodicalId":50627,"journal":{"name":"Computers and Electronics in Agriculture","volume":"234 ","pages":"Article 110173"},"PeriodicalIF":7.7000,"publicationDate":"2025-03-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computers and Electronics in Agriculture","FirstCategoryId":"97","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0168169925002790","RegionNum":1,"RegionCategory":"农林科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AGRICULTURE, MULTIDISCIPLINARY","Score":null,"Total":0}
Citations: 0
Abstract
To successfully deploy robotic harvesting in open-field conditions, an effective machine vision system is crucial. In this research, we proposed a novel two-step deep learning model consisting of a modified YOLOv8s and a YOLOv5s-cls to accomplish strawberry detection and pickability classification (whether a mature fruit is pickable by a robot). First, YOLOv8s was enhanced by incorporating C3x modules and an additional head network structure tailored for accurate strawberry detection. To further improve training performance, the α-IoU (intersection over union) technique was integrated. Subsequently, YOLOv5s-cls was used to determine the pickability of the detected mature strawberries. In the evaluations, Model D (+C3x +head +α-IoU), the model based on modifying YOLOv8 with the new modules and techniques mentioned above, performed best among the tested models, achieving the highest AP scores of 84.2% in Stage I (immature), 77.8% in Stage II (nearly mature), and 87.8% in Stage III (mature), along with the highest mAP of 83.2%. Overall, this modified model achieved a 2.5% improvement in mAP over the original YOLOv8s model. Despite a slightly slower inference speed of 8.4 ms per image, Model D maintains real-time capability, making it an optimal choice for strawberry detection. Additionally, YOLOv5s-cls was identified as the preferred model for classifying mature strawberries into pickable and unpickable groups, offering a good inference speed of 2.8 ms per image and accuracy comparable to the other compared models, including YOLOv8s-cls, ResNet-18, EfficientNet-b0, and EfficientNet-b1. Finally, the combined two-step model developed in this study was evaluated in 10 different field scenarios from a completely different strawberry field that was not used in model training or initial testing.
In this validation test, the machine vision system achieved APs of 89.0%, 82.0%, and 90.0% in detecting Stage I, II, and III strawberries, while the classification accuracy was 100.0% for the unpickable group and 95.0% for the pickable group. The results showed that the developed two-step machine vision system has the potential to improve the overall robotic harvesting system for strawberries grown in open-field conditions.
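The α-IoU technique mentioned in the abstract generalizes the standard IoU-based box regression loss by raising IoU to a power α. A minimal sketch of that loss, assuming axis-aligned boxes in (x1, y1, x2, y2) format and α = 3 (a value commonly used in the α-IoU literature; the paper's exact setting and loss variant are not stated in the abstract):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def alpha_iou_loss(box_a, box_b, alpha=3.0):
    """Basic alpha-IoU regression loss: 1 - IoU^alpha.

    With alpha > 1 the loss penalizes low-IoU (hard) boxes more strongly,
    which is the motivation for using it during training.
    """
    return 1.0 - iou(box_a, box_b) ** alpha
```

With α = 1 this reduces to the ordinary 1 − IoU loss, so α acts as a tunable hardness weighting rather than a change to the matching criterion itself.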
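The two-step structure described in the abstract (detection and maturity staging first, then pickability classification of mature fruit only) can be sketched as a generic pipeline. The sketch below uses injected `detect` and `classify_pickability` callables as stand-ins for the paper's modified YOLOv8s detector and YOLOv5s-cls classifier; the function and field names are illustrative, not the authors' code:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    box: tuple   # (x1, y1, x2, y2) in image coordinates
    stage: str   # "I" (immature), "II" (nearly mature), "III" (mature)

def two_step_pipeline(image, detect, classify_pickability):
    """Step 1: detect strawberries with maturity stage (modified YOLOv8s
    in the paper). Step 2: crop each Stage III (mature) detection and
    classify it as pickable or unpickable (YOLOv5s-cls in the paper).
    Immature fruit passes through with no pickability label."""
    results = []
    for det in detect(image):
        label = None
        if det.stage == "III":                  # only mature fruit is classified
            crop = ("crop", det.box)            # placeholder for image cropping
            label = classify_pickability(crop)  # "pickable" or "unpickable"
        results.append((det, label))
    return results
```

Running the classifier only on mature detections keeps the second step cheap, which is consistent with the reported per-image speeds (8.4 ms detection, 2.8 ms classification).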
Journal Description:
Computers and Electronics in Agriculture provides international coverage of advancements in computer hardware, software, electronic instrumentation, and control systems applied to agricultural challenges. Encompassing agronomy, horticulture, forestry, aquaculture, and animal farming, the journal publishes original papers, reviews, and applications notes. It explores the use of computers and electronics in plant or animal agricultural production, covering topics like agricultural soils, water, pests, controlled environments, and waste. The scope extends to on-farm post-harvest operations and relevant technologies, including artificial intelligence, sensors, machine vision, robotics, networking, and simulation modeling. Its companion journal, Smart Agricultural Technology, continues the focus on smart applications in production agriculture.