Title: Improved DeepLabV3+ and GR-ConvNet for shiitake mushroom harvest robots flexible grasping of mimicry
Authors: Xiong Yin, Lin Yang, Daojin Yao, Xin Yang, Yinbing Bian, Yuhua Gong
DOI: 10.1016/j.compag.2025.110449
Journal: Computers and Electronics in Agriculture, Volume 236, Article 110449 (JCR Q1, Agriculture, Multidisciplinary; Impact Factor 8.9)
Published: 2025-04-28
URL: https://www.sciencedirect.com/science/article/pii/S0168169925005551
Citations: 0
Abstract
Mimicry crops such as shiitake mushrooms pose challenges for agricultural robots and grasping algorithms: their textures and colors resemble the surroundings, the fruits grow in crowded clusters, and the fruit bodies are easily crushed. To address this, and drawing on how human pickers harvest, we devised a visual-semantics-based solution for the flexible grasping of mimicry crops. First, we collected data and built a 5,000-photo shiitake mushroom dataset. We replaced DeepLabV3+'s backbone with MobileNetV3 to improve the robot's computational efficiency and integrated a DWM module for enhanced feature extraction and fusion of shiitake mushrooms. In pre-processing, an AECA module was proposed for more accurate feature representation of shiitake mushrooms. A new MFM module inserted into GR-ConvNet improved the model's sensitivity to shiitake mushroom features, and Ghost Conv replaced the original convolutions, generating additional feature maps at low cost. A model predictive admittance control algorithm was embedded in the robotic arm: it infers the picker's force intention through the admittance model and generates the arm's motion trajectory, and by leveraging model predictive control it improves control robustness. Our framework combines the optimized models into a grasp detection architecture. Evaluated on the shiitake mushroom dataset, the improved DeepLabV3+ reaches 95.8 % accuracy, GR-ConvNet reaches 98.8 %, and the robot arm's actual grasp accuracy is 90.5 %, 4.0 % higher than that of traditional methods. The model predictive control algorithm also achieves better trajectory tracking and compliance, showing that the improved algorithms effectively handle the challenges of mimicry objects.
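The admittance model described above maps a measured external force to a compliant motion trajectory for the arm. The following is a minimal sketch of that idea as a discrete-time mass-damper-spring admittance law, M·a + B·v + K·x = F_ext, integrated with forward Euler. The parameter values, function names, and the integration scheme are illustrative assumptions for a single axis, not the paper's implementation (which additionally wraps the admittance model in model predictive control).

```python
def admittance_step(x, v, f_ext, m=1.0, b=8.0, k=20.0, dt=0.01):
    """One Euler step of the admittance law M*a + B*v + K*x = F_ext.

    x, v  : current position (m) and velocity (m/s) along one axis
    f_ext : measured external force (N), e.g. the picker's push
    m,b,k : virtual mass, damping, stiffness (illustrative values)
    """
    a = (f_ext - b * v - k * x) / m   # solve admittance model for acceleration
    v_next = v + a * dt               # integrate to velocity
    x_next = x + v_next * dt          # integrate to position (semi-implicit Euler)
    return x_next, v_next


def trajectory(forces, dt=0.01):
    """Generate a compliant reference trajectory from a force sequence."""
    x, v, traj = 0.0, 0.0, []
    for f in forces:
        x, v = admittance_step(x, v, f, dt=dt)
        traj.append(x)
    return traj


# A constant 5 N push makes the arm yield smoothly toward the
# steady-state displacement F/K = 5/20 = 0.25 m.
traj = trajectory([5.0] * 2000)
```

Raising the virtual stiffness K makes the arm yield less to the same force, while the virtual damping B shapes how quickly it settles; this trade-off between compliance and tracking is what the paper's model predictive layer is tuning.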
Journal introduction:
Computers and Electronics in Agriculture provides international coverage of advancements in computer hardware, software, electronic instrumentation, and control systems applied to agricultural challenges. Encompassing agronomy, horticulture, forestry, aquaculture, and animal farming, the journal publishes original papers, reviews, and application notes. It explores the use of computers and electronics in plant and animal agricultural production, covering topics such as agricultural soils, water, pests, controlled environments, and waste. The scope extends to on-farm post-harvest operations and relevant technologies, including artificial intelligence, sensors, machine vision, robotics, networking, and simulation modeling. Its companion journal, Smart Agricultural Technology, continues the focus on smart applications in production agriculture.