{"title":"S2CPL: 用于选择性采收绿芦笋的新型采收评估和底土三维切点定位方法","authors":"","doi":"10.1016/j.compag.2024.109316","DOIUrl":null,"url":null,"abstract":"<div><p>Robotic selective harvesting is an ideal method for bionic manual harvest of green asparagus. However, the harvesting robot encounters difficulties in evaluating the suitable harvest due to the tilt and bending of the long stem, as well as determining the precise location of the subsoil cutting-point to prevent damage from bacteria on the cutting surface. This paper proposed the S2CPL model to address the challenges of the harvest evaluation and 3D localization of subsoil cutting-point for selective harvesting of green asparagus in field conditions. Firstly, an RGB-D sensor was used to acquire images and depth information of green asparaguses. Secondly, the improved YOLOv8 by introduced lightweight convolution and attention mechanisms in the feature fusion module to enhance the segmentation accuracy. Thirdly, a 3D morphology extraction method was proposed to calculate the length and diameter of green asparagus by utilizing the image mask fusion with depth information. Finally, harvest evaluation and subsoil 3D cutting-point location were achieved for robotic selective harvesting. In addition, the RGB-D sensor posture was optimized. The test results showed that the Intersection over Union (IoU) of green asparagus segmentation with S2CPL reaches 98.0 %, which outperforms YOLOv5 + uNet, YOLOv7 + uNet and YOLOv8-tiny by 5.60 %, 4.59 % and 1.34 % respectively. The average detection time per image was only 2.0 ms, and the GFLOPS was improved by 23.90 %, 88.49 % and 7.63 % compared with other models. The relative error of the length and diameter were less than 2.98 % and 2.15 %, respectively. The accuracy of location the subsoil cutting-point is more than 99.0 %, and the horizontal positioning error and depth positioning error of cutting-points were less than 6.0 mm and 7.4 mm. 
The proposed model is of strong robustness even dealing with partial occlusion and motion blur and is suitable with limited computing power to meet the needs of Robotic selective harvesting.</p></div>","PeriodicalId":50627,"journal":{"name":"Computers and Electronics in Agriculture","volume":null,"pages":null},"PeriodicalIF":7.7000,"publicationDate":"2024-08-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"S2CPL: A novel method of the harvest evaluation and subsoil 3D cutting-Point location for selective harvesting of green asparagus\",\"authors\":\"\",\"doi\":\"10.1016/j.compag.2024.109316\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>Robotic selective harvesting is an ideal method for bionic manual harvest of green asparagus. However, the harvesting robot encounters difficulties in evaluating the suitable harvest due to the tilt and bending of the long stem, as well as determining the precise location of the subsoil cutting-point to prevent damage from bacteria on the cutting surface. This paper proposed the S2CPL model to address the challenges of the harvest evaluation and 3D localization of subsoil cutting-point for selective harvesting of green asparagus in field conditions. Firstly, an RGB-D sensor was used to acquire images and depth information of green asparaguses. Secondly, the improved YOLOv8 by introduced lightweight convolution and attention mechanisms in the feature fusion module to enhance the segmentation accuracy. Thirdly, a 3D morphology extraction method was proposed to calculate the length and diameter of green asparagus by utilizing the image mask fusion with depth information. Finally, harvest evaluation and subsoil 3D cutting-point location were achieved for robotic selective harvesting. In addition, the RGB-D sensor posture was optimized. 
The test results showed that the Intersection over Union (IoU) of green asparagus segmentation with S2CPL reaches 98.0 %, which outperforms YOLOv5 + uNet, YOLOv7 + uNet and YOLOv8-tiny by 5.60 %, 4.59 % and 1.34 % respectively. The average detection time per image was only 2.0 ms, and the GFLOPS was improved by 23.90 %, 88.49 % and 7.63 % compared with other models. The relative error of the length and diameter were less than 2.98 % and 2.15 %, respectively. The accuracy of location the subsoil cutting-point is more than 99.0 %, and the horizontal positioning error and depth positioning error of cutting-points were less than 6.0 mm and 7.4 mm. The proposed model is of strong robustness even dealing with partial occlusion and motion blur and is suitable with limited computing power to meet the needs of Robotic selective harvesting.</p></div>\",\"PeriodicalId\":50627,\"journal\":{\"name\":\"Computers and Electronics in Agriculture\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":7.7000,\"publicationDate\":\"2024-08-12\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computers and Electronics in Agriculture\",\"FirstCategoryId\":\"97\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0168169924007075\",\"RegionNum\":1,\"RegionCategory\":\"农林科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"AGRICULTURE, MULTIDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computers and Electronics in 
Agriculture","FirstCategoryId":"97","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0168169924007075","RegionNum":1,"RegionCategory":"农林科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AGRICULTURE, MULTIDISCIPLINARY","Score":null,"Total":0}
S2CPL: A novel method of the harvest evaluation and subsoil 3D cutting-Point location for selective harvesting of green asparagus
Robotic selective harvesting is an ideal way to mimic manual harvesting of green asparagus. However, a harvesting robot has difficulty judging whether a spear is ready for harvest, because the long stem tilts and bends, and in determining the precise location of the subsoil cutting point, which must lie below ground to prevent bacterial damage to the cut surface. This paper proposes the S2CPL model to address harvest evaluation and 3D localization of the subsoil cutting point for selective harvesting of green asparagus under field conditions. First, an RGB-D sensor acquires images and depth information of the green asparagus spears. Second, YOLOv8 is improved by introducing lightweight convolution and attention mechanisms into the feature fusion module to enhance segmentation accuracy. Third, a 3D morphology extraction method is proposed that calculates the length and diameter of each spear by fusing the segmentation mask with depth information. Finally, harvest evaluation and subsoil 3D cutting-point location are performed for robotic selective harvesting. In addition, the RGB-D sensor posture was optimized. Test results show that the Intersection over Union (IoU) of green asparagus segmentation with S2CPL reaches 98.0 %, outperforming YOLOv5 + uNet, YOLOv7 + uNet and YOLOv8-tiny by 5.60 %, 4.59 % and 1.34 %, respectively. The average detection time per image was only 2.0 ms, and GFLOPS was improved by 23.90 %, 88.49 % and 7.63 % compared with the other models. The relative errors of length and diameter were less than 2.98 % and 2.15 %, respectively. The accuracy of locating the subsoil cutting point exceeds 99.0 %, and the horizontal and depth positioning errors of the cutting points were less than 6.0 mm and 7.4 mm, respectively. The proposed model remains robust even under partial occlusion and motion blur, and it runs with limited computing power, meeting the needs of robotic selective harvesting.
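The morphology and cutting-point steps described above can be sketched as follows. This is a minimal illustration under a pinhole-camera model, not the paper's implementation: the function names, camera intrinsics, harvest thresholds, and subsoil cut depth are all assumptions, and a bent spear would need a skeleton-based arc length rather than the straight-line approximation used here.

```python
import numpy as np

def spear_morphology(mask, depth, fx, fy, cx, cy):
    """Estimate spear length and diameter (metres) from a binary
    segmentation mask and an aligned depth map (metres).
    fx, fy, cx, cy are assumed RGB-D camera intrinsics."""
    ys, xs = np.nonzero(mask)
    z = depth[ys, xs]
    valid = z > 0            # discard pixels with missing depth
    xs, ys, z = xs[valid], ys[valid], z[valid]
    # Back-project mask pixels to 3D camera coordinates (pinhole model).
    X = (xs - cx) * z / fx
    Y = (ys - cy) * z / fy
    pts = np.stack([X, Y, z], axis=1)
    # Length: 3D distance between the topmost and bottommost mask pixels
    # (straight-spear approximation).
    top = pts[np.argmin(ys)]
    bottom = pts[np.argmax(ys)]
    length = float(np.linalg.norm(bottom - top))
    # Diameter: per-row mask width in pixels, converted to metres with
    # the local depth, then take the median over rows.
    diameters = []
    for row in np.unique(ys):
        sel = ys == row
        width_px = xs[sel].max() - xs[sel].min() + 1
        diameters.append(width_px * np.median(z[sel]) / fx)
    return length, float(np.median(diameters))

def ready_for_harvest(length, diameter, min_len=0.20, min_dia=0.008):
    # Hypothetical harvest-evaluation thresholds (20 cm, 8 mm).
    return length >= min_len and diameter >= min_dia

def subsoil_cutting_point(base_pt, axis_dir, subsoil_depth=0.02):
    # Hypothetical cutting-point placement: extend the spear axis past
    # its soil-level base point by subsoil_depth metres, so the cut
    # surface lies underground.
    axis_dir = axis_dir / np.linalg.norm(axis_dir)
    return base_pt + axis_dir * subsoil_depth
```

In practice the mask would come from the improved YOLOv8 segmentation and the depth map from the RGB-D sensor; `ready_for_harvest` then gates which spears are cut, and `subsoil_cutting_point` supplies the 3D target for the end effector.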
About the journal:
Computers and Electronics in Agriculture provides international coverage of advancements in computer hardware, software, electronic instrumentation, and control systems applied to agricultural challenges. Encompassing agronomy, horticulture, forestry, aquaculture, and animal farming, the journal publishes original papers, reviews, and applications notes. It explores the use of computers and electronics in plant or animal agricultural production, covering topics like agricultural soils, water, pests, controlled environments, and waste. The scope extends to on-farm post-harvest operations and relevant technologies, including artificial intelligence, sensors, machine vision, robotics, networking, and simulation modeling. Its companion journal, Smart Agricultural Technology, continues the focus on smart applications in production agriculture.