An autonomous navigation method for field phenotyping robot based on ground-air collaboration

Zikang Zhang, Zhengda Li, Meng Yang, Jiale Cui, Yang Shao, Youchun Ding, Wanneng Yang, Wen Qiao, Peng Song

Artificial Intelligence in Agriculture, 15(4), pp. 610–621. Published 2025-05-30. DOI: 10.1016/j.aiia.2025.05.005. Available at: https://www.sciencedirect.com/science/article/pii/S2589721725000601
High-throughput phenotypic data collection technology is critical to the efficiency of crop breeding. This study introduces a novel autonomous navigation method for phenotyping robots that leverages ground-air collaboration to meet the demands of unmanned crop phenotypic data collection. The proposed method employs a UAV equipped with a Real-Time Kinematic (RTK) module to construct high-precision field maps. It uses the SegFormer-B0 semantic segmentation model to detect crop rows, extracts key coordinate points of those rows, and generates navigation paths for the phenotyping robot by mapping these points to actual geographic coordinates. Furthermore, an adaptive controller based on the Pure Pursuit algorithm is proposed, which dynamically adjusts the steering angle of the phenotyping robot in real time according to the distance (d), angular deviation (α), and lateral deviation (e_y) between the robot's current position and its target position. This enables the robot to accurately track paths in field environments. The results demonstrate that the mean absolute error (MAE) of the proposed method in extracting row centerlines is 2.83 cm in the potted-plant area and 4.51 cm in the cropland. Most global path-tracking errors stay within 2 cm: in the potted-plant area, 99.1 % of errors lie within this range, with a mean absolute error of 0.62 cm and a maximum error of 2.59 cm; in the cropland, 72.4 % of errors lie within this range, with a mean absolute error of 1.51 cm and a maximum error of 4.22 cm.
Compared with traditional GNSS-based navigation methods and vision-only methods, the proposed method shows significant advantages in adapting to the dynamic growth of crops and to complex field environments. It not only ensures that the phenotyping robot accurately travels along the crop rows during field operations, avoiding damage to the crops, but also provides an efficient and accurate means of data acquisition for crop phenotyping.
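The path-generation step maps row keypoints detected in the UAV imagery from pixel coordinates to actual geographic coordinates. The abstract does not specify the transform, but a minimal sketch, assuming a georeferenced orthomosaic with a uniform ground sampling distance, could look like the following (the function name, parameters, and rotation convention are illustrative, not from the paper):

```python
import math

def pixel_to_geo(px, py, origin_e, origin_n, gsd, heading):
    """Map a pixel coordinate in a UAV orthomosaic to planar
    geographic (easting, northing) coordinates.

    Assumptions (illustrative, not from the paper):
      origin_e, origin_n -- map coordinates of the top-left pixel
      gsd                -- ground sampling distance in m/pixel
      heading            -- map rotation in radians (0 = north-up)
    """
    # Scale the pixel offset to metres, then rotate it into the
    # map frame. Image y grows downward, so northing decreases.
    dx = px * gsd
    dy = py * gsd
    east = origin_e + dx * math.cos(heading) - dy * math.sin(heading)
    north = origin_n - (dx * math.sin(heading) + dy * math.cos(heading))
    return east, north
```

For a north-up mosaic (heading = 0) this reduces to a simple scale-and-offset, which is the common case when the orthomosaic is exported in a projected coordinate system.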
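The adaptive controller builds on the Pure Pursuit algorithm, whose geometric core derives a steering angle from the distance d and angular deviation α to a lookahead point. The sketch below shows that standard geometry with a heuristic correction term for the lateral deviation e_y; the wheelbase, the gain, and the exact way e_y enters the control law are assumptions for illustration, not the adaptive controller reported in the paper:

```python
import math

def pure_pursuit_steering(d, alpha, e_y, wheelbase=1.0, k_y=0.5):
    """Pure Pursuit steering angle plus a lateral-error correction.

    d     -- distance to the lookahead (target) point, in metres
    alpha -- angular deviation between heading and the target, in rad
    e_y   -- lateral deviation from the path, in metres
    wheelbase and k_y are illustrative assumptions.
    """
    # Geometric Pure Pursuit: curvature of the circular arc that
    # passes through the robot and the lookahead point.
    kappa = 2.0 * math.sin(alpha) / d
    # Bicycle-model steering angle for that curvature.
    delta = math.atan(wheelbase * kappa)
    # Heuristic extra correction proportional to lateral deviation.
    return delta + k_y * e_y
```

With α = 0 and e_y = 0 the steering command is zero, and the command grows with either deviation, which matches the qualitative behaviour the abstract describes: the steering angle is adjusted continuously from d, α, and e_y.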