A novel fusion positioning navigation system for greenhouse strawberry spraying robot using LiDAR and ultrasonic tags

Haoran Tan, Xueguan Zhao, Hao Fu, Minli Yang, Changyuan Zhai

Agriculture Communications, Volume 3, Issue 2, June 2025, Article 100087. DOI: 10.1016/j.agrcom.2025.100087
Abstract
Autonomous navigation for greenhouse spraying robots improves operational efficiency and reduces human workload. However, navigation solutions based on Light Detection and Ranging (LiDAR) Simultaneous Localization and Mapping (SLAM) still face challenges such as mapping distortion caused by crop feature similarity, gradual accumulation of positioning errors, and positioning jumps, and therefore fail to meet the positioning accuracy demands of agricultural robotic operations. This paper proposed an autonomous navigation method for greenhouse spraying robots that integrated three-dimensional (3D) LiDAR and ultrasonic tags into SLAM technology. The proposed approach generated a 3D point cloud map of the greenhouse environment through loosely coupled data fusion of a 3D LiDAR and an Inertial Measurement Unit (IMU). Robot relocalization and navigation trajectory recording were performed using the pre-built point cloud map and ultrasonic tags. To further enhance positioning accuracy and robustness, a tightly coupled framework combining LiDAR and ultrasonic tags was designed, incorporating an improved Iterative Closest Point (ICP) method and a Singular Value Decomposition (SVD) algorithm for precise registration-based positioning. The SLAM mapping trajectories and navigation performance were validated in a standardized strawberry greenhouse. Results showed that at speeds of 0.2 m/s, 0.4 m/s, and 0.6 m/s, the maximum average absolute pose error between the positioning trajectory and the ground truth was 0.357 m, with a standard deviation of 0.148 m. Compared with the Cartographer and Tightly-coupled Lidar Inertial Odometry via Smoothing and Mapping (LIO-SAM) methods, the improved method reduced the average positioning error by 32.0% and 14.0%, respectively. Navigation tests demonstrated that the robot's maximum lateral error was 0.045 m, with a maximum average lateral positioning error of 0.022 m. These results confirm that the robot's positioning and navigation accuracy satisfies the requirements for autonomous operation in greenhouse spraying, providing a reliable solution for autonomous navigation in structured agricultural environments.
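To illustrate the registration step the abstract refers to, the sketch below shows the standard closed-form SVD alignment (Kabsch/Umeyama without scale) used inside ICP-style pipelines to estimate the rigid transform between two sets of corresponding 3D points. This is a generic textbook routine in Python/NumPy, not the paper's improved ICP or its LiDAR/ultrasonic-tag fusion; the function name and the synthetic test data are illustrative assumptions.

```python
# Minimal sketch of SVD-based rigid registration, the closed-form step inside
# ICP-style pipelines. Generic textbook method, NOT the paper's improved ICP
# or its ultrasonic-tag fusion; names and test data here are illustrative.
import numpy as np

def svd_rigid_transform(source: np.ndarray, target: np.ndarray):
    """Estimate rotation R and translation t minimizing sum ||R p_i + t - q_i||^2.

    source, target: (N, 3) arrays of corresponding 3D points (p_i, q_i).
    """
    src_centroid = source.mean(axis=0)
    tgt_centroid = target.mean(axis=0)
    src_centered = source - src_centroid
    tgt_centered = target - tgt_centroid

    # Cross-covariance matrix between the two centered point sets.
    H = src_centered.T @ tgt_centered
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    # Guard against a reflection (det = -1) in the least-squares solution.
    if np.linalg.det(R) < 0:
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = tgt_centroid - R @ src_centroid
    return R, t

if __name__ == "__main__":
    # Synthetic check: recover a known rotation/translation from noisy points.
    rng = np.random.default_rng(0)
    pts = rng.uniform(-5, 5, size=(200, 3))
    angle = np.deg2rad(10.0)
    R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                       [np.sin(angle),  np.cos(angle), 0.0],
                       [0.0, 0.0, 1.0]])
    t_true = np.array([0.3, -0.1, 0.05])
    moved = pts @ R_true.T + t_true + rng.normal(scale=0.01, size=pts.shape)
    R_est, t_est = svd_rigid_transform(pts, moved)
    print("rotation error:", np.linalg.norm(R_est - R_true))
    print("translation error:", np.linalg.norm(t_est - t_true))
```

The determinant check is what keeps the least-squares solution a proper rotation rather than a mirror transform; in a full ICP loop this closed-form solve is repeated after each correspondence update until the alignment converges.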