{"title":"面向农业场景定位的GNSS/IMU/vision紧密耦合集成系统","authors":"Jiayuan Yu , Hui Fang , Xiya Zhang , Wentao Wu , Yong He","doi":"10.1016/j.compag.2025.110478","DOIUrl":null,"url":null,"abstract":"<div><div>The traditional satellite-based positioning technology is widely used in the field of automatic navigation of agricultural machinery, but the meter-level accuracy of the Global Navigation Satellite System (GNSS) single-point positioning does not meet the needs of agriculture. Although the Real-Time Kinematic (RTK) carrier-phase differential positioning technology can achieve centimeter-level accuracy, it is associated with high costs. Compared with the GNSS-only positioning, multi-sensor fusion, which integrates cameras, Inertial Measurement Units (IMU), and other inexpensive sensors, offers significant advantages. This paper proposes a low-cost multi-sensor fusion system suitable for positioning in agricultural scenarios, which tightly couples information from GNSS, IMU, and vision. The raw data from each sensor are preprocessed, including feature tracking, IMU pre-integration, and screening of unhealthy, low-elevation satellites. A factor graph-based optimization model is developed to derive a drift-free global trajectory estimation. The factor graph incorporates visual, IMU, code pseudo-range, Doppler, and clock factors. To adapt to agricultural environments, two improvements to the visual factor are made. We introduce an IMU-assisted optical flow method to mitigate the impact of dynamic noise, such as wind-blown crops and pedestrians, on feature tracking. Additionally, we eliminate the inverse depth state quantity of the far-away feature points during the factor graph optimization process to reduce translation and scale errors introduced in the pose optimization. The system was tested in three agricultural scenarios, yielding results that demonstrated the ability of the approach to achieve an absolute localization accuracy of 0.87 m and a relative localization accuracy of 0.090 m.</div></div>","PeriodicalId":50627,"journal":{"name":"Computers and Electronics in Agriculture","volume":"239 ","pages":"Article 110478"},"PeriodicalIF":8.9000,"publicationDate":"2025-09-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Tightly coupled GNSS/IMU/vision integrated system for positioning in agricultural scenarios\",\"authors\":\"Jiayuan Yu , Hui Fang , Xiya Zhang , Wentao Wu , Yong He\",\"doi\":\"10.1016/j.compag.2025.110478\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>The traditional satellite-based positioning technology is widely used in the field of automatic navigation of agricultural machinery, but the meter-level accuracy of the Global Navigation Satellite System (GNSS) single-point positioning does not meet the needs of agriculture. Although the Real-Time Kinematic (RTK) carrier-phase differential positioning technology can achieve centimeter-level accuracy, it is associated with high costs. Compared with the GNSS-only positioning, multi-sensor fusion, which integrates cameras, Inertial Measurement Units (IMU), and other inexpensive sensors, offers significant advantages. This paper proposes a low-cost multi-sensor fusion system suitable for positioning in agricultural scenarios, which tightly couples information from GNSS, IMU, and vision. The raw data from each sensor are preprocessed, including feature tracking, IMU pre-integration, and screening of unhealthy, low-elevation satellites. 
A factor graph-based optimization model is developed to derive a drift-free global trajectory estimation. The factor graph incorporates visual, IMU, code pseudo-range, Doppler, and clock factors. To adapt to agricultural environments, two improvements to the visual factor are made. We introduce an IMU-assisted optical flow method to mitigate the impact of dynamic noise, such as wind-blown crops and pedestrians, on feature tracking. Additionally, we eliminate the inverse depth state quantity of the far-away feature points during the factor graph optimization process to reduce translation and scale errors introduced in the pose optimization. The system was tested in three agricultural scenarios, yielding results that demonstrated the ability of the approach to achieve an absolute localization accuracy of 0.87 m and a relative localization accuracy of 0.090 m.</div></div>\",\"PeriodicalId\":50627,\"journal\":{\"name\":\"Computers and Electronics in Agriculture\",\"volume\":\"239 \",\"pages\":\"Article 110478\"},\"PeriodicalIF\":8.9000,\"publicationDate\":\"2025-09-19\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computers and Electronics in Agriculture\",\"FirstCategoryId\":\"97\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0168169925005848\",\"RegionNum\":1,\"RegionCategory\":\"农林科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"AGRICULTURE, MULTIDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computers and Electronics in Agriculture","FirstCategoryId":"97","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0168169925005848","RegionNum":1,"RegionCategory":"农林科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AGRICULTURE, MULTIDISCIPLINARY","Score":null,"Total":0}
Tightly coupled GNSS/IMU/vision integrated system for positioning in agricultural scenarios
Satellite-based positioning is widely used for automatic navigation of agricultural machinery, but the meter-level accuracy of Global Navigation Satellite System (GNSS) single-point positioning does not meet agricultural requirements. Although Real-Time Kinematic (RTK) carrier-phase differential positioning can achieve centimeter-level accuracy, it comes at a high cost. Compared with GNSS-only positioning, multi-sensor fusion that integrates cameras, Inertial Measurement Units (IMUs), and other inexpensive sensors offers significant advantages. This paper proposes a low-cost multi-sensor fusion system for positioning in agricultural scenarios that tightly couples information from GNSS, IMU, and vision. The raw data from each sensor are preprocessed, including feature tracking, IMU pre-integration, and screening of unhealthy and low-elevation satellites. A factor graph-based optimization model is developed to derive a drift-free global trajectory estimate. The factor graph incorporates visual, IMU, code pseudo-range, Doppler, and clock factors. To adapt to agricultural environments, two improvements are made to the visual factor. First, an IMU-assisted optical flow method mitigates the impact of dynamic noise, such as wind-blown crops and pedestrians, on feature tracking. Second, the inverse-depth states of far-away feature points are eliminated during factor graph optimization to reduce the translation and scale errors they would introduce in pose optimization. The system was tested in three agricultural scenarios; the results demonstrate that the approach achieves an absolute localization accuracy of 0.87 m and a relative localization accuracy of 0.090 m.
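The abstract does not give implementation details, but the IMU-assisted optical flow idea can be illustrated with a short sketch of one common way to realize such assistance: gyroscope measurements between two frames are integrated into a rotation, the rotation predicts where each tracked feature should reappear (neglecting inter-frame translation), and the prediction seeds a pyramidal Lucas-Kanade tracker. The code below is a minimal illustration under stated assumptions, not the authors' implementation: the function names, the calibrated pinhole intrinsic matrix K, the gyroscope-integrated rotation R_c1_c0, and the 30-pixel rejection threshold are all illustrative.

```python
import cv2
import numpy as np

def predict_with_imu(prev_pts, K, R_c1_c0):
    """Predict feature locations in the new frame from the IMU rotation alone.

    prev_pts: (N, 2) pixel coordinates in the previous frame.
    K:        3x3 camera intrinsic matrix (assumed known from calibration).
    R_c1_c0:  rotation taking bearings from the previous camera frame into the
              new one, obtained by integrating gyroscope readings between frames.
    """
    pts_h = cv2.convertPointsToHomogeneous(prev_pts).reshape(-1, 3).T  # 3xN
    rays = np.linalg.inv(K) @ pts_h   # back-project pixels to bearing rays
    rays = R_c1_c0 @ rays             # rotate rays into the new camera frame
    pred = K @ rays                   # re-project (pure rotation, depth cancels)
    pred = (pred[:2] / pred[2]).T     # normalize to pixel coordinates
    return pred.astype(np.float32)

def imu_assisted_klt(prev_img, curr_img, prev_pts, K, R_c1_c0):
    """Track features with pyramidal LK, seeded by the IMU-predicted positions."""
    init = predict_with_imu(prev_pts, K, R_c1_c0).reshape(-1, 1, 2)
    curr_pts, status, _ = cv2.calcOpticalFlowPyrLK(
        prev_img, curr_img,
        prev_pts.reshape(-1, 1, 2).astype(np.float32),
        init,                             # initial guess from the IMU
        winSize=(21, 21), maxLevel=3,
        flags=cv2.OPTFLOW_USE_INITIAL_FLOW)
    # Features whose tracked position deviates strongly from the rotation-only
    # prediction are likely on dynamic objects (wind-blown crops, pedestrians)
    # and can be down-weighted or rejected in the visual factor.
    residual = np.linalg.norm(curr_pts.reshape(-1, 2) - init.reshape(-1, 2), axis=1)
    inliers = (status.ravel() == 1) & (residual < 30.0)  # threshold is an assumption
    return curr_pts.reshape(-1, 2), inliers
```

Seeding the tracker with a rotation-only prediction keeps the search window small during fast turns, and the residual between prediction and tracked position gives a simple cue for rejecting features on moving vegetation before they enter the factor graph.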
Journal Introduction:
Computers and Electronics in Agriculture provides international coverage of advancements in computer hardware, software, electronic instrumentation, and control systems applied to agricultural challenges. Encompassing agronomy, horticulture, forestry, aquaculture, and animal farming, the journal publishes original papers, reviews, and application notes. It explores the use of computers and electronics in plant or animal agricultural production, covering topics such as agricultural soils, water, pests, controlled environments, and waste. The scope extends to on-farm post-harvest operations and relevant technologies, including artificial intelligence, sensors, machine vision, robotics, networking, and simulation modeling. Its companion journal, Smart Agricultural Technology, continues the focus on smart applications in production agriculture.