Development of an autonomous navigation system for orchard spraying robots integrating a thermal camera and LiDAR using a deep learning algorithm under low- and no-light conditions
{"title":"Development of an autonomous navigation system for orchard spraying robots integrating a thermal camera and LiDAR using a deep learning algorithm under low- and no-light conditions","authors":"Ailian Jiang , Tofael Ahamed","doi":"10.1016/j.compag.2025.110359","DOIUrl":null,"url":null,"abstract":"<div><div>Pesticide spraying is an important part of agricultural production that directly affects crop yields. However, with the agricultural labor force aging globally, conventional spraying methods face challenges due to the shortage of skilled operators in orchard management. Moreover, conventional methods usually waste a large amount of pesticides, which not only increase the production cost but also results in severe environmental pollution from pesticide residues. Furthermore, it affects food safety and the ecological balance. Therefore, this study proposes a new pesticide spraying robot that can attract pests to approach a specific spraying device via light and pheromones, thus improving the accuracy of pesticide application and reducing the amount of unnecessary spraying. For the spraying robot to work properly at night when insects are active, the spraying system needs to have the ability to navigate autonomously without being affected by light. Therefore, this study uses a thermal camera and light detection and ranging (LiDAR) as sensors for navigation, target detection and image segmentation via YOLACT (You Only Look At CoefficienTs) deep learning and fuses accurate distance data from LiDAR to realize real-time navigation of the vehicle according to the position of the trees in the orchard. This method can ensure accurate navigation of the vehicle in a dense canopy orchard environment and can enable the vehicle to operate safely under low-light and no-light conditions. The real-time navigation system proposed in this study was tested during the day and night, first in an artificial tree orchard and then in a real orchard. The experimental results revealed that in the artificial tree orchard, the image segmentation mean average precision (mAP) of the box was 83.74 %, that of the mask was 81.4 %, and the average positional error from the target travel path was 0.21 m. In the real orchard, the image segmentation mAP of the box was 62.03 %, that of the mask was 58.82 %, and the average positional error was 0.20 m. The system exhibited good stability under different light conditions, including low light and no light, in orchards and provides a solution for the development of night-time applications to control insects with reduced pesticide contents.</div></div>","PeriodicalId":50627,"journal":{"name":"Computers and Electronics in Agriculture","volume":"235 ","pages":"Article 110359"},"PeriodicalIF":7.7000,"publicationDate":"2025-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computers and Electronics in Agriculture","FirstCategoryId":"97","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S016816992500465X","RegionNum":1,"RegionCategory":"农林科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AGRICULTURE, MULTIDISCIPLINARY","Score":null,"Total":0}
Citations: 0
Abstract
Pesticide spraying is an important part of agricultural production that directly affects crop yields. However, with the agricultural labor force aging globally, conventional spraying methods face challenges due to the shortage of skilled operators in orchard management. Moreover, conventional methods usually waste a large amount of pesticide, which not only increases production costs but also causes severe environmental pollution from pesticide residues, affecting food safety and the ecological balance. Therefore, this study proposes a new pesticide spraying robot that attracts pests toward a specific spraying device via light and pheromones, thereby improving the accuracy of pesticide application and reducing unnecessary spraying. For the spraying robot to work properly at night, when insects are active, the spraying system must be able to navigate autonomously without being affected by ambient light. This study therefore uses a thermal camera and light detection and ranging (LiDAR) as navigation sensors: target detection and image segmentation are performed with the YOLACT (You Only Look At CoefficienTs) deep learning model, and accurate distance data from the LiDAR are fused with the segmentation results to realize real-time navigation of the vehicle according to the positions of the trees in the orchard. This method ensures accurate navigation in a dense-canopy orchard environment and enables the vehicle to operate safely under low-light and no-light conditions. The proposed real-time navigation system was tested during the day and at night, first in an artificial tree orchard and then in a real orchard. The experimental results revealed that in the artificial tree orchard, the image segmentation mean average precision (mAP) was 83.74 % for the bounding box and 81.4 % for the mask, and the average positional error from the target travel path was 0.21 m. In the real orchard, the mAP was 62.03 % for the bounding box and 58.82 % for the mask, and the average positional error was 0.20 m. The system exhibited good stability in orchards under different light conditions, including low light and no light, and provides a solution for developing night-time applications that control insects with reduced pesticide use.
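To make the fusion step in the abstract concrete, the sketch below shows one plausible way to combine per-tree segmentation masks (such as those produced by YOLACT on a thermal image) with a registered LiDAR range image to estimate the vehicle's lateral offset from the tree-row centerline. This is a minimal illustration under stated assumptions, not the authors' implementation: the function name, array shapes, field of view, and thresholds are all hypothetical, and the real system would also handle heading estimation and control.

```python
# Hedged sketch: fuse tree-trunk masks (e.g., from YOLACT on a thermal image)
# with a LiDAR range image registered to the same view, and estimate the
# vehicle's lateral offset from the row centerline. All names, shapes and
# thresholds here are illustrative assumptions.
import numpy as np

def row_centerline_offset(masks, ranges, fov_deg=90.0):
    """masks:  (N, H, W) boolean tree-trunk masks from the thermal image.
    ranges: (H, W) LiDAR ranges aligned to the image, in metres (0 = no return).
    Returns the signed lateral offset of the row centerline in metres
    (negative means the centerline lies to the vehicle's left)."""
    h, w = ranges.shape
    # Horizontal viewing angle of every image column (assumes a rectified, registered pair).
    angles = np.deg2rad(np.linspace(-fov_deg / 2, fov_deg / 2, w))

    lateral = []  # signed lateral position of each detected tree
    for m in masks:
        valid = m & (ranges > 0)          # pixels with both a mask hit and a LiDAR return
        if valid.sum() < 20:              # arbitrary minimum-support threshold
            continue
        r = np.median(ranges[valid])                      # robust range to this tree
        col = int(np.median(np.nonzero(valid)[1]))        # robust image column
        lateral.append(r * np.sin(angles[col]))           # lateral (sideways) coordinate

    lateral = np.asarray(lateral)
    left, right = lateral[lateral < 0], lateral[lateral > 0]
    if len(left) == 0 or len(right) == 0:
        return 0.0                        # cannot see both rows; hold current heading
    return float((left.mean() + right.mean()) / 2.0)      # steer to drive this toward zero

# Toy usage with synthetic data: one fake tree on each side, both 3 m away.
if __name__ == "__main__":
    H, W = 64, 128
    ranges = np.zeros((H, W))
    masks = np.zeros((2, H, W), dtype=bool)
    masks[0, 20:40, 10:20] = True;   ranges[20:40, 10:20] = 3.0    # left-row tree
    masks[1, 20:40, 100:110] = True; ranges[20:40, 100:110] = 3.0  # right-row tree
    print(round(row_centerline_offset(masks, ranges), 3))
```

In this reading, the segmentation network supplies *which* pixels belong to trees while the LiDAR supplies *how far* those pixels are, and the controller only needs the resulting offset (comparable to the 0.20-0.21 m positional errors reported in the abstract) to keep the vehicle centered between the rows regardless of ambient light.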
Journal Introduction:
Computers and Electronics in Agriculture provides international coverage of advancements in computer hardware, software, electronic instrumentation, and control systems applied to agricultural challenges. Encompassing agronomy, horticulture, forestry, aquaculture, and animal farming, the journal publishes original papers, reviews, and application notes. It explores the use of computers and electronics in plant or animal agricultural production, covering topics such as agricultural soils, water, pests, controlled environments, and waste. The scope extends to on-farm post-harvest operations and relevant technologies, including artificial intelligence, sensors, machine vision, robotics, networking, and simulation modeling. Its companion journal, Smart Agricultural Technology, continues the focus on smart applications in production agriculture.