AI-driven real-time weed detection and robotic smart spraying for optimised performance and operational speed in vegetable production

Vinay Vijayakumar, Yiannis Ampatzidis, Christian Lacerda, Tom Burks, Won Suk Lee, John Schueller

Biosystems Engineering, Volume 259, Article 104288. Published 2025-09-16.
DOI: 10.1016/j.biosystemseng.2025.104288
URL: https://www.sciencedirect.com/science/article/pii/S1537511025002247
Citations: 0
Abstract
For effective weed control in vegetable farms, enhancing precision spraying through improved real-time detection is crucial. Over the years, weed detection studies have evolved from traditional feature-based methods to deep learning approaches, particularly convolutional neural networks (CNNs). While numerous studies have focused on improving detection accuracy by experimenting with different backbones, architectures, and hyperparameter tuning, fewer have addressed the real-time implementation of these models in field conditions. Existing research primarily benchmarks model inference speed but often neglects broader algorithmic efficiency, which includes sensor data integration, processing pipelines, and microcontroller output handling. Furthermore, real-world deployment challenges, such as camera performance at different robot speeds, the optimal operational range for high detection accuracy, and the end-to-end latency of the machine vision system, remain underexplored. This study addresses these gaps by training a custom YOLOv8 nano model to detect three weed types (broadleaf, nutsedge, and grass) and two crop types (pepper and tomato) in plasticulture beds. The system runs on a robotic smart sprayer in real time, integrating GPS and camera data while transmitting control signals to the microcontroller. Beyond detection performance, we evaluate the entire processing pipeline by measuring the total loop time and its variation with the number of detections per frame. Additionally, we determined the optimal robot operational speed, finding that 0.45–0.89 m s−1 provides the best balance between detection accuracy and system responsiveness. By focusing on end-to-end real-time performance on vegetable beds, this study provides insights into the practical deployment of smart spraying, an aspect often overlooked in prior research.
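The "total loop time" metric the abstract describes covers more than model inference: it also folds in sensor data integration and microcontroller output handling. A minimal sketch of such a per-frame timing loop is shown below; the stub `detect` function, frame dictionaries, and GPS coordinates are all hypothetical stand-ins (the paper's actual YOLOv8 nano model and sprayer interface are not reproduced here).

```python
import time
from statistics import mean

def detect(frame):
    """Stub detector standing in for the custom YOLOv8 nano model
    (hypothetical; returns (class, bbox) pairs per frame)."""
    return [("broadleaf", (10, 10, 50, 50))] * frame["n_weeds"]

def run_loop(frames):
    """Time each full iteration: inference plus the downstream steps the
    paper groups under algorithmic efficiency (sensor fusion and the
    spray-control signal), both stubbed here."""
    loop_times = []
    for frame in frames:
        t0 = time.perf_counter()
        detections = detect(frame)                      # model inference
        fused = {"gps": frame["gps"], "dets": detections}  # GPS + camera integration (stub)
        spray_on = len(fused["dets"]) > 0               # signal to microcontroller (stub)
        loop_times.append(time.perf_counter() - t0)
    return loop_times

# Frames with varying detection counts, to probe how loop time scales
# with the number of detections per frame, as the study measures.
frames = [{"n_weeds": n, "gps": (27.0, -81.0)} for n in (0, 2, 5)]
times = run_loop(frames)
print(f"mean loop time: {mean(times) * 1e3:.3f} ms over {len(times)} frames")
```

Averaging the loop time over many frames, and binning it by detection count, is one straightforward way to reproduce the kind of end-to-end latency profile the study reports, rather than benchmarking inference alone.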
Journal overview:
Biosystems Engineering publishes research in engineering and the physical sciences that represent advances in understanding or modelling of the performance of biological systems for sustainable developments in land use and the environment, agriculture and amenity, bioproduction processes and the food chain. The subject matter of the journal reflects the wide range and interdisciplinary nature of research in engineering for biological systems.