Mobile robot for leaf disease detection and precise spraying: Convolutional neural networks integration and path planning

Youssef Bouhaja, Hatim Bamoumen, Israe Derdak, Safiyah Sheikh, Moulay El Hassan El Azhari, Hamza El Hafdaoui

Scientific African, Volume 28, Article e02717 (published 2025-04-22)
DOI: 10.1016/j.sciaf.2025.e02717
https://www.sciencedirect.com/science/article/pii/S2468227625001875
Abstract
Plant diseases are a major cause of crop yield and quality losses, particularly in tomatoes, where bacterial, fungal, and viral diseases significantly impact production. Traditional disease detection methods are slow and prone to human error, limiting their use in large-scale agriculture. This study presents a mobile robot equipped with a custom convolutional neural network (CNN)-based system for early-stage disease detection and pesticide spraying; the model was trained and tested on 13,191 tomato leaf images using an 80:20 train-test split. The robot features a Raspberry Pi (ARM Cortex-A72, 1.5 GHz, 4 GB RAM) for processing, an RGB camera (12 MP, 30 fps), and a LiDAR module (360° coverage, 12 m range, 0.1° resolution) for navigation. The pesticide spraying mechanism is driven by an Arduino-controlled stepper motor (1.8° step angle) with precise 180° movement for targeted application. The system was evaluated in terms of performance and efficiency, cost-effectiveness, environmental impact, and sensitivity to operating conditions. In navigation tests, the robot maintained a path deviation of no more than 1 cm in open fields and adjusted its path quickly in dynamic environments, detecting obstacles within 150 milliseconds. The CNN achieved a precision of 95 % after just 50 epochs of training, with a real-time latency of 0.015 s per image classification, outperforming the highest precision reported in the literature (91 % at 70 epochs, with latency exceeding 0.028 s). Validation accuracy remained between 85 % and 90 %, indicating strong generalization. Classification metrics showed exceptional performance, with accuracy, precision, recall, and F1-scores all exceeding 91 % across 10 tomato leaf classes. The confusion matrix showed minimal misclassification, and the receiver operating characteristic curve confirmed the model’s strong ability to differentiate between healthy and diseased leaves, with area under the curve values exceeding 0.90. Energy consumption was optimized, with the robot operating between 4.3 and 5.8 W, ensuring efficient power usage. Environmental impact assessments revealed a 40 % reduction in pesticide use and a 44.7 % decrease in worker exposure.
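The paper's network architecture and training script are not reproduced on this page; the following is a minimal sketch of the kind of setup the abstract describes, assuming a small Keras CNN, an 80:20 split of the 13,191-image dataset, 10 output classes, and 50 training epochs. The input resolution (128×128), layer widths, and the `tomato_leaves/` directory layout are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch only: a small CNN for 10 tomato-leaf classes,
# trained with an 80:20 split and 50 epochs as reported in the abstract.
# Input size, layer widths, and the dataset path are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SIZE = (128, 128)   # assumed input resolution
NUM_CLASSES = 10        # 10 tomato leaf classes reported in the abstract

# image_dataset_from_directory handles the 80:20 split via validation_split
train_ds = tf.keras.utils.image_dataset_from_directory(
    "tomato_leaves/", validation_split=0.2, subset="training",
    seed=42, image_size=IMG_SIZE, batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "tomato_leaves/", validation_split=0.2, subset="validation",
    seed=42, image_size=IMG_SIZE, batch_size=32)

model = models.Sequential([
    layers.Rescaling(1.0 / 255, input_shape=IMG_SIZE + (3,)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(128, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# 50 epochs matches the training budget reported for the 95 % precision result
model.fit(train_ds, validation_data=val_ds, epochs=50)
```

For on-robot inference, a model like this would typically be converted to an optimized runtime (e.g. TensorFlow Lite) for the Raspberry Pi; the abstract reports 0.015 s per image classification, but the paper's exact deployment path is not shown here.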
Sensitivity analysis quantified performance variation under changing weather, lighting, and environmental disturbances: navigation accuracy dropped from 88 % at 10 °C to 75 % at 40 °C, and classification accuracy decreased from 92.5 % at 10 °C to 77.3 % at 40 °C under 1200 lux illumination and 18 m/s wind. Energy consumption rose from 11.2 Wh at 10 °C to 18.6 Wh at 40 °C. While the system's applicability to other crops is limited by the training dataset, it can be generalized to other plant species with appropriate retraining on larger datasets. Overall, these results demonstrate the technical potential of the developed mobile robot for autonomous, real-time disease management, offering a reliable and efficient solution for precision agriculture with considerable economic, environmental, and operational benefits.
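To make the reported classification metrics concrete (accuracy, precision, recall, and F1 above 91 %, a confusion matrix with minimal misclassification, and per-class AUC above 0.90), here is a hedged sketch of how such an evaluation is commonly computed with scikit-learn; the variable names and the held-out `test_ds` split are assumptions, not the authors' code.

```python
# Illustrative evaluation sketch: computes the metric types reported in the
# abstract (accuracy, precision, recall, F1, confusion matrix, one-vs-rest AUC).
# `model` and `test_ds` are assumed to come from a training script like the one above.
import numpy as np
from sklearn.metrics import (accuracy_score, classification_report,
                             confusion_matrix, roc_auc_score)

y_true, y_prob = [], []
for images, labels in test_ds:                 # assumed tf.data test split
    y_prob.append(model.predict(images, verbose=0))
    y_true.append(labels.numpy())
y_true = np.concatenate(y_true)
y_prob = np.concatenate(y_prob)
y_pred = y_prob.argmax(axis=1)

print("Accuracy:", accuracy_score(y_true, y_pred))
# Per-class precision, recall, and F1 for the 10 leaf classes
print(classification_report(y_true, y_pred, digits=3))
print("Confusion matrix:\n", confusion_matrix(y_true, y_pred))
# One-vs-rest macro AUC; the abstract reports AUC values exceeding 0.90
print("Macro AUC:", roc_auc_score(y_true, y_prob, multi_class="ovr"))
```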