Smart agricultural technology: latest articles

Integrating colorimetry and machine learning: an approach for optimizing fruit selection in Licania tomentosa seedling production
IF 6.3
Smart agricultural technology Pub Date: 2025-06-10 DOI: 10.1016/j.atech.2025.101091
Douglas Martins Santana, Júlio César Altizani-Júnior, Francisco Guilhien Gomes-Junior, Durval Dourado-Neto, Renan Caldas Umburanas, Klaus Reichardt, Fábio Oliveira Diniz
Abstract: Licania tomentosa is a widely distributed species in Brazil, commonly used in urban landscaping and environmental restoration. Despite its potential, understanding of the relationship between fruit maturation and seedling quality remains limited. This study aimed to evaluate the relationship between maturation stages (classified by epicarp coloration) and seedling performance through RGB colorimetric analysis, fruit morphometry, and the application of machine learning algorithms. Fruits were collected from mother trees and classified into four color stages based on the Munsell color chart. Digital images were analyzed to extract RGB values and morphometric parameters of the fruits using ImageJ software. Subsequently, seedling emergence, biometric attributes, biomass accumulation, and the Dickson Quality Index (DQI) were evaluated. Yellow-Red fruits produced seedlings with higher emergence rates, greater shoot and root biomass accumulation, and higher DQI values, indicating greater seedling vigor. In contrast, Greenish Green-Yellow fruits resulted in less vigorous seedlings. The Red band was the main indicator of changes in the fruits. Morphometric parameters alone were insufficient to discriminate the maturation stages. Linear Discriminant Analysis correctly classified 90.48 % of the fruits according to their maturation stage. The integration of colorimetric data with machine learning proved to be an effective, non-destructive, and low-cost approach for optimizing seed selection. To enhance the predictive accuracy of the model, it is recommended to expand the dataset under natural conditions and explore alternative color systems and complementary fruit traits.
Volume 12, Article 101091. Citations: 0
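A minimal sketch of the classification step described in the abstract: Linear Discriminant Analysis applied to per-fruit colour and morphometric features. The feature layout, the synthetic data, and the cross-validation setup are illustrative assumptions, not the authors' dataset or pipeline.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Toy features per fruit: mean R, G, B plus two morphometric traits (length, width in mm).
X = rng.uniform([60, 80, 20, 15.0, 10.0], [220, 200, 90, 30.0, 20.0], size=(84, 5))
# Four maturation stages encoded as integer labels (e.g. Green ... Red).
y = rng.integers(0, 4, size=84)

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, y, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.3f}")
```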
Monitoring runtime input data distribution for the safety of the intended functionality in perception systems
IF 6.3
Smart agricultural technology Pub Date: 2025-06-10 DOI: 10.1016/j.atech.2025.101102
Changjoo Lee, Simon Schätzle, Stefan Andreas Lang, Timo Oksanen
Abstract: Safe and reliable environmental perception is essential for the highly automated or even autonomous operation of agricultural machines. However, developing a functionally safe and reliable AI-powered perception system is challenging, especially in safety-critical applications, due to the nature of AI technologies. This article is motivated by the need to constrain an AI-powered perception system to work within a predefined safe envelope, ensuring that the acceptable behaviour of the AI technology is maintained. The acceptable behaviour of AI technology is assessed based on the distribution of its training data. However, verifying the model's performance becomes challenging when it encounters unseen, out-of-distribution input data. This article proposes an image quality safety model (IQSM) that estimates the confidence in the safety of the intended functionality for a runtime input image within a perception system, even when faced with unseen out-of-distribution runtime input images. If the confidence level falls below the "minimum performance threshold" required for safe operation, the IQSM detects that the intended functionality is unsafe for performing highly automated operations. On a test set of 1,592 images comprising clear, dirty, foggy, raindrop-covered, and over-exposed images, the IQSM classified images as safe or unsafe with accuracies ranging from 97.6 % to 98.9 %. This demonstrates its ability to effectively detect acceptable runtime input images and ensure the acceptable behaviour of an intended function in real-world scenarios. The IQSM can prevent malfunctions in perception systems, such as failing to detect obstacles due to adverse weather conditions. It facilitates the integration of fail-safe architectures across various applications, including highly automated agricultural machinery, thereby contributing to the safety and reliability of the intended functionality.
Volume 12, Article 101102. Citations: 0
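As a rough illustration of the safe-envelope gating described above, the sketch below wraps an arbitrary confidence estimator and compares its output against a minimum performance threshold. The scorer, the threshold value, and the fallback behaviour are assumptions, not the IQSM itself.

```python
from dataclasses import dataclass
from typing import Callable

import numpy as np


@dataclass
class SafetyGate:
    confidence_fn: Callable[[np.ndarray], float]  # maps a runtime input image to a confidence in [0, 1]
    min_performance_threshold: float = 0.9        # assumed value for illustration

    def is_safe(self, image: np.ndarray) -> bool:
        """Return True only if the estimated confidence meets the safety threshold."""
        return self.confidence_fn(image) >= self.min_performance_threshold


# Dummy scorer: normalised mean brightness stands in for a learned image-quality model.
gate = SafetyGate(confidence_fn=lambda img: float(img.mean()) / 255.0)
frame = np.full((480, 640, 3), 200, dtype=np.uint8)
print("safe" if gate.is_safe(frame) else "unsafe: hand over to a fail-safe behaviour")
```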
Smart chickpea seed classification system based on machine vision and deep learning along with a review of challenges, limitations and future trends
IF 6.3
Smart agricultural technology Pub Date: 2025-06-10 DOI: 10.1016/j.atech.2025.101093
Banu Ulu, Hamdi Ozaktan, Necati Çetin, Ahmad Jahanbakhshi, Burak Ulu, Satı Uzun, Oğuzhan Uzun
Abstract: The classification of chickpea seeds is essential for agricultural productivity, food processing, and consumer choice. Conventional classification techniques are typically subjective, labor-intensive, and prone to errors. Employing novel methodologies for categorizing nutritionally rich and economically significant items offers efficiency and practicality. This study classified 13 chickpea cultivars, cultivated under similar ecological conditions without chemical inputs, utilizing deep learning (DL) and image processing techniques. ResNet-18 and ConvNeXt_Tiny were employed to evaluate the classification efficacy of two pre-trained convolutional neural network (CNN) models. The chickpea seed images were labeled and resized to dimensions of 403 × 365 pixels, with each variety organized in folders and submitted as inputs to the DL models. Experimental findings demonstrated that the ConvNeXt_Tiny and ResNet-18 classifier models successfully classified chickpea varieties into 13 distinct classes, achieving 88.27 % and 80.10 % accuracy, respectively. Furthermore, ConvNeXt_Tiny demonstrated greater sensitivity than ResNet-18 (88.43 %) while achieving superior specificity (99.02 %), accuracy (88.68 %), and F-measure (88.33 %). DL models, particularly ConvNeXt_Tiny, have significant potential for the automated classification of chickpea seeds. This technology may be included in embedded applications as a rapid and precise sorting system for the agriculture and food processing industries.
Volume 12, Article 101093. Citations: 0
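A hedged sketch of the transfer-learning setup implied by the abstract: a pre-trained ConvNeXt_Tiny whose classification head is replaced for 13 chickpea classes. The input resolution, weights choice, and dummy batch are assumptions; the paper's exact training configuration is not reproduced here.

```python
import torch
import torch.nn as nn
from torchvision import models

num_classes = 13  # one class per chickpea cultivar

# Load an ImageNet-pretrained ConvNeXt_Tiny and swap the final linear layer.
model = models.convnext_tiny(weights=models.ConvNeXt_Tiny_Weights.DEFAULT)
model.classifier[2] = nn.Linear(model.classifier[2].in_features, num_classes)

# Forward pass on a dummy batch; real images would be resized and normalised first.
dummy = torch.randn(4, 3, 224, 224)
print(model(dummy).shape)  # torch.Size([4, 13])
```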
Single-wavelength near-infrared imaging and machine learning for detecting Queensland fruit fly damage in cherries
IF 6.3
Smart agricultural technology Pub Date: 2025-06-09 DOI: 10.1016/j.atech.2025.101090
Maryam Yazdani, Dong Bao, Jun Zhou, Andy Wang, Rieks D. van Klinken
Abstract: Efficient detection and removal of infested fruits can be a valuable tool for reducing the spread of quarantine pests through trade. Automated grading technologies offer non-destructive solutions for detecting fruit fly infestations, though current optical methods face challenges due to either high computational demands (hyperspectral) or low specificity (multi- and single-spectral). In this study, we introduced a novel imaging method and machine learning approach to detect Queensland fruit fly (Qfly) infestations in fresh cherries, at both the image and fruit levels. Using hyperspectral imaging (HSI), we identified a wavelength of 730 nm within the visible to near-infrared (NIR) spectrum as most effective for distinguishing Qfly oviposition damage from natural pigmentation and mechanical damage. A library of 1771 high-resolution, single-wavelength NIR images was created, with Qfly oviposition sites manually labelled for model training. We proposed a novel machine learning approach called the Bounding Box Histogram Fusion Classifier (BBHFC). This method transforms spot-level predictions of Qfly oviposition damage, generated by a trained object detection model, into histogram-based feature vectors. These vectors are then used for efficient and accurate image-level infestation classification. BBHFC achieved high precision, recall, and F1 scores (all > 0.93), demonstrating the effectiveness of the approach. The proposed BBHFC outperformed traditional visual inspection, achieving over 89 % accuracy, compared to 60 % for manual detection. Integrating advanced imaging techniques into grading systems can significantly enhance biosecurity in horticultural industries by detecting and removing infested fruit. This technology could also supplement existing, costly, visual inspections of traded fruit that governments are required to undertake.
Volume 12, Article 101090. Citations: 0
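The sketch below illustrates the histogram-fusion idea in the abstract: per-image detection confidences are binned into a fixed-length histogram that feeds a conventional classifier for the image-level decision. The bin count, the random-forest choice, and the synthetic data are assumptions, not the published BBHFC implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier


def detections_to_histogram(confidences, bins=10):
    """Turn one image's spot-level detection confidences into a histogram feature vector."""
    hist, _ = np.histogram(confidences, bins=bins, range=(0.0, 1.0))
    return hist.astype(float)


rng = np.random.default_rng(1)
X, y = [], []
for _ in range(200):
    infested = rng.random() < 0.5
    # Toy assumption: infested fruit yields more, and higher-confidence, oviposition detections.
    n_spots = rng.integers(3, 15) if infested else rng.integers(0, 4)
    conf = rng.beta(5, 2, n_spots) if infested else rng.beta(2, 5, n_spots)
    X.append(detections_to_histogram(conf))
    y.append(int(infested))

clf = RandomForestClassifier(random_state=0).fit(np.array(X), np.array(y))
print("training accuracy:", clf.score(np.array(X), np.array(y)))
```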
High-throughput 3D reconstruction of plants and its application to plant feature segmentation
IF 6.3
Smart agricultural technology Pub Date: 2025-06-09 DOI: 10.1016/j.atech.2025.101063
Dong Thanh Pham, Takaya Iwakuma, Zaifei Jiang, Santy Sar, Daisuke Yasutake, Takenori Ozaki, Masaharu Koga, Yasumaru Hirai, Muneshi Mitsuoka, Muhammad Rashed Al Mamun, Koichi Nomura, Hien Bich Vo, Takashi Okayasu
Abstract: This study explores the development and evaluation of high-throughput 3D plant reconstruction and 2D feature segmentation methods for plant phenotyping. A robotic system was employed to collect datasets of individual cucumber plants, utilizing automated mechanisms for efficient, high-throughput data acquisition. Three 3D reconstruction methods, Instant-NGP, Nerfacto, and 3D Gaussian Splatting, were adopted and compared in terms of rendering quality and speed. Among them, 3D Gaussian Splatting performed best, achieving a PSNR of 25, an SSIM of 0.84, an LPIPS of 0.20, and a rendering speed of 6.39 FPS. Novel-viewpoint renderings and depth maps further demonstrated its ability to generate accurate and photo-realistic representations of plants. Additionally, rendered images were utilized for training YOLO models to segment plant features into two classes: leaf and fruit. The YOLOv11s model achieved the highest F1 score (0.932), balancing speed and accuracy. Ultra-view renderings and segmentation results provided valuable insights into plant morphology, including leaf and fruit counts, paving the way for scalable, automated phenotyping applications. This study highlights the potential of integrating 3D Gaussian Splatting with advanced segmentation models for precise and efficient plant phenotyping.
Volume 12, Article 101063. Citations: 0
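For the segmentation stage described above, a minimal usage sketch with the ultralytics package is shown below; the checkpoint name, image path, and class mapping are placeholders, and the authors' fine-tuned leaf/fruit model is not reproduced.

```python
from ultralytics import YOLO

# Pretrained segmentation checkpoint as a stand-in; the study fine-tunes on leaf/fruit labels.
model = YOLO("yolo11s-seg.pt")

# Run inference on a novel-view rendering produced by the 3D reconstruction step.
results = model("rendered_plant_view.png")
for r in results:
    if r.masks is not None:
        print(f"{len(r.masks)} segmented instances")
    print(r.boxes.cls.tolist())  # predicted class ids (mapped to leaf / fruit after fine-tuning)
```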
Detecting copper-based fungicides in vineyards by means of hyperspectral imagery
IF 6.3
Smart agricultural technology Pub Date: 2025-06-08 DOI: 10.1016/j.atech.2025.101049
Ramón Sánchez, Carlos Rad, Carlos Cambra, Rocío Barros, Álvaro Herrero
Abstract: Fungal diseases affecting vineyards are commonly controlled using copper-based fungicides. Inaccurate application of these products usually leads to accumulations of copper in the soil. The use of spectral images in vineyards is a tool that can help in the correct application of fungicides, improving their efficiency and effectiveness. To do that, a method is required to identify the copper deposited on the vine leaves. To bridge this gap, the present work compares images, obtained with a hyperspectral camera (Pika L, Resonon), of vineyard leaves (Vitis vinifera L.) cv. Tempranillo treated with two copper-based products, Cuprantol duo (Syngenta, CH) and Cuprocol (Syngenta, CH). Leaves treated with both products and the corresponding blanks treated with distilled water were compared. Most of the differences between treatments and products are found in the near-infrared region (700–740 nm), the green region (550 nm), and the 620–640 nm region. Maximal spectral variation appeared in the 711.16–758.27 nm range for products in wet status, which allowed the areas treated with copper-based products to be differentiated from the blanks without product. We conclude that hyperspectral imagery can detect leaf areas treated with copper-based fungicides immediately after application (wet treatment).
Volume 12, Article 101049. Citations: 0
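A simple numpy sketch of the kind of band-wise comparison reported above: average spectra of treated and control leaf pixels are compared inside the 711.16–758.27 nm window. The cube shape, wavelength grid, and synthetic reflectance values are assumptions for illustration only.

```python
import numpy as np

wavelengths = np.linspace(400, 1000, 300)          # nm; hypothetical sampling of the sensor
cube_treated = np.random.rand(50, 50, 300) * 0.6   # (rows, cols, bands) reflectance, leaf treated with product
cube_control = np.random.rand(50, 50, 300) * 0.5   # blank treated with distilled water

mean_treated = cube_treated.reshape(-1, 300).mean(axis=0)
mean_control = cube_control.reshape(-1, 300).mean(axis=0)

window = (wavelengths >= 711.16) & (wavelengths <= 758.27)
delta = np.abs(mean_treated[window] - mean_control[window]).mean()
print(f"Mean absolute reflectance difference in 711-758 nm: {delta:.3f}")
```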
Data-driven model predictive control for irrigation management in agricultural greenhouses under CO2 enrichment
IF 6.3
Smart agricultural technology Pub Date: 2025-06-07 DOI: 10.1016/j.atech.2025.101071
Ikhlas Ghiat, Rajesh Govindan, Tareq Al-Ansari
Abstract: The quest for sustainable agricultural practices has intensified, particularly in hyper-arid regions characterised by severe water scarcity, where ensuring food security is a critical concern. This growing interest is driven by the need for agricultural self-sufficiency in the face of geopolitical uncertainties, population growth, climate change, and the need to reduce dependency on food imports. This work delves into innovative data-driven strategies tailored for agriculture in such challenging environments, with a specific focus on harnessing the potential of CO2 enrichment in enhancing resource efficiency by promoting plant growth and optimizing input use in closed greenhouse systems. In this study, a data-driven model predictive control (MPC) approach is employed within a greenhouse environment to optimise irrigation scheduling. The key focus is utilising the extreme gradient boosting (XGBoost) model to predict dynamic transpiration rates, considering the intricate interplay of microclimate conditions and physiological variations with transpiration. The XGBoost model incorporates microclimate data encompassing solar radiation, inside temperature, inside relative humidity, and inside CO2 concentration, along with vegetation indices derived from hyperspectral imaging measurements, including NDVI, WBI, and PRI, as predictive variables. The model demonstrated high predictive accuracy, achieving an R² of 97.1 % and an RMSE of 0.417 mmol/m²/s for transpiration estimation. The XGBoost model is then incorporated into the MPC framework to control irrigation while maintaining optimal soil moisture levels. This integration is used to manage irrigation strategies under two distinct CO2 enrichment regimes: 400 ppm and 1000 ppm. Findings of this study highlight that the MPC-based irrigation control results in water savings of 42.2 % over the course of one week of projections compared to the existing irrigation schedule under varying CO2 concentrations. Furthermore, when applying the MPC model under different CO2 enrichment regimes, results reveal a 34 % reduction with CO2 enrichment at 1000 ppm relative to 400 ppm. This research underscores the potential of MPC in closed greenhouse environments, emphasising the advantages of advanced predictive modeling, data integration, and continuous rolling optimisation for achieving optimal irrigation control. It also highlights the capacity of CO2 enrichment in closed agricultural greenhouses, particularly in regions with high solar radiation, as an effective practice for reducing water consumption.
Volume 12, Article 101071. Citations: 0
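A hedged sketch of the transpiration-prediction component: an XGBoost regressor mapping the microclimate variables and vegetation indices named in the abstract to transpiration. The synthetic data, value ranges, and hyperparameters are illustrative assumptions; the MPC loop itself is not shown.

```python
import numpy as np
import pandas as pd
from xgboost import XGBRegressor

rng = np.random.default_rng(0)
n = 500
X = pd.DataFrame({
    "solar_radiation": rng.uniform(0, 900, n),     # W/m2
    "inside_temperature": rng.uniform(18, 35, n),  # degrees C
    "inside_rh": rng.uniform(40, 90, n),           # %
    "inside_co2": rng.uniform(400, 1000, n),       # ppm
    "ndvi": rng.uniform(0.3, 0.9, n),
    "wbi": rng.uniform(0.9, 1.1, n),
    "pri": rng.uniform(-0.1, 0.1, n),
})
# Synthetic target (mmol/m2/s) loosely driven by radiation and humidity deficit.
y = 0.01 * X["solar_radiation"] + 0.05 * (100 - X["inside_rh"]) + rng.normal(0, 0.3, n)

model = XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.05)
model.fit(X, y)
print("R^2 on training data:", model.score(X, y))
```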
Predicting footpad lesion scores of Dutch broiler flocks using routinely collected data
IF 6.3
Smart agricultural technology Pub Date: 2025-06-07 DOI: 10.1016/j.atech.2025.101080
Yara Slegers, Miel Hostens, Sjaak de Wit, Arjan Stegeman, Dan B Jensen
Abstract: Footpad lesions (FPL) are a prevalent welfare concern in broilers, influenced by various factors such as farm management practices and season. In the Netherlands, FPL scores are monitored at slaughter and linked to corrective measures. Early prediction of FPL scores could enable timely interventions. This study investigated the potential of routinely collected data to predict FPL scores at slaughter. Data from 592 broiler houses, each with at least 30 consecutive flocks, across 190 farms were included. The ability of various models to predict FPL scores above or below the threshold of 80 was compared. These models included univariate dynamic linear models (DLMs); multivariate DLMs using weather data of the first week of the production cycle; and random forest models using previous flock scores or DLM output, first-week weather variables, and current and previous flock and farm characteristics. Incorporation of DLM output in the random forest model provided the numerically highest performance, although this was not significantly better than the random forest model with raw previous flock scores. This model achieved an ROC AUC of 0.70, with the best threshold yielding a sensitivity of 74.4% and specificity of 60.2%. Previous flock FPL was the most important predictor, followed by the fraction of birds thinned, flock size difference between previous and current flock, and outside humidity. These findings highlight the value of weather variables in predicting FPL scores. Future research should explore additional factors which could explain within-house variation, such as indoor climate and feed changes, to improve predictive accuracy.
Volume 12, Article 101080. Citations: 0
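A minimal sketch of the final classification step: a random forest predicting whether a flock's footpad-lesion score exceeds the 80-point threshold from the predictors the abstract highlights (previous flock score, fraction thinned, flock-size difference, outside humidity). The synthetic data and the toy labelling rule are assumptions; the study also feeds DLM output as an input.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 1000
prev_score = rng.uniform(0, 200, n)     # previous flock FPL score
frac_thinned = rng.uniform(0, 0.35, n)  # fraction of birds thinned
size_diff = rng.normal(0, 2000, n)      # flock-size difference vs. previous flock
humidity = rng.uniform(50, 100, n)      # outside relative humidity (%)
X = np.column_stack([prev_score, frac_thinned, size_diff, humidity])
y = (prev_score + 0.3 * humidity + rng.normal(0, 30, n) > 110).astype(int)  # toy proxy for "score > 80"

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("ROC AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```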
Toward resource-efficient UAV systems: Deep learning model compression for onboard-ready weed detection in UAV imagery
IF 6.3
Smart agricultural technology Pub Date: 2025-06-07 DOI: 10.1016/j.atech.2025.101086
Alwaseela Abdalla, Masara M.A. Mohammed, Oluwatola Adedeji, Peter Dotray, Wenxuan Guo
Abstract: Convolutional neural networks (CNNs) have emerged as a powerful tool for detecting weeds in unmanned aerial vehicle (UAV) imagery. However, the deployment of deep learning models in UAVs intended for onboard processing is hindered by their large size, which demands significant computational resources. To address these challenges, we applied two compression techniques, pruning and quantization, both independently and in combination, to assess their effectiveness in reducing model size with minimal accuracy loss. Using the DeepLab v3+ model with various backbones, including ResNet-18, ResNet-50, MobileNet-v2, and Xception, the study systematically investigates these techniques in the context of in-field weed detection. We developed a pruning technique in which less important parameters are reinitialized iteratively before final pruning. The importance of each parameter was evaluated using a Taylor expansion-based criterion. We fine-tuned the pruned model on the UAV dataset to mitigate any performance loss resulting from pruning. We then applied quantization to reduce the precision of numerical parameters and improve computational efficiency. Pruning alone reduced model size by ∼55–65 % with only a 1–3 % accuracy drop, while quantization alone achieved ∼35–50 % reduction with slightly higher degradation. Combined, they yielded up to 75 % model size reduction while maintaining over 90 % accuracy, particularly for ResNet-50 and Xception, which were more resilient than MobileNet-v2. Compressed models were tested on NVIDIA Jetson AGX Xavier and Jetson AGX Orin, achieving 40.7 % and 52.3 % latency reductions, respectively. These results confirm the models' efficiency and readiness for edge deployment, and support future deployment of efficient, site-specific weed detection on UAVs. Future research will focus on deploying the compressed models in actual field operations to evaluate their real-time performance and practical effectiveness.
Volume 12, Article 101086. Citations: 0
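A hedged sketch of combining pruning and quantization on a toy CNN with standard PyTorch utilities. L1 magnitude pruning stands in for the paper's Taylor-expansion importance criterion, and dynamic int8 quantization of linear layers stands in for their quantization scheme; the network, sparsity level, and dtype are assumptions only.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Toy stand-in network (not DeepLab v3+): small conv backbone plus a classification head.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(16, 2),  # e.g. weed vs. crop
)

# Prune 50 % of the convolution weights by L1 magnitude, then make the sparsity permanent.
conv = model[0]
prune.l1_unstructured(conv, name="weight", amount=0.5)
prune.remove(conv, "weight")

# Dynamically quantize the linear layer to int8 for cheaper inference on edge hardware.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)
print(quantized(torch.randn(1, 3, 64, 64)).shape)  # torch.Size([1, 2])
```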
Crop row centerline extraction method based on regional feature point clustering
IF 6.3
Smart agricultural technology Pub Date: 2025-06-06 DOI: 10.1016/j.atech.2025.101070
Baofeng Ji, Hang Wang, Chunhong Dong, Song Chen, Hongtao Chen, Fazhan Tao, Ji Zhang, Huitao Fan
Abstract: To address the low accuracy and high processing time of traditional machine vision methods for crop row detection, caused by varying crop types, growth backgrounds, and changes in the number of crop rows, this study proposes a crop row centerline extraction method based on regional feature point clustering. First, the 2G-R-B feature factor and an optimal adaptive threshold segmentation method are used to separate crops from the background. Then, a morphological operation of closing followed by opening is applied to filter out weeds and crop edges, reducing interference. Next, Harris corner detection is used to extract crop feature points, and these feature points are clustered using the DBSCAN algorithm, with each crop row marked in a different color. Subsequently, the image is divided into horizontal strips, and the midpoint of each cluster of feature points in each strip is extracted. Finally, the least squares method is used to fit the midpoints of each crop row and obtain the crop row centerline. Experimental results show that this method achieves high accuracy in extracting crop row centerlines for four crops: sweet potato, mint, corn, and peanut. The average row recognition rate reached 98.33%, and the average error angle was 0.95°. Additionally, the average processing time per image was 136.14 ms, saving an average of 69.22 ms compared to the traditional Hough transform and 87.33 ms compared to the skeleton extraction method. In summary, this method provides a more robust solution for crop rows affected by various factors in field environments.
Volume 12, Article 101070. Citations: 0
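A condensed sketch of the pipeline described above: the 2G-R-B (excess-green) index with Otsu thresholding, closing-then-opening morphology, Harris-based corner features, DBSCAN clustering into rows, and a least-squares line fit per cluster. Parameter values are assumptions and the image path is a placeholder.

```python
import cv2
import numpy as np
from sklearn.cluster import DBSCAN

img = cv2.imread("field_image.png").astype(np.float32)  # placeholder path
b, g, r = cv2.split(img)
exg = cv2.normalize(2 * g - r - b, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
_, mask = cv2.threshold(exg, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

kernel = np.ones((5, 5), np.uint8)
mask = cv2.morphologyEx(cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel), cv2.MORPH_OPEN, kernel)

# Harris-based feature points on the vegetation mask.
corners = cv2.goodFeaturesToTrack(mask, maxCorners=500, qualityLevel=0.01,
                                  minDistance=5, useHarrisDetector=True)
pts = corners.reshape(-1, 2)

# Cluster feature points into crop rows, then fit one centerline per cluster.
labels = DBSCAN(eps=40, min_samples=5).fit_predict(pts)
for row_id in sorted(set(labels) - {-1}):
    x, y = pts[labels == row_id].T
    slope, intercept = np.polyfit(y, x, 1)  # x = slope * y + intercept along the row direction
    print(f"row {row_id}: x = {slope:.2f} * y + {intercept:.2f}")
```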