Smart agricultural technology: Latest Articles

Optimizing the estimation of cotton leaf SPAD and LAI values via UAV multispectral imagery and LASSO regression
IF 6.3
Smart agricultural technology Pub Date : 2025-06-16 DOI: 10.1016/j.atech.2025.101098
Chunli Wang, Xiao Zhang, Nannan Zhang, Huaying Guo, Hongxin Wu, Xuanzhang Wang
{"title":"Optimizing the estimation of cotton leaf SPAD and LAI values via UAV multispectral imagery and LASSO regression","authors":"Chunli Wang ,&nbsp;Xiao Zhang ,&nbsp;Nannan Zhang ,&nbsp;Huaying Guo ,&nbsp;Hongxin Wu ,&nbsp;Xuanzhang Wang","doi":"10.1016/j.atech.2025.101098","DOIUrl":"10.1016/j.atech.2025.101098","url":null,"abstract":"<div><div>Cotton (Gossypium spp.) is a vital economic crop both globally and particularly in Xinjiang, China, where its growth status is closely linked to chlorophyll content and leaf area index (LAI). Chlorophyll content is commonly measured using the soil plant analysis development (SPAD) value. This study employed multispectral remote sensing data collected by a DJI Mavic 3 M unmanned aerial vehicle (UAV) to investigate the spectral responses of canopy SPAD and LAI in cotton fields affected by Verticillium wilt in southern Xinjiang. SPAD was strongly negatively correlated with the red band (r = –0.784) and positively correlated with the red-edge (REG) band (r = 0.498), while LAI showed the strongest correlation with the near-infrared (NIR) band (r = 0.673) and a moderate correlation with the REG band (r = 0.435). Among various vegetation indices (VIs), the photochemical reflectance ratio (PPR) exhibited the highest correlation with SPAD (r = 0.84), and the excess green (EXG) index showed the strongest correlation with LAI (r = 0.92). Inversion accuracy was highest during the boll stage. The least squares method (LSM) achieved coefficient of determination (<span><math><msup><mrow><mi>R</mi></mrow><mn>2</mn></msup></math></span>) values of 0.58 for SPAD and 0.57 for LAI, while combining VIs and texture features through least absolute shrinkage and selection operator (LASSO) regression improved accuracy to 0.711 and 0.751, respectively. Comparative modeling using LSM, grey wolf optimizer–support vector machine (GWO-SVM), and ant colony optimization–random forest (ACO-RF) revealed that ACO-RF consistently outperformed the other models, particularly in capturing nonlinear relationships and multi-feature interactions. The ACO-RF model achieved <span><math><msup><mrow><mi>R</mi></mrow><mn>2</mn></msup></math></span> values of 0.898 (root mean square error, RMSE = 1.523) for SPAD and 0.893 (RMSE = 3.308) for LAI. These findings demonstrate that integrating spectral and textural features with optimized machine learning models can significantly enhance the accuracy, scalability, and cost-effectiveness of Verticillium wilt monitoring in cotton, thereby supporting early disease detection and precision agricultural management.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"12 ","pages":"Article 101098"},"PeriodicalIF":6.3,"publicationDate":"2025-06-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144320802","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
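A minimal sketch, not the authors' code, of the variable-selection step the abstract describes: LASSO regression over combined vegetation-index and texture features to estimate SPAD. The file name and feature columns are illustrative assumptions.

```python
import pandas as pd
from sklearn.linear_model import LassoCV
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import r2_score, mean_squared_error

# Hypothetical table: one row per plot with UAV-derived predictors and measured SPAD.
df = pd.read_csv("cotton_plots.csv")          # assumed file
features = ["PPR", "EXG", "red", "red_edge", "nir",
            "glcm_contrast", "glcm_entropy"]  # assumed VI + texture columns
X = StandardScaler().fit_transform(df[features].values)
y = df["SPAD"].values

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

# LassoCV picks the shrinkage strength by cross-validation and zeroes out
# uninformative features, which is the variable-selection role LASSO plays here.
model = LassoCV(cv=5).fit(X_tr, y_tr)
pred = model.predict(X_te)
print("selected features:", [f for f, c in zip(features, model.coef_) if c != 0])
print("R2 =", r2_score(y_te, pred), "RMSE =", mean_squared_error(y_te, pred) ** 0.5)
```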
Cattle weight estimation using 2D side-view images and estimated depth-based 3D modeling
IF 6.3
Smart agricultural technology Pub Date : 2025-06-16 DOI: 10.1016/j.atech.2025.101099
Guilherme Botazzo Rozendo, Maichol Dadi, Annalisa Franco, Alessandra Lumini
{"title":"Cattle weight estimation using 2D side-view images and estimated depth-based 3D modeling","authors":"Guilherme Botazzo Rozendo,&nbsp;Maichol Dadi,&nbsp;Annalisa Franco,&nbsp;Alessandra Lumini","doi":"10.1016/j.atech.2025.101099","DOIUrl":"10.1016/j.atech.2025.101099","url":null,"abstract":"<div><div>Weighing cattle is a vital practice in livestock farming, as it provides essential data for effective herd management. Recent advancements in computer vision and machine learning have led to the development of non-invasive techniques that estimate cattle weight using images. These methods offer a way to gauge weight without needing physical scales, which helps reduce stress on the animals and minimizes labor-intensive processes. However, existing techniques often rely on dorsal (top-down) views of cattle, which can be difficult to capture in practice. In this study, we propose a method for estimating cattle weight using only side-view images, which are more accessible and easier to obtain. We utilized public datasets to extract a comprehensive set of features, including body measurements and shape descriptors from the images. We also employed advanced techniques such as cattle pose estimation, segmentation, monocular depth estimation, and point cloud generation to derive volume and area features. Our goal was to extract as much relevant information as possible from the images to accurately predict the cattle's weight. We used both linear and non-linear regression models to forecast weight based on the extracted features. Our results indicate that the proposed method can accurately predict cattle weight from side-view images, providing valuable insights for livestock management and monitoring.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"12 ","pages":"Article 101099"},"PeriodicalIF":6.3,"publicationDate":"2025-06-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144306457","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
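A minimal sketch, under assumed data, of the final regression step the abstract describes: predicting weight from image-derived body measurements with both a linear and a non-linear model. The column names and file are invented for illustration, not the paper's pipeline.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

df = pd.read_csv("cattle_features.csv")       # assumed per-animal feature table
X = df[["body_length", "wither_height", "side_area", "est_volume"]]  # assumed columns
y = df["weight_kg"]

# Compare a linear baseline with a non-linear regressor, mirroring the abstract's
# use of both model families on the extracted features.
for name, model in [("linear", LinearRegression()),
                    ("random forest", RandomForestRegressor(n_estimators=300, random_state=0))]:
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean CV R2 = {r2:.3f}")
```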
Dairy GPT: Empowering dairy farmers to interact with numerical databases through natural language conversations
IF 6.3
Smart agricultural technology Pub Date : 2025-06-16 DOI: 10.1016/j.atech.2025.101097
Danillo Gontijo, Douglas Rolins Santana, Gustavo de Assis Costa, Victor E. Cabrera, Eduardo Noronha de Andrade Freitas
{"title":"Dairy GPT: Empowering dairy farmers to interact with numerical databases through natural language conversations","authors":"Danillo Gontijo ,&nbsp;Douglas Rolins Santana ,&nbsp;Gustavo de Assis Costa ,&nbsp;Victor E. Cabrera ,&nbsp;Eduardo Noronha de Andrade Freitas","doi":"10.1016/j.atech.2025.101097","DOIUrl":"10.1016/j.atech.2025.101097","url":null,"abstract":"<div><div>Large language models (LLMs), like GPT-4, have revolutionized artificial intelligence by enabling intuitive text and voice interactions, simplifying complex tasks, and democratizing access to AI-driven tools. However, one of their primary limitations lies in their ability to effectively handle interactions with strictly numerical data. This limitation has led to innovative solutions such as Retrieval Augmented Generation (RAG) and Natural Language to SQL (NL2SQL), which enhance their applicability in data-intensive domains. This study investigated the possibility and feasibility of using large language models (LLMs) to allow natural language interactions of dairy farmers with purely numerical databases. To support the proposed study, we constructed a dataset consisting of 25,925 daily milk production records from 85 cows, derived from real data collected at the University of Wisconsin-Madison Agricultural Research Station. Three analyses pipelines were proposed to assess the effectiveness of LLMs handling of numerical databases: Prompt Engineering (zero-shot), Retrieval-Augmented Generation (RAG), and NL2SQL with Decomposition, evaluated using a set of quantitative (5) and qualitative (5) questions. Based on these 10 questions, the NL2SQL with Decomposition achieved 80% accuracy for quantitative questions and the Zero-shot achieved 100% for qualitative questions. These results demonstrate the potential of LLMs to enhance data utilization in dairy farming. Future work will focus on refining the proposed methods and expanding their applicability to other livestock purposes.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"12 ","pages":"Article 101097"},"PeriodicalIF":6.3,"publicationDate":"2025-06-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144313167","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
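A minimal sketch of the NL2SQL idea the abstract describes: an LLM translates a farmer's question into SQL, which is then executed against the numerical database. The `ask_llm` function is a placeholder for whatever LLM endpoint is used, and the schema and table names are assumptions, not the authors' actual database.

```python
import sqlite3

SCHEMA = "milk_yield(cow_id INTEGER, record_date TEXT, liters REAL)"

def ask_llm(prompt: str) -> str:
    """Placeholder for a call to an LLM API; should return a SQL string."""
    raise NotImplementedError("plug in your LLM client here")

def answer_question(question: str, db_path: str = "dairy.db"):
    # Step 1: prompt the LLM to produce SQL constrained to the known schema.
    prompt = (f"Database schema: {SCHEMA}\n"
              f"Write a single SQLite query answering: {question}\n"
              f"Return only SQL.")
    sql = ask_llm(prompt)
    # Step 2: execute the generated SQL against the numerical database.
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute(sql).fetchall()
    # Step 3: the rows could be passed back to the LLM to phrase a natural-language answer.
    return rows

# Example usage (requires a real LLM client and database):
# answer_question("Which cow produced the most milk in May 2024?")
```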
MIF-YOLO: An Enhanced YOLO with Multi-Source Image Fusion for Autonomous Dead Chicken Detection
IF 6.3
Smart agricultural technology Pub Date : 2025-06-16 DOI: 10.1016/j.atech.2025.101104
Jiapan Li, Yan Zhang, Yong Zhang, Hongwei Shi, Xianfang Song, Chao Peng
{"title":"MIF-YOLO: An Enhanced YOLO with Multi-Source Image Fusion for Autonomous Dead Chicken Detection","authors":"Jiapan Li ,&nbsp;Yan Zhang ,&nbsp;Yong Zhang ,&nbsp;Hongwei Shi ,&nbsp;Xianfang Song ,&nbsp;Chao Peng","doi":"10.1016/j.atech.2025.101104","DOIUrl":"10.1016/j.atech.2025.101104","url":null,"abstract":"<div><div>Addressing the paucity of automated systems for the detection of dead poultry within large-scale agricultural settings, characterized by the onerous and time-consuming manual inspection processes, this study introduces an enhanced YOLO algorithm with multi-source image fusion (MIF-YOLO) for the autonomous identification of dead chicken. The proposed approach commences with the application of progressive illumination-ware fusion (PIA Fusion) to amalgamate thermal infrared and visible-light imagery, thereby accentuating the salient features indicative of dead chickens and counteracting the impact of non-uniform illumination. To address the challenge of feature extraction under conditions of significant occlusion, the model incorporates the Rep-DCNv3 module, which augments the backbone network's capacity to discern subtle characteristics of dead chickens. Additionally, an exponential moving average (EMA) attention mechanism is strategically embedded within the YOLO algorithm architecture's neck region to bolster the model's ability to discern targets under low-light scenarios, enhancing both its accuracy rates and adaptability. The loss function of the model is refined through the implementation of Modified Partial Distance-IoU (MPDIoU), facilitating a more nuanced evaluation of the overlap of objects. Validated against a dataset comprising caged white-feathered chickens procured from a farm in Suqian, Jiangsu Province, the empirical findings indicate that the model attains a precision of 99.2% and a [email protected] metric of 98.9%, surpassing the performance of existing cutting-edge methodologies. The innovative detection methodology for dead chickens ensures not only rapid detection, but also marked improvement in detection fidelity, aligning with the demands of real-time monitoring in operational agricultural contexts.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"12 ","pages":"Article 101104"},"PeriodicalIF":6.3,"publicationDate":"2025-06-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144291144","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
From the attitude towards digitalisation in agriculture to the acceptance of future agricultural technologies
IF 6.3
Smart agricultural technology Pub Date : 2025-06-16 DOI: 10.1016/j.atech.2025.101095
Linda Reissig, Michael Siegrist
{"title":"From the attitude towards digitalisation in agriculture to the acceptance of future agricultural technologies","authors":"Linda Reissig ,&nbsp;Michael Siegrist","doi":"10.1016/j.atech.2025.101095","DOIUrl":"10.1016/j.atech.2025.101095","url":null,"abstract":"<div><div>As agriculture undergoes a transformative phase propelled by technological innovations, the integration of digital farming tools is becoming increasingly prevalent in animal husbandry and arable farming. In animal husbandry, virtual fences, as a precision livestock farming technology, have emerged as a promising solution for managing livestock. Similarly, the rapid evolution of technology in arable farming continues to redefine the landscape of agricultural practices, with autonomous systems such as fully autonomous hacking robots playing a pivotal role. However, a limited understanding of the social and psychological factors and perceptions of risks and benefits influence farmers’ acceptance of these novel digital farming technologies in Switzerland. This study aimed to provide insights into farmers’ attitudes towards digital agriculture and to help understand the acceptance of digital farming technologies in the future. It sought to explore the drivers of and barriers to the acceptance of digital farming tools among family farm managers. A survey was conducted among 939 Swiss arable and animal farmers, and multiple linear regression models were used to determine robust predictors of attitude and acceptance of virtual fence technology and fully autonomous hacking robots. The results indicate that attitudes towards digital farming technologies depend on farmers’ characteristics, such as age, technology interaction affinity, education level, and digital competence, alongside their financial situation. Acceptance of virtual fences was influenced by farm characteristics (size, workforce), farmers’ perceptions (attitudes towards digital farming), digital competence, and risk–benefit perceptions. In contrast, the acceptance of fully autonomous hacking robots was influenced by farmers’ perceptions, education level, and risk–benefit perceptions. The results emphasise that the acceptance of specific technologies is driven by application-specific reasons and depends on risk–benefit assessments. The findings shed light on decision-making in digital agriculture for small-scale farms, highlighting the need for digital skill development and support for farmers in risk–benefit assessment. Recommendations include peer networks and research settings, such as model farms, to support farmers in adopting digital farming technologies.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"12 ","pages":"Article 101095"},"PeriodicalIF":6.3,"publicationDate":"2025-06-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144320801","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
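A minimal sketch of the kind of multiple linear regression the study describes: regressing stated acceptance of a technology on farmer and farm characteristics. The variable names and data file are illustrative assumptions, not the survey's actual coding.

```python
import pandas as pd
import statsmodels.formula.api as smf

survey = pd.read_csv("farmer_survey.csv")   # assumed: one row per respondent

# Acceptance of virtual fences modelled on candidate predictors named in the abstract:
# attitude, digital competence, farm size, workforce, and perceived risks/benefits.
model = smf.ols(
    "acceptance_virtual_fence ~ attitude_digital_farming + digital_competence"
    " + farm_size_ha + workforce + perceived_benefit + perceived_risk + age",
    data=survey,
).fit()

print(model.summary())   # coefficients, p-values, and R^2 identify robust predictors
```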
Detection of insect-damaged sunflower seeds using near-infrared hyperspectral imaging and machine learning
IF 6.3
Smart agricultural technology Pub Date : 2025-06-14 DOI: 10.1016/j.atech.2025.101110
Bright Mensah, Jarrad Prasifka, Brent Hulke, Ewumbua Monono, Xin Sun
{"title":"Detection of insect-damaged sunflower seeds using near-infrared hyperspectral imaging and machine learning","authors":"Bright Mensah ,&nbsp;Jarrad Prasifka ,&nbsp;Brent Hulke ,&nbsp;Ewumbua Monono ,&nbsp;Xin Sun","doi":"10.1016/j.atech.2025.101110","DOIUrl":"10.1016/j.atech.2025.101110","url":null,"abstract":"<div><div>Insect damage can significantly affect seed germination rates and overall seed quality, resulting in notable economic losses. Detecting insect-damaged seeds is vital for upholding food safety standards and satisfying consumer expectations in confectionery sunflower markets. To tackle this issue, this study explores the potential of hyperspectral imaging combined with machine learning to accurately classify damaged and undamaged sunflower seeds. Spectral data were acquired and preprocessed using principal component analysis (PCA) to reduce dimensionality while retaining essential spectral information. Machine learning techniques, specifically multilayer perceptron (MLP), support vector machine (SVM), random forest (RF), light gradient boosting machine (LGBM), extreme gradient boosting (XGB), gradient boosting (GB), and partial least squares discriminant analysis (PLS-DA), were trained and evaluated based on the spectral features. The results showed that MLP achieved the highest classification performance with an accuracy of 0.91 and an F1-score of 0.91, followed by SVM with an accuracy of 0.89 and an F1-score of 0.89. LGBM and RF also performed well, both achieving an accuracy of 0.88 and an F1-score of 0.88, while XGB and GB recorded accuracies of 0.85 and 0.86, respectively. In contrast, PLS-DA demonstrated the lowest performance, with accuracy falling to 0.65 and an F1-score of 0.64. These findings underscore the effectiveness of machine learning in utilizing hyperspectral data for precise seed quality assessment. Its integration into the seed sorting process can enhance seed inspections, food safety, damage scoring for scientific investigations, and ensure that only high-quality seeds are chosen for planting.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"12 ","pages":"Article 101110"},"PeriodicalIF":6.3,"publicationDate":"2025-06-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144306796","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
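A minimal sketch, under assumed data rather than the authors' dataset, of the pipeline the abstract outlines: PCA on per-seed spectra followed by an MLP classifier separating insect-damaged from undamaged sunflower seeds.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Assumed arrays: X holds one mean NIR spectrum per seed, y is 0 = undamaged, 1 = damaged.
X = np.load("seed_spectra.npy")     # shape (n_seeds, n_wavelengths), hypothetical file
y = np.load("seed_labels.npy")

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, stratify=y, random_state=0)

clf = make_pipeline(
    StandardScaler(),
    PCA(n_components=20),                         # keep the leading spectral components
    MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=0),
)
clf.fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))   # accuracy and F1 per class
```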
Collaboration of hyperspectral data and generative adversarial networks for improved nitrogen nutrition diagnosis and nitrogen requirement estimation in winter wheat
IF 6.3
Smart agricultural technology Pub Date : 2025-06-14 DOI: 10.1016/j.atech.2025.101112
Changchun Li, Bo Yang, Guangsheng Zhang, Le Xu, Yinghua Jiao, Taiyi Cai, Longfei Zhou
{"title":"Collaboration of hyperspectral data and generative adversarial networks for improved nitrogen nutrition diagnosis and nitrogen requirement estimation in winter wheat","authors":"Changchun Li ,&nbsp;Bo Yang ,&nbsp;Guangsheng Zhang ,&nbsp;Le Xu ,&nbsp;Yinghua Jiao ,&nbsp;Taiyi Cai ,&nbsp;Longfei Zhou","doi":"10.1016/j.atech.2025.101112","DOIUrl":"10.1016/j.atech.2025.101112","url":null,"abstract":"<div><div>Nitrogen nutrient diagnosis and nitrogen requirement (NR) estimation are key components for accurate and precise crop fertilizer management. Owing to the limitations of field data collection, the number of measured samples is usually small and unbalanced, resulting in errors in model estimation accuracy. Challenges remain in accurately obtaining nitrogen nutrient diagnostics and estimating nitrogen fertilizer requirements. In this study, hyperspectral canopy data and measured data of winter wheat were acquired. The generative adversarial networks (GAN) was used to generate the winter wheat canopy hyperspectral dataset, and the original dataset, the GAN balanced dataset and the GAN hybrid dataset were constructed. The nitrogen concentration and biomass were estimated by combining partial least squares regression (PLSR), Gaussian process regression (GPR) and one-dimensional convolutional neural network (1D-CNN) models. Based on the estimation results, the nitrogen nutrient index (NNI) was calculated via the critical nitrogen dilution curve, and the NR estimation model was established with integrated consideration of days after sowing, nitrogen recovery efficiency, and the NNI. The results show that the GAN can meet the extension needs of small sample datasets, and the quality of the generated data is reliable enough at epoch=2000 and performs best when the amount of generated data reaches two times the original amount of data. Among the three models, GPR had the highest accuracy in estimating nitrogen concentration, whereas the 1D-CNN performed best in estimating biomass. Compared with the original dataset (R<sup>2</sup> = 0.88 for nitrogen concentration and R<sup>2</sup> = 0.82 for biomass), the R<sup>2</sup> values for nitrogen concentration and biomass estimation were 0.94 and 0.91 on the GAN balanced dataset and 0.97 and 0.92 on the GAN hybrid dataset. Compared with those of the original dataset, the R<sup>2</sup> values for estimating nitrogen concentration and biomass improved by 10.2 % and 12.1 %, respectively. R<sup>2</sup>=0.90 and RMSE=0.11 for the estimation of the winter wheat NNI based on nitrogen concentration and biomass were further obtained, with R<sup>2</sup>=0.80 and RMSE=22.86 for the estimation of NR. This study demonstrated the potential of the GAN application in hyperspectral data generation, which provides strong support for the precise management of nitrogen fertilization in winter wheat.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"12 ","pages":"Article 101112"},"PeriodicalIF":6.3,"publicationDate":"2025-06-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144320798","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
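A minimal sketch of one step the abstract reports on: Gaussian process regression mapping canopy hyperspectral reflectance to nitrogen concentration. The data files and preprocessing are assumptions, and the GAN-based augmentation stage is not shown here.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

X = np.load("canopy_spectra.npy")        # (n_samples, n_bands), hypothetical file
y = np.load("nitrogen_concentration.npy")

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

gpr = make_pipeline(
    StandardScaler(),
    PCA(n_components=15),                # compress the spectra before the GP
    GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True),
)
gpr.fit(X_tr, y_tr)
pred = gpr.predict(X_te)
print("R2 =", r2_score(y_te, pred), "RMSE =", mean_squared_error(y_te, pred) ** 0.5)
```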
Computer simulation of pesticide deposition and drift by conventional and intelligent air-assisted sprayers in apple orchards
IF 6.3
Smart agricultural technology Pub Date : 2025-06-14 DOI: 10.1016/j.atech.2025.101111
Matthew J. Herkins, Se-Woon Hong, Lingying Zhao, Heping Zhu, Hongyoung Jeon
{"title":"Computer simulation of pesticide deposition and drift by conventional and intelligent air-assisted sprayers in apple orchards","authors":"Matthew J. Herkins ,&nbsp;Se-Woon Hong ,&nbsp;Lingying Zhao ,&nbsp;Heping Zhu ,&nbsp;Hongyoung Jeon","doi":"10.1016/j.atech.2025.101111","DOIUrl":"10.1016/j.atech.2025.101111","url":null,"abstract":"<div><div>To enhance pesticide sprayer performance, a laser guided variable-rate spraying system was developed to efficiently deliver spray outputs to a variety of plants across different growth stages. However, evaluating the performance of this system using field experiments is challenging and resource intensive. The Simulation of Air-Assisted Sprayers (SAAS), a cost-effective and user-friendly computational fluid dynamics (CFD) simulation program, was used to evaluate pesticide deposition and drift in apple orchards under varying spray and weather conditions. Results indicated that pesticide deposition efficiency was highest when very fine droplets were applied to apple trees under low wind speeds (&lt; 1.79 m <em>s</em><sup>−1</sup>), low relative humidity (&lt; 30 %), and high ambient air temperatures (&gt; 20 °C). Ground deposition losses were highest when spray nozzles producing very coarse droplets were applied at low travel speeds (0.89 m <em>s</em><sup>−1</sup>), low wind speeds, and high ambient air temperatures. Airborne drift was highest when a sprayer discharged very fine droplets under low travel speeds, high wind speeds (&gt; 3.58 m/s), high relative humidity (&gt; 70 %), and low ambient air temperatures (10 °C). The simulation results showed the intelligent sprayer was expected to reduce pesticide usage by 38.4 % to 51.9 % and improve average spray efficiency by 1.6 to 3.3 times depending on the nozzle type compared to a conventional spray system. This research demonstrated the SAAS could be used to optimize pesticide applications, improve spray efficiency, and reduce environmental impact.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"12 ","pages":"Article 101111"},"PeriodicalIF":6.3,"publicationDate":"2025-06-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144330433","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Regression-based artificial intelligence length and weight estimation for sustainable prawn aquaculture
IF 6.3
Smart agricultural technology Pub Date : 2025-06-13 DOI: 10.1016/j.atech.2025.101089
Najeebah Az-Zahra Tashim, Tiong Hoo Lim, Wafiq Zariful, Pengcheng Liu
{"title":"Regression-based artificial intelligence length and weight estimation for sustainable prawn aquaculture","authors":"Najeebah Az-Zahra Tashim ,&nbsp;Tiong Hoo Lim ,&nbsp;Wafiq Zariful ,&nbsp;Pengcheng Liu","doi":"10.1016/j.atech.2025.101089","DOIUrl":"10.1016/j.atech.2025.101089","url":null,"abstract":"<div><div>The need for sustainable aquaculture practices has become very important to ensure sufficient production in addressing the increasing global demand for seafood. In this context, accurately assessing the size and weight of prawns is pivotal for efficient farming and resource utilization, allowing farmers to make informed decisions and productions. The integration of advanced AI algorithms into aquaculture practices holds great promise for fostering sustainability, thereby enhancing the overall productivity and resilience of prawn farming in the face of growing global challenges. This paper compares different length-weight regression techniques to estimate the weight of prawns and proposed a novel Regression-based Artificial Intelligence Biomass Estimation (RAIBE) systems for prawn aquaculture. RAIBE leverages deep learning and regression models to estimate the weight from images captured from a mobile device. The proposed methodology employs YOLOv8 with Segmentation for precise prawn identification. A unique biomarker is applied to estimate the length information. Subsequently, a polynomial based regression model is selected to correlate prawn length with actual weights, utilising comprehensive datasets collected under real-world farm conditions. As many different regression approaches have been proposed for the length-weight relationship, four commonly used approaches have been analysed. Results from extensive statistical analysis revealed that the modified polynomial regression with correction factor provides the best weight prediction. The integration of these techniques has equipped farmers with a reliable tool for predicting prawn weight during the sampling process, thereby minimizing stress on the prawns, and optimizing the segregation process.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"12 ","pages":"Article 101089"},"PeriodicalIF":6.3,"publicationDate":"2025-06-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144272012","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
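A minimal sketch of the length-to-weight step the abstract describes: a polynomial regression of prawn weight on estimated body length with a simple multiplicative correction factor. The polynomial degree, the form of the correction, and the data are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

# Hypothetical calibration data: lengths in cm (from the image pipeline) and scale weights in g.
length_cm = np.array([8.1, 9.0, 9.7, 10.4, 11.2, 12.0, 12.8, 13.5, 14.3, 15.1])
weight_g  = np.array([6.5, 8.9, 11.2, 13.8, 17.3, 21.0, 25.6, 30.1, 36.0, 42.4])

# Fit a second-degree polynomial W(L) by least squares.
coeffs = np.polyfit(length_cm, weight_g, deg=2)
predict = np.poly1d(coeffs)

# Illustrative correction factor: rescale so predictions are unbiased on the
# calibration set (mean observed weight over mean predicted weight).
cf = weight_g.mean() / predict(length_cm).mean()

new_length = 13.0
print(f"estimated weight: {cf * predict(new_length):.1f} g")
```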
Recognition of rice seedling counts in UAV remote sensing images via the YOLO algorithm
IF 6.3
Smart agricultural technology Pub Date : 2025-06-13 DOI: 10.1016/j.atech.2025.101107
Shengxi Chen, Wenli Li, Du Chen, Zhao Xie, Song Zhang, Fulang Cen, Xiaoyun Huang, Lei Tu, Zhenran Gao
{"title":"Recognition of rice seedling counts in UAV remote sensing images via the YOLO algorithm","authors":"Shengxi Chen ,&nbsp;Wenli Li ,&nbsp;Du Chen ,&nbsp;Zhao Xie ,&nbsp;Song Zhang ,&nbsp;Fulang Cen ,&nbsp;Xiaoyun Huang ,&nbsp;Lei Tu ,&nbsp;Zhenran Gao","doi":"10.1016/j.atech.2025.101107","DOIUrl":"10.1016/j.atech.2025.101107","url":null,"abstract":"<div><div>Accurate identification of rice seedling numbers is essential for breeding, replanting, and yield prediction. Traditional manual counting methods are inefficient and prone to error. The integration of high-resolution drone imagery with the feature extraction capabilities of deep learning offers a novel approach for identifying rice seedlings using advanced computational techniques. This study employed drone-captured images of rice seedlings taken at heights of 12 m and 15 m from two locations—Anshun City and Qianxinan Prefecture in Guizhou Province—to construct datasets containing 100, 150, and 200 images, and compared the performance of YOLOv8n, YOLOv9t, and YOLOv10n in recognizing rice seedling numbers. The results show that at a flight height of 12 m and using a dataset of 200 images, model performance was optimal, achieving mAP@50 values of 0.964, 0.936, and 0.944 for YOLOv8n, YOLOv9t, and YOLOv10n, respectively. Among these, YOLOv8n demonstrated the highest prediction accuracy for rice seedlings, with an R<sup>2</sup> value of 0.889, RMSE of 3.225, and rRMSE of 0.032. This research demonstrates that the combination of drone imagery and deep learning models enables effective large-scale counting of rice seedlings, presenting an innovative approach to rice phenotypic analysis.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"12 ","pages":"Article 101107"},"PeriodicalIF":6.3,"publicationDate":"2025-06-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144313876","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
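A minimal sketch of counting seedlings with a trained YOLOv8n model via the ultralytics package, as the abstract's detection-and-count workflow suggests. The weights file and image path are placeholders; the authors' trained model and drone imagery are not assumed to be available.

```python
from ultralytics import YOLO

model = YOLO("rice_seedling_yolov8n.pt")          # hypothetical fine-tuned weights

# Run detection on one UAV image tile and count the predicted seedling boxes.
results = model.predict("plot_tile.jpg", conf=0.25, imgsz=640)
seedling_count = len(results[0].boxes)
print(f"predicted seedlings in tile: {seedling_count}")
```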