Latest Articles in Computers and Electronics in Agriculture

Multimodal sow lameness classification method integrating spatiotemporal features
IF 7.7, CAS Q1 (Agricultural and Forestry Sciences)
Computers and Electronics in Agriculture, Volume 235, Article 110363. Pub Date: 2025-04-10. DOI: 10.1016/j.compag.2025.110363
Zekai Chen, Qiong Huang, Sumin Zhang, Xuhong Tian, Ling Yin
Abstract: Sow lameness may result in reduced swine farming efficiency, decreased production performance, and diminished economic profitability of farms. Therefore, the automatic and accurate prediction of sow lameness is crucial for enhancing health monitoring systems and improving farm profitability. This paper introduces a Contour and Skeleton Fusion-based Multimodal Network (CSF-MN) for classifying the severity of sow lameness. The Contour Feature Classification (CFC) module within the CSF-MN framework employs the FYOLOv8s-Seg algorithm to extract contour features of sows, which are then processed by the SimTSM algorithm to train a contour classification model. Meanwhile, the Skeleton Feature Classification (SFC) module uses the FYOLOv8s-Pose algorithm for skeletal feature extraction and integrates the NLPoseC3D algorithm to train a skeletal classification model. To detect lameness, prediction confidences from both models are dynamically fused using a weight assignment mechanism. To validate the effectiveness of the method, 321 of the 459 available samples were randomly selected for K-fold cross-validation: the 321 samples were divided into 10 subsets, with 8 subsets used for training and the remaining 2 for validation in each iteration. This process was repeated 10 times, and the results of all 10 iterations were used to evaluate performance. Experimental results demonstrate that the CSF-MN network achieved an accuracy of 94.2%, a specificity of 96.8%, and a sensitivity of 97.4% on the test set. These results indicate that the proposed approach effectively integrates spatiotemporal features from sow gait, enabling an accurate assessment of lameness severity.
Citations: 0
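The confidence-fusion step described in the abstract above can be illustrated with a minimal sketch. The weighting rule below (each branch weighted by its own peak confidence) and the three severity classes are assumptions for illustration only; the paper's exact dynamic weight assignment is not given in the abstract, and the function name is hypothetical.

```python
import numpy as np

def fuse_confidences(contour_conf, skeleton_conf, eps=1e-8):
    """Fuse per-class confidence vectors from the contour (CFC) and skeleton (SFC)
    branches into a single lameness-severity prediction."""
    c = np.asarray(contour_conf, dtype=float)
    s = np.asarray(skeleton_conf, dtype=float)
    # Assumed weighting rule: each branch weighted by its own peak confidence.
    total = c.max() + s.max() + eps
    w_c, w_s = c.max() / total, s.max() / total
    fused = w_c * c + w_s * s
    fused /= fused.sum()
    return fused, int(np.argmax(fused))

# Hypothetical confidences over three severity classes (none / mild / severe).
probs, label = fuse_confidences([0.10, 0.70, 0.20], [0.05, 0.55, 0.40])
print(probs, label)   # fused distribution and predicted class index
```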
Orchard sweet cherry color distribution estimation from wireless sensor networks and video-based fruit detection
IF 7.7, CAS Q1 (Agricultural and Forestry Sciences)
Computers and Electronics in Agriculture, Volume 235, Article 110334. Pub Date: 2025-04-10. DOI: 10.1016/j.compag.2025.110334
Luis Cossio-Montefinale, Cristóbal Quiñinao, Rodrigo Verschae
Abstract: Sweet cherry cultivation faces challenges related to the diminishing availability of critical resources, especially human labor. Recent advancements in sensors, automation, robotics, artificial intelligence, the Internet of Things, and other technologies are significantly impacting the sweet cherry industry. These technologies are driving the transition toward more sustainable and intelligent production and improving post-harvest handling and processing of sweet cherries. This article proposes a novel methodology for assessing the development of cherries from an agroclimatic wireless sensor network and video-based fruit detection and tracking. Climate data are collected using a few climate sensors per field and transmitted through a LoRaWAN network, and their temporal and spatial dynamics at the field level are modeled using a k-Nearest Neighbors regressor. RGB video data are captured along rows, fruit detection is achieved using deep learning-based methods, and fruit tracking is performed using Kalman filters. Based on these technologies, we present two ways of assessing the maturity distribution: (i) estimating it from video data, and (ii) estimating it from the agroclimatic wireless sensor network data only. The methods were validated using data from five productive fields, obtaining a mean squared error of only 5% in maturity estimation from agroclimatic data alone. Thus, we show that it is possible to estimate the maturity distribution solely from an agroclimatic wireless sensor network, with the system calibrated using computer vision techniques.
Citations: 0
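The field-level modelling of climate dynamics with a k-Nearest Neighbors regressor can be sketched as follows. The sensor layout, feature choice (easting, northing, day of season), temperatures, and the value of k are made-up illustrative assumptions, not the paper's configuration.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

# Hypothetical readings from a sparse in-field sensor network:
# easting (m), northing (m), day of season, and measured air temperature (deg C).
X = np.array([
    [0.0,   0.0,   10],
    [120.0, 40.0,  10],
    [60.0,  200.0, 10],
    [0.0,   0.0,   11],
    [120.0, 40.0,  11],
    [60.0,  200.0, 11],
])
y = np.array([14.2, 15.1, 13.8, 15.0, 15.9, 14.4])

# k-NN regression interpolates the temperature field in space and time.
knn = KNeighborsRegressor(n_neighbors=3, weights="distance").fit(X, y)

# Estimate the temperature at an unsensed tree row on day 11.
print(knn.predict([[80.0, 120.0, 11]]))
```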
NorBlueNet: Hyperspectral imaging-based hybrid CNN-transformer model for non-destructive SSC analysis in Norwegian wild blueberries
IF 7.7, CAS Q1 (Agricultural and Forestry Sciences)
Computers and Electronics in Agriculture, Volume 235, Article 110340. Pub Date: 2025-04-10. DOI: 10.1016/j.compag.2025.110340
Shanthini K.S., Sudhish N. George, Athul Chandran O.V., Jinumol K.M., Keerthana P., Jobin Francis, Sony George
Abstract: Soluble solids content (SSC) is a vital parameter in blueberries, reflecting the concentration of dissolved sugars (primarily fructose and glucose) and directly influencing the fruit's sweetness, flavour, and ripeness. As part of this study, Norwegian wild blueberries were carefully hand-picked from a forest in Norway and subsequently imaged using a hyperspectral camera to capture their detailed spectral characteristics. This study introduces NorBlueNet, a hybrid CNN-transformer architecture, for accurately predicting SSC in wild blueberries through hyperspectral imaging and deep learning. The architecture combines CNN layers for local feature extraction and spatial hierarchy representation with transformer layers that capture global relationships and long-range dependencies. This hybrid approach combines the computational advantages of CNNs with the attention mechanisms of transformers, achieving enhanced accuracy while maintaining computational efficiency. A comprehensive evaluation is conducted by comparing the proposed model with two additional deep learning models on the custom dataset. The results indicate that NorBlueNet achieves the highest prediction accuracy, with R² = 0.98, RMSE = 0.0136, and RPD = 9.3759, demonstrating its superior performance. To foster community engagement and collaboration and to facilitate re-implementation of our work, we have made our code available at https://github.com/NorBlueNet.
Citations: 0
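A minimal sketch of what a hybrid CNN-transformer regressor over a spectral vector can look like is given below. The layer sizes, depths, number of spectral bands, and pooling choices are assumptions for illustration and are not NorBlueNet's actual design; see the linked repository for the authors' implementation.

```python
import torch
import torch.nn as nn

class HybridSpectralRegressor(nn.Module):
    """Toy CNN-transformer hybrid for SSC regression from a hyperspectral
    reflectance vector. All dimensions are illustrative assumptions."""

    def __init__(self, n_bands: int = 200, d_model: int = 64):
        super().__init__()
        # CNN front end: local spectral feature extraction.
        self.cnn = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(32, d_model, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # Transformer back end: long-range dependencies across the spectrum.
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, 1)   # regress SSC

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.cnn(x.unsqueeze(1))             # (B, d_model, n_bands // 2)
        z = self.transformer(z.transpose(1, 2))  # (B, seq, d_model)
        return self.head(z.mean(dim=1)).squeeze(-1)

model = HybridSpectralRegressor()
print(model(torch.randn(4, 200)).shape)          # torch.Size([4])
```

As a side note, if RPD follows its conventional definition (standard deviation of the reference SSC values divided by the RMSE), the reported RPD of 9.3759 with RMSE of 0.0136 implies a reference spread of roughly 0.13.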
Enhancing green guava segmentation with texture consistency loss and reverse attention mechanism under complex background
IF 7.7, CAS Q1 (Agricultural and Forestry Sciences)
Computers and Electronics in Agriculture, Volume 235, Article 110308. Pub Date: 2025-04-10. DOI: 10.1016/j.compag.2025.110308
Junshu Wang, Yang Guo, Xinjie Tan, Yubin Lan, Yuxing Han
Abstract: Unlike most crops, which can be distinctly separated from the background by color contrast, green crops such as guava often share color characteristics with the surrounding leaves, reducing the accuracy of crop detection and pixel-level prediction in complex natural environments. This work therefore focuses on crop texture and boundaries, proposing a Gabor-based texture consistency loss and a reverse attention module (RAM). A receptive field module (RFM) and a mutual fusion decoder (MFD) are also proposed to enhance the utilization of semantic information. Finally, a stepwise prediction refinement method that uses the deep prediction map as prior information is designed within the model framework, further enhancing inference ability. Ablation experiments verified the effectiveness of the proposed improvements step by step using classification evaluation metrics and visualized the reverse attention. In comparative experiments, the model demonstrated its advantages over state-of-the-art methods such as U-Net, SETR, and SegFormer: the Acc and IoU reached 0.9954 and 0.9420, exceeding those of SegFormer by 0.0087 and 0.0119 respectively, demonstrating its application potential for agricultural robot vision systems. Moreover, to further demonstrate the inference capability of the proposed model, we conducted validation on two open-source building extraction datasets of similar task difficulty, WHU and MBD, and achieved significant results.
Citations: 0
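One way to read "Gabor-based texture consistency loss" is as a penalty that compares Gabor filter responses under the predicted and ground-truth fruit masks. The sketch below follows that reading only; the filter-bank parameters, the masking scheme, and the L1 comparison are assumptions, not the paper's exact formulation.

```python
import math
import torch
import torch.nn.functional as F

def gabor_bank(ksize=15, sigma=3.0, lam=6.0, gamma=0.5, n_orient=4):
    """Fixed bank of real Gabor kernels at n_orient orientations."""
    half = ksize // 2
    ys, xs = torch.meshgrid(
        torch.arange(-half, half + 1, dtype=torch.float32),
        torch.arange(-half, half + 1, dtype=torch.float32),
        indexing="ij",
    )
    kernels = []
    for k in range(n_orient):
        theta = k * math.pi / n_orient
        xr = xs * math.cos(theta) + ys * math.sin(theta)
        yr = -xs * math.sin(theta) + ys * math.cos(theta)
        g = torch.exp(-(xr ** 2 + (gamma * yr) ** 2) / (2 * sigma ** 2))
        kernels.append(g * torch.cos(2 * math.pi * xr / lam))
    return torch.stack(kernels).unsqueeze(1)      # (n_orient, 1, k, k)

def texture_consistency_loss(gray, pred_mask, gt_mask, bank):
    """L1 gap between Gabor texture responses under the predicted and true masks."""
    tex = F.conv2d(gray, bank, padding=bank.shape[-1] // 2)   # (B, n_orient, H, W)
    return F.l1_loss(tex * pred_mask, tex * gt_mask)

bank = gabor_bank()
gray = torch.rand(2, 1, 64, 64)   # grayscale guava crops (hypothetical data)
pred = torch.rand(2, 1, 64, 64)   # soft predicted masks
gt = (torch.rand(2, 1, 64, 64) > 0.5).float()
print(texture_consistency_loss(gray, pred, gt, bank))
```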
A collaborative robotic fleet for yield mapping and manual fruit harvesting assistance
IF 7.7, CAS Q1 (Agricultural and Forestry Sciences)
Computers and Electronics in Agriculture, Volume 235, Article 110351. Pub Date: 2025-04-10. DOI: 10.1016/j.compag.2025.110351
Maria Nuria Conejero, Hector Montes, Jose Maria Bengochea-Guevara, Laura Garrido-Rey, Dionisio Andújar, Angela Ribeiro
Abstract: The increasing demand for agricultural products and rising production costs have intensified labour shortages in the agricultural sector. Manual harvesting remains essential for products with specific designations, such as wine grapes, where automated solutions cannot match human operators' dexterity, speed, and care. Minimizing transportation time is also crucial for preserving produce quality and optimizing efficiency. This study aims to optimize harvesting efficiency and vineyard management through the design and implementation of a mobile robotic platform. The platform combines operator dexterity with robotic assistance, continuously tracking operators as they deposit harvested grapes into a harvesting box carried by a robot, while gathering data for yield map development. Adaptable to various manual fruit-picking processes, the platform can be integrated into a collaborative harvesting assistance fleet. Field experiments conducted at the Bodegas Terras Gauda vineyard (coordinates 41.95, -8.80; O Rosal, Pontevedra, Spain) indicated that operators using robotic assistance reduced their average harvesting time per box by 6 min, increased their total harvested yield by 72.50 kg after two hours (up to 50% more), and reduced manual labour costs by 22.50%. A yield map was developed with high-accuracy GNSS data and an industrial scale mounted on the robot. The map geolocates the collected weights with a maximum variability error of 0.11 kg and successfully expresses grapevine density variability within the same vineyard row. The system preserves produce quality during transportation and significantly reduces physical strain on operators. These results demonstrate the potential of the robotic platform to improve the efficiency of manual harvesting while maintaining high-quality outcomes.
Citations: 0
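The yield map described above can be thought of as geolocated box weights aggregated onto a grid. The sketch below is a generic illustration of that idea with a fixed cell size; the paper's actual gridding or interpolation scheme is not described in the abstract, and the coordinates and weights are made up.

```python
import numpy as np

def yield_map(easting, northing, weight_kg, cell_m=5.0):
    """Sum geolocated harvest-box weights into square grid cells (kg per cell)."""
    e = np.asarray(easting, dtype=float)
    n = np.asarray(northing, dtype=float)
    w = np.asarray(weight_kg, dtype=float)
    e_edges = np.arange(e.min(), e.max() + cell_m, cell_m)
    n_edges = np.arange(n.min(), n.max() + cell_m, cell_m)
    grid, _, _ = np.histogram2d(e, n, bins=[e_edges, n_edges], weights=w)
    return grid, e_edges, n_edges

# Hypothetical GNSS-tagged box weights along one vineyard row.
grid, _, _ = yield_map(
    easting=[512003.1, 512008.4, 512013.9, 512019.2],
    northing=[4644511.0, 4644511.3, 4644510.8, 4644511.1],
    weight_kg=[18.2, 17.5, 19.1, 16.8],
)
print(grid)
```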
Design and experiment of automatic docking system for liquid pesticide replenishment tube of field sprayer
IF 7.7, CAS Q1 (Agricultural and Forestry Sciences)
Computers and Electronics in Agriculture, Volume 235, Article 110374. Pub Date: 2025-04-09. DOI: 10.1016/j.compag.2025.110374
Yuanyuan Gao, Kangyao Feng, Xinhua Wei, Jingkai Liu, Xin Han, Yongyue Hu, Shengwei Lu, Liping Chen
Abstract: Field sprayers are constrained by the capacity of their pesticide tanks. During continuous operation, work must be frequently interrupted for station-based or manually assisted replenishment, significantly reducing operational area and efficiency. To address the low efficiency of manual liquid pesticide replenishment and the insufficient automation of replenishment on large-scale farms, this study develops an automatic docking system for the liquid pesticide replenishment tube (LPRT) of a field sprayer. Kinematic workspace analysis and structural design of the docking robotic arm (DRA) were carried out by establishing its kinematic model. Based on the requirements of replenishment operations, the structure of the pesticide tank filling opening was improved, and a YOLOv5s-based recognition and positioning method for the filling opening is proposed. Additionally, a flexible guidance and automatic retraction mechanism for the LPRT was designed, along with a visual servo-based precise docking algorithm. A CAN-bus communication protocol between the modules was formulated, and an automatic replenishment monitoring system for the field sprayer was developed. The system's performance was tested in field experiments. Visual recognition tests showed that the improved algorithm achieved a recognition confidence above 0.95 under different lighting conditions and can adapt to the complex scene requirements of field operations. Performance tests of filling-opening recognition showed that the algorithm has good applicability and stability within the replenishment operation range, with the best results achieved at a recognition distance of 30–60 cm and a recognition angle above 30°. Further field docking tests showed that the average operation time of the DRA is 35.64 s, the repeated positioning accuracy is 4.12 cm, and the error ratio is 8%, meeting the requirements for automated replenishment between the field sprayer and the replenishment vehicle. This study provides a practical solution and technical reference for improving the automation of pesticide replenishment in field sprayers, thereby enhancing the level of unmanned agricultural applications.
Citations: 0
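The visual-servo docking idea (drive the detected filling opening toward the image centre) can be illustrated with a single proportional control step. The detector output format, gains, image size, and tolerance below are assumptions for illustration; the paper's actual servo law and CAN-bus commands are not given in the abstract.

```python
def servo_step(bbox, img_w=1280, img_h=720, gain=0.0015, tol_px=10):
    """One image-based visual-servo update toward the detected filling opening.

    bbox: (x1, y1, x2, y2) of the filling opening from the detector, in pixels.
    Returns lateral / vertical velocity commands for the docking arm (m/s)
    and whether the opening is centred within tolerance.
    """
    cx = (bbox[0] + bbox[2]) / 2.0
    cy = (bbox[1] + bbox[3]) / 2.0
    err_x = cx - img_w / 2.0      # + means the opening lies right of image centre
    err_y = cy - img_h / 2.0      # + means the opening lies below image centre
    centred = abs(err_x) <= tol_px and abs(err_y) <= tol_px
    return -gain * err_x, -gain * err_y, centred

print(servo_step((700, 300, 780, 380)))   # hypothetical detection
```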
Deciphering the UAV-LiDAR contribution to vegetation classification using interpretable machine learning
IF 7.7, CAS Q1 (Agricultural and Forestry Sciences)
Computers and Electronics in Agriculture, Volume 235, Article 110360. Pub Date: 2025-04-09. DOI: 10.1016/j.compag.2025.110360
Tao Huang, Lei Jiao, Yingfei Bai, Jianwu Yan, Xiping Yang, Jiayu Liu, Wei Liang, Da Luo, Liwei Zhang, Hao Wang, Zhaolin Li, Zongshan Li, Ni Ji, Guangyao Gao
Abstract: Accurate classification of land cover types is a prerequisite for the protection of natural ecosystems. In particular, understanding the spatial distributions of different vegetation types is essential for the effective management, monitoring, and conservation of forest ecosystems. Satellite remote sensing uses rich spectral band information for land cover classification, but it is usually insufficient for high-precision vegetation classification in small areas. In contrast, the structure and vegetation information provided by Aerial LiDAR Scanning (ALS) can significantly increase classification accuracy. To address these limitations, this study utilized high-resolution unmanned aerial vehicle (UAV) imagery and aerial LiDAR point cloud data to improve the accuracy of vegetation classification and plantation observation at the catchment scale. Using Google Earth Engine (GEE), spectral, textural, and LiDAR-derived topographic and vegetation features were extracted and integrated, followed by supervised classification using Random Forest (RF) and Support Vector Machine (SVM) models. This approach enhances the accuracy and efficiency of vegetation classification at the catchment scale. The SVM and RF classification results demonstrated that incorporating LiDAR-derived topographic and vegetation features significantly improved classification accuracy compared with using spectral and textural features only. Specifically, the overall accuracy (OA) of the RF classification increased from 94.37% to 99.36%, while the kappa coefficient improved from 91.08% to 99.01%. Moreover, an impact threshold analysis based on SHAP values showed that canopy height, tree density, and elevation were the top three features driving the improvement in classification performance. This study offers new insights and methods for vegetation classification in complex ecological environments.
Citations: 0
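A compact sketch of the feature-stack classification step is given below using scikit-learn's random forest. For brevity it ranks features with permutation importance rather than the SHAP-value analysis used in the paper, and the feature names and synthetic data are purely illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
features = ["NDVI", "texture_contrast", "canopy_height", "tree_density", "elevation"]

# Synthetic stand-in for the spectral + textural + LiDAR feature stack.
X = rng.normal(size=(600, len(features)))
y = (X[:, 2] + 0.5 * X[:, 3] + 0.2 * rng.normal(size=600) > 0).astype(int)  # 2 classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("overall accuracy:", rf.score(X_te, y_te))

# Rank which features drive the classification (stand-in for the SHAP analysis).
imp = permutation_importance(rf, X_te, y_te, n_repeats=10, random_state=0)
for name, score in sorted(zip(features, imp.importances_mean), key=lambda t: -t[1]):
    print(f"{name:18s} {score:.3f}")
```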
Parameter optimization and numerical analysis of the double disc digging shovel for corn root-soil complex
IF 7.7, CAS Q1 (Agricultural and Forestry Sciences)
Computers and Electronics in Agriculture, Volume 235, Article 110386. Pub Date: 2025-04-09. DOI: 10.1016/j.compag.2025.110386
Hailong Che, Hua Zhou, Yinping Zhang, Zhengzhong Li, Xian Wang, Jiaye Li, Jinli Chen, Hui Zhou
Abstract: Under conservation tillage conditions, corn stubble has become the main obstacle affecting no-till planting. To address this issue, we propose a method of digging followed by processing. A double disc digging shovel (D-D-S) was designed to minimize soil disturbance. By constructing a kinematic model, the mathematical relationship between its motion characteristics and key parameters was revealed. Furthermore, an in-depth analysis of the mechanical behavior of the D-D-S was conducted, and a corresponding mechanical model was established. Parameter optimization tests show that when the horizontal deflection angle of the discs is 31°, the vertical deflection angle is 27°, and the spacing is 0.5 cm, the relative errors of the forward displacement of the corn root-soil complex (CRSC), the longitudinal displacement of the CRSC, the soil disturbance area, and the draught force compared with the predicted values are 8.2%, 9.3%, 9.6%, and 6.8%, respectively. The movement of particles during the digging and throwing process of the D-D-S was analyzed using the discrete element method (DEM). The D-D-S can not only dig the CRSC and lift it to an appropriate height but also backfill a certain amount of the excavated soil. Comparison between the simulation test and the validation test shows that the soil disturbance areas of the two methods differ by only 7.84%, and the relative errors of the forward and longitudinal displacements of the CRSC are 8.52% and 4.05%, respectively; the verification test and simulation results are essentially consistent. The developed D-D-S is of significant importance for the treatment of corn stubble under conservation tillage conditions.
Citations: 0
Appearance quality identification and environmental factors tracing of Lyophyllum decastes for precise environment control using knowledge graph
IF 7.7, CAS Q1 (Agricultural and Forestry Sciences)
Computers and Electronics in Agriculture, Volume 235. Pub Date: 2025-04-09. DOI: 10.1016/j.compag.2025.110369
Kai Zhou, Junyuan Yu, Haotong Shi, Rui Hou, Huarui Wu, Jialin Hou
Abstract: In the factory production of Lyophyllum decastes, inappropriate cultivation environments can lead to appearance quality issues, which in turn affect both yield and quality. However, the appearance characteristics of Lyophyllum decastes influenced by different environmental factors are similar, and the environmental factors that cause appearance quality problems are coupled and complex, so identifying appearance characteristics and tracing the responsible environmental factors is challenging. To address this issue, this paper proposes a multimodal learning network, DCRes-GAT, which integrates an improved residual neural network (DCResNet) and a Graph Attention Network (GAT) to accurately identify the appearance features of Lyophyllum decastes while simultaneously tracing environmental factors and providing control recommendations. First, a knowledge graph based on prior knowledge of quality and environmental factors is constructed, mapping this information to a point space and extracting key features. Next, DCResNet is employed to extract optical features from Lyophyllum decastes images; the receptive field is expanded through dilated convolutions while pixel-level details are preserved, and a Convolutional Block Attention Module (CBAM) is incorporated to identify subtle visual differences. Finally, a dot product operation fuses point-space features with visual features, achieving accurate identification of appearance characteristics and providing control suggestions. Experimental results demonstrate that the DCRes-GAT model performs excellently, with a feature identification accuracy of 99.45%, and can precisely diagnose the key environmental factors that cause appearance quality problems, achieving a diagnostic accuracy of 99.84%. This provides a basis for precise control of the cultivation environment of Lyophyllum decastes.
Citations: 0
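The fusion step, in which knowledge-graph features and visual features are combined by a dot-product operation before classification, can be sketched as below. The embedding sizes, the element-wise product standing in for the "dot product operation", and the class count are assumptions; DCResNet and the GAT themselves are not reproduced here.

```python
import torch
import torch.nn as nn

class FusionHead(nn.Module):
    """Fuse a graph-side embedding (e.g. from a GAT) with an image embedding
    (e.g. from DCResNet) and classify appearance quality. All dimensions and
    the number of classes are illustrative assumptions."""

    def __init__(self, graph_dim=128, visual_dim=512, d=256, n_classes=6):
        super().__init__()
        self.proj_g = nn.Linear(graph_dim, d)    # knowledge-graph side
        self.proj_v = nn.Linear(visual_dim, d)   # image side
        self.classifier = nn.Linear(d, n_classes)

    def forward(self, g_feat, v_feat):
        # Element-wise (Hadamard) product as the dot-product-style fusion.
        fused = self.proj_g(g_feat) * self.proj_v(v_feat)
        return self.classifier(fused)

head = FusionHead()
logits = head(torch.randn(8, 128), torch.randn(8, 512))
print(logits.shape)    # torch.Size([8, 6])
```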
Detection and tracking of agricultural spray droplets using GSConv-enhanced YOLOv5s and DeepSORT
IF 7.7, CAS Q1 (Agricultural and Forestry Sciences)
Computers and Electronics in Agriculture, Volume 235, Article 110353. Pub Date: 2025-04-09. DOI: 10.1016/j.compag.2025.110353
Chen Shengde, Liu Junyu, Xu Xiaojie, Guo Jianzhou, Hu Shiyun, Zhou Zhiyan, Lan Yubin
Abstract: Accurate detection and tracking of agricultural spray droplets are crucial for optimizing spraying efficiency and ensuring uniform pesticide application. This study presents an improved droplet detection and tracking framework that enhances the YOLOv5s model with GSConv, thereby improving droplet detection accuracy. To enhance tracking robustness, DeepSORT was integrated with Kalman filtering, effectively incorporating motion and appearance information. Experimental results demonstrate that the proposed method achieves a detection frame rate of 105 fps and an mAP@0.5 of 0.9184, indicating high precision across different recall rates. Additionally, tracking performance was evaluated against manual droplet counting across five test videos, yielding a mean absolute percentage error (MAPE) of 6.434%, further validating the accuracy and reliability of the system. These results highlight the potential of the proposed approach for real-time monitoring of spray quality, facilitating precise control of spraying parameters, and contributing to advancements in precision agriculture.
Citations: 0
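The tracking evaluation reported above compares automatic droplet counts with manual counts per video via the mean absolute percentage error. A minimal sketch of that computation is given below; the per-video counts are made-up illustrative values, not the study's data.

```python
import numpy as np

def mape(automatic, manual):
    """Mean absolute percentage error (%) between tracked and manual droplet counts."""
    a = np.asarray(automatic, dtype=float)
    m = np.asarray(manual, dtype=float)
    return float(np.mean(np.abs(a - m) / m) * 100.0)

# Hypothetical per-video counts for five test videos.
print(f"MAPE = {mape([188, 242, 157, 301, 215], [200, 230, 150, 320, 210]):.3f} %")
```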