{"title":"Development of a color-based, non-destructive method to determine leaf N levels of Hass avocado under field conditions","authors":"Ángeles Gallegos , Mayra E. Gavito , Heberto Ferreira-Medina , Eloy Pat , Marta Astier , Sergio Rogelio Tinoco-Martínez , Yair Merlín-Uribe , Carlos E. González-Esquivel","doi":"10.1016/j.atech.2025.100895","DOIUrl":"10.1016/j.atech.2025.100895","url":null,"abstract":"<div><div>Excessive fertilization in avocado trees might be avoided by providing producers with affordable supporting tools for constant monitoring of nutrient levels. Leaf color guides have been produced for cereals and might be useful, but they are so far rare for trees because of low variation in color. We investigated the potential of leaf color to indicate N and P levels in avocado leaves to develop a monitoring tool not requiring expensive chemical analyses. We carried out three experimental phases towards the development of a solid, reproducible monitoring tool. In the first phase, we found a good relation between color and chemically-measured N levels, but not P levels. That allowed us to develop a leaf color chart only for N levels. In the second phase, this visual guide was tested using print and mobile app versions. We found that visual identification of N levels by the users was highly variable, subjective, and prone to error regardless of the materials used for detection. The third phase aimed to develop a user-independent evaluation of leaf color to define the leaf N level using leaf pictures. Machine and deep learning algorithms were used to generate, calibrate, and validate models for estimating the N concentration of avocado leaves using digital images captured in field conditions. 
Applying the generated models, we can now develop an automated color-detection and N-level identification tool for mobile applications that will assist avocado producers in the adequate application of nitrogen fertilizers, saving money and reducing N pollution from leaching in orchards.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"11 ","pages":"Article 100895"},"PeriodicalIF":6.3,"publicationDate":"2025-03-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143697316","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
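The first phase of the avocado study relates leaf color to chemically measured N levels. As an illustrative sketch only (the study's actual models were machine and deep learning on field images, and the calibration pairs below are hypothetical), a simple least-squares fit from mean green-channel intensity to leaf N might look like:

```python
# Illustrative sketch: least-squares fit mapping a leaf's mean green-channel
# intensity to leaf N concentration. The numbers are hypothetical calibration
# pairs, NOT data from the study, which used ML/DL models on field images.

def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

# Hypothetical pairs: mean green intensity (0-255) vs. measured leaf N (%)
green = [180, 160, 140, 120, 100]
leaf_n = [1.6, 1.9, 2.2, 2.5, 2.8]

a, b = fit_linear(green, leaf_n)

def predict_n(mean_green):
    """Predicted leaf N (%) from a leaf's mean green intensity."""
    return a * mean_green + b
```

In practice the study found such user-visible color cues too subjective, which is why the final phase replaced visual matching with image-based models.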
{"title":"Enhancing resilience in specialty crop production in a changing climate through smart systems adoption","authors":"Patience Chizoba Mba, Judith Nkechinyere Njoku, Daniel Dooyum Uyeh","doi":"10.1016/j.atech.2025.100897","DOIUrl":"10.1016/j.atech.2025.100897","url":null,"abstract":"<div><div>Climate change critically impacts agriculture, particularly specialty crop production. This paper examines its effects on high-value fruits, nuts, and herbs, emphasizing challenges in rural and developing areas. Specialty crops are susceptible to climatic variations, affecting their yield, quality, and economic viability. Changing temperature and precipitation patterns and increased pest and disease prevalence pose significant threats, leading to potential food security and economic stability issues. Integrating smart systems, such as precision agriculture and sensor technologies, offers viable solutions to mitigate these impacts. These systems enable real-time monitoring and adjustment of environmental conditions, optimizing resource usage and enhancing crop management practices. This paper highlights the importance of building resilience through innovative farming techniques, sustainable practices, and robust research.
Adopting these strategies helps farmers protect their crops against the adverse effects of climate change, ensuring long-term productivity and economic stability.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"11 ","pages":"Article 100897"},"PeriodicalIF":6.3,"publicationDate":"2025-03-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143747501","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Deep learning-driven automated carcass segmentation and composition quantification in live pigs via large-scale CT imaging and its application in genetic analysis of pig breeding","authors":"Haoqi Xu , Zhenyang Zhang , Wei Zhao , Yizheng Zhuang , Xiaoliang Hou , Yongqi He , Jianlan Wang , Jiongtang Bai , Yan Fu , Zhen Wang , Yuchun Pan , Qishan Wang , Zhe Zhang","doi":"10.1016/j.atech.2025.100898","DOIUrl":"10.1016/j.atech.2025.100898","url":null,"abstract":"<div><div>Carcass segmentation and composition (CSC) traits are important indicators for assessing the economic efficiency of pig production. Conventional determination of these traits by slaughter has the drawbacks of high costs and the inability to retain breeding stock. Combining computed tomography (CT) with deep learning enables the non-invasive evaluation of live animal carcass characteristics. In this study, we propose UPPECT, a deep learning-based method for predicting the CSC traits of live pigs. A labeled dataset comprising 300 pigs with a total of 63,708 CT images was constructed for training the nnU-Net model to automatically segment different cuts of pig carcasses. The composition quantification process was optimized using adaptive thresholding and bone filling to achieve accurate prediction of 16 CSC traits. Finally, the genetic parameters of CSC traits obtained by UPPECT were estimated for 4,063 pigs. The segmentation model demonstrated excellent performance, with a PA of 0.9992, an IoU of 0.9910 and an F1-score of 0.9955. We slaughtered and dissected 50 pigs to obtain real CSC trait values as the validation dataset. The results showed that our method improved the accuracy of composition quantification after optimization, and our predictions for all traits were highly correlated with manual dissection results, with correlation coefficients up to 0.9568. The heritability estimates ranged from 0.52 to 0.85 for all traits.
Our study enables non-invasive and precise measurement of CSC traits in live pigs, making an important contribution to breeding practice. A graphical user interface software for UPPECT is freely accessible at <span><span>https://github.com/StMerce/UPPECT</span></span>.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"11 ","pages":"Article 100898"},"PeriodicalIF":6.3,"publicationDate":"2025-03-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143685753","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
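The segmentation quality figures reported above (PA, IoU, F1-score) follow standard definitions over pixel confusion counts. A minimal sketch of how such metrics are computed for a binary mask (flat 0/1 pixel lists here; real CT slices would be 2-D arrays):

```python
# Standard segmentation metrics from pixel confusion counts. Masks are flat
# lists of 0/1 values; the toy data below is illustrative, not study data.

def seg_metrics(pred, truth):
    tp = sum(p == 1 and t == 1 for p, t in zip(pred, truth))  # true positives
    tn = sum(p == 0 and t == 0 for p, t in zip(pred, truth))  # true negatives
    fp = sum(p == 1 and t == 0 for p, t in zip(pred, truth))  # false positives
    fn = sum(p == 0 and t == 1 for p, t in zip(pred, truth))  # false negatives
    pa = (tp + tn) / len(truth)        # pixel accuracy
    iou = tp / (tp + fp + fn)          # intersection over union
    f1 = 2 * tp / (2 * tp + fp + fn)   # F1-score (Dice coefficient)
    return pa, iou, f1

pred  = [1, 1, 0, 0, 1, 0, 1, 1]
truth = [1, 0, 0, 0, 1, 1, 1, 1]
pa, iou, f1 = seg_metrics(pred, truth)
```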
{"title":"Optical leaf area assessment supports chlorophyll estimation from UAV images","authors":"Klára Pokovai , János Mészáros , Kitti Balog , Sándor Koós , Mátyás Árvai , Nándor Fodor","doi":"10.1016/j.atech.2025.100894","DOIUrl":"10.1016/j.atech.2025.100894","url":null,"abstract":"<div><div>Measurement of crop chlorophyll content provides information on expected yield at an early stage of vegetation development. Spectral vegetation indices (VIs) are closely related to crop chlorophyll content and have become common tools for field-scale vegetation monitoring. Thus, the objectives of this study were to validate the correlation of VIs (calculated from drone-based hyperspectral images) with the leaf chlorophyll content (LCC) and canopy chlorophyll content (CCC) of crops grown at three different nitrogen levels at two experimental sites. LCC and leaf area index (LAI) were measured with handheld devices. The effect of vegetation size, expressed as two LAI ranges of canopy, on the magnitude of the resulting correlations was also investigated. Our results showed that for less developed vegetation (LAI < 2.7), all studied VIs are suitable for assessing chlorophyll content. However, at higher LAI values, some VIs had no significant correlation with either LCC or CCC. Based on linear regression, NDRE for less developed vegetation (LAI < 2.7), as well as NDRE, CI<sub>RE</sub> or SR<sub>RE</sub> for closed vegetation (LAI > 2.7), are recommended for monitoring chlorophyll content when the LAI of the vegetation is known and the CCC can therefore be derived.
We conclude that drone imagery may greatly assist farmers in observing biophysical characteristics, but is limited for observing chlorophyll status within crops with closed vegetation.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"11 ","pages":"Article 100894"},"PeriodicalIF":6.3,"publicationDate":"2025-03-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143685751","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
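The red-edge indices recommended in that abstract (NDRE, CI<sub>RE</sub>, SR<sub>RE</sub>) have standard formulations in terms of near-infrared and red-edge reflectance; the reflectance values below are illustrative, not from the study:

```python
# Standard red-edge vegetation index formulations. The band reflectance
# values used in the example call are illustrative only.

def ndre(nir, red_edge):
    """Normalized Difference Red Edge index."""
    return (nir - red_edge) / (nir + red_edge)

def ci_re(nir, red_edge):
    """Red-edge Chlorophyll Index."""
    return nir / red_edge - 1.0

def sr_re(nir, red_edge):
    """Red-edge Simple Ratio."""
    return nir / red_edge

# e.g. reflectances of 0.45 (NIR) and 0.30 (red edge)
nir, re_band = 0.45, 0.30
```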
{"title":"Artificial intelligence applied to precision livestock farming: A tertiary study","authors":"Damiano Distante , Chiara Albanello , Hira Zaffar , Stefano Faralli , Domenico Amalfitano","doi":"10.1016/j.atech.2025.100889","DOIUrl":"10.1016/j.atech.2025.100889","url":null,"abstract":"<div><div>Recent advances in Artificial Intelligence (AI) are transforming the livestock sector by enabling continuous real-time data monitoring and automated decision support systems. While several secondary studies have explored the application of AI in Precision Livestock Farming (PLF), they often focus on specific AI techniques or particular PLF activities, limiting a broader understanding of the field. This study aims to provide a comprehensive overview of the state-of-the-art of AI applications in PLF, highlighting both achievements and areas that require further investigation. To this end, a tertiary systematic mapping study was conducted following recognized guidelines to ensure reliability and replicability. The research process involved formulating 10 research questions, designing a comprehensive search strategy, and performing a rigorous quality assessment of the identified studies. From an initial pool of 738 retrieved manuscripts, 14 high-quality secondary studies were selected and analyzed. The findings reveal a wide range of AI techniques applied in PLF, particularly in the learning and perception AI domains. These techniques have proven effective in tasks such as animal recognition, abnormality detection, and health and welfare monitoring. However, comparatively less attention has been given to environmental monitoring and sustainability, highlighting an area that warrants further exploration. 
By offering valuable insights for future research and practical applications, this study suggests directions for both researchers and livestock farmers to unlock AI's full potential in PLF.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"11 ","pages":"Article 100889"},"PeriodicalIF":6.3,"publicationDate":"2025-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143637568","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Optimization of irrigation and fertigation in smart agriculture: An IoT-based micro-services framework","authors":"Tommaso Adamo , Danilo Caivano , Lucio Colizzi , Giovanni Dimauro , Emanuela Guerriero","doi":"10.1016/j.atech.2025.100885","DOIUrl":"10.1016/j.atech.2025.100885","url":null,"abstract":"<div><div>Efficient management of water and fertilizer resources is crucial for achieving sustainability and productivity in agriculture. This paper presents an AI-powered microservices solution that optimizes irrigation and fertigation practices. The proposed system integrates IoT nodes for real-time data collection on environmental conditions, soil moisture levels, and crop nutrient needs. Fertigation and irrigation decision-making are modeled as a data-driven sequential decision problem. At each decision stage, real-time data serve as input to an AI planning model aimed at satisfying nutrient and water demands while minimizing water and fertilizer waste. The system allows supervision by the farmer through a mobile app and a Digital Twin, enabling the design of crop planting layouts and providing detailed information on real-time decisions implemented in the field, as well as water and fertilizer consumption. The proposed solution manages diverse crop species with distinct water and nutrient requirements. Efficient data exchange is facilitated through a push-pull communication paradigm between the IoT nodes and cloud services.
This approach offers several benefits, including greater control over data flow, energy savings, and increased flexibility in resource management.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"11 ","pages":"Article 100885"},"PeriodicalIF":6.3,"publicationDate":"2025-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143620831","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"WeedsSORT: A weed tracking-by-detection framework for laser weeding applications within precision agriculture","authors":"Tao Jin, Kun Liang, Mengxuan Lu, Yingshuai Zhao, Yangrui Xu","doi":"10.1016/j.atech.2025.100883","DOIUrl":"10.1016/j.atech.2025.100883","url":null,"abstract":"<div><div>In precision agriculture, the application of artificial intelligence and high-power laser technology for weed control offers significant efficiency and accuracy advantages. However, it still encounters numerous challenges in the detection and tracking of weed targets. In terms of object detection, the variability in the size and specifications of weeds can result in the missed detection of smaller weed targets. Regarding tracking prediction, the similarity in weed shapes may reduce pose estimation accuracy, and the random motion of cameras within laser weeding systems further increases the risk of tracking failures. To address these challenges, this study introduces a spatial attention mechanism to enhance weed detection accuracy. It employs optimized multi-feature layer extraction and optimal feature matching algorithms to derive motion estimation results. Finally, an adaptive extended Kalman filtering algorithm is integrated to establish a weed tracking algorithm that correlates temporal and spatial information, achieving rapid and precise detection and tracking of weeds in laser weeding scenarios. The detection accuracy of the optimized algorithm was tested on both publicly available datasets and self-collected detection datasets, achieving a mean Average Precision (mAP) of 97.29% and 85.83%, respectively. Furthermore, tracking performance was evaluated using the LettuceMOT dataset and the self-collected WeedsMOT dataset, demonstrating improvements in Higher-Order Tracking Accuracy (HOTA) of 12.01% and 8.75% compared to the ByteTrack and DeepOCSORT algorithms.
The experimental findings substantiate the efficacy of the proposed weed detection and tracking algorithm, offering a valuable reference for the progression of laser weeding technology within precision agriculture.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"11 ","pages":"Article 100883"},"PeriodicalIF":6.3,"publicationDate":"2025-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143610602","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
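The adaptive extended Kalman filter at the core of that tracker builds on the standard predict/update recursion. A minimal one-dimensional, constant-velocity sketch (a plain linear Kalman filter, not the paper's adaptive extended variant with feature matching) illustrates the cycle:

```python
# Minimal 1-D constant-velocity Kalman filter: one predict/update cycle per
# measurement. This is a plain linear KF for illustration; the paper uses an
# adaptive *extended* Kalman filter with multi-feature matching.

def kf_step(x, v, p, z, dt=1.0, q=0.01, r=0.1):
    """One cycle; x = position, v = velocity (kept fixed here for simplicity),
    p = position variance, z = noisy position measurement."""
    # predict: propagate state and inflate uncertainty by process noise q
    x_pred = x + v * dt
    p_pred = p + q
    # update: blend prediction with measurement via the Kalman gain
    k = p_pred / (p_pred + r)          # gain in [0, 1]
    x_new = x_pred + k * (z - x_pred)
    p_new = (1 - k) * p_pred           # uncertainty shrinks after each update
    return x_new, v, p_new

# Track a target moving at ~1 unit/frame through three noisy detections
x, v, p = 0.0, 1.0, 1.0
for z in [1.1, 2.0, 2.9]:
    x, v, p = kf_step(x, v, p, z)
```

After three updates the estimate converges toward the true trajectory while the variance `p` shrinks well below its initial value.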
{"title":"Detecting tasseling rate of breeding maize using UAV-based RGB images and STB-YOLO model","authors":"Boyi Tang , Jingping Zhou , XiaoLan Li , Yuchun Pan , Yao Lu , Chang Liu , Kai Ma , Xuguang Sun , Dong Chen , Xiaohe Gu","doi":"10.1016/j.atech.2025.100893","DOIUrl":"10.1016/j.atech.2025.100893","url":null,"abstract":"<div><div>In regions with limited light and temperature, detecting the tasseling rate of maize is crucial for optimizing water and fertilizer management, adjusting harvest schedules, and screening suitable varieties. Unmanned Aerial Vehicle (UAV) imaging technology offers a rapid method for detecting the maize tasseling rate. This study proposes a new detection model, STB-YOLO, based on YOLOv8, for detecting maize tasseling rate. First, we introduced Swin Transformer blocks in the downsampling process to enhance semantic feature extraction from UAV-based RGB images. Subsequently, the Bidirectional Feature Pyramid Network is employed during the Concat fusion process, which improves the model's ability to accurately and robustly detect targets of varying scales in images with different tasseling rates. Finally, a series of deep learning algorithms are compared and analyzed, and the model is examined in detail through an ablation experiment. The results show that at imaging heights of 15 m and 30 m, STB-YOLO achieved a precision of 76.2 % and 72.1 %, respectively, an improvement of 6.5 and 11.7 percentage points over YOLOv8 and YOLOv6, respectively. The precision of tasseling rate in the test datasets reaches 78.48 % and 73.22 %, with R² of 0.71 and 0.69, respectively. The precision increases as the tasseling rate increases; when the tasseling rate exceeds 80 %, the precision reaches 93.44 % and 87.01 %, respectively.
Therefore, applying the STB-YOLO deep learning algorithm to UAV imagery facilitates accurate detection of tasseling rates of breeding maize.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"11 ","pages":"Article 100893"},"PeriodicalIF":6.3,"publicationDate":"2025-03-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143685752","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Transfer and deep learning models for daily reference evapotranspiration estimation and forecasting in Spain from local to national scale","authors":"Yu Ye, Aurora González-Vidal, Miguel A. Zamora-Izquierdo, Antonio F. Skarmeta","doi":"10.1016/j.atech.2025.100886","DOIUrl":"10.1016/j.atech.2025.100886","url":null,"abstract":"<div><div>Accurate estimation and forecasting of Reference Evapotranspiration (<span><math><mi>E</mi><msub><mrow><mi>T</mi></mrow><mrow><mn>0</mn></mrow></msub></math></span>) is critical for almost all agricultural activities and water resource management. However, the most commonly used Penman-Monteith model (FAO56-PM) requires a large amount of input data and it is difficult to compute for general users. Machine Learning (ML) techniques can be used to address this shortcoming. Nevertheless, most studies are site-specific and lack generalizability. This study compares standard ML and Deep Learning (DL) algorithms for estimating and forecasting daily <span><math><mi>E</mi><msub><mrow><mi>T</mi></mrow><mrow><mn>0</mn></mrow></msub></math></span> at different spatial scales in Spain. While Transfer Learning (TL) is a well-established ML technique, its application in <span><math><mi>E</mi><msub><mrow><mi>T</mi></mrow><mrow><mn>0</mn></mrow></msub></math></span> computation remains largely unexplored. We applied TL in a novel approach to retrain DL models, enabling adaptation to diverse local climatic conditions, which is particularly important in this domain. All possible combinations of FAO56-PM inputs were evaluated. The results showed that with three or more climatic variables, the TL process can consistently reduce errors by using an appropriate amount of new data to retrain the models. In estimation, with 20% (120 days) of new data, TL models can provide the same performance as if they were trained with local data, both regionally and nationally (improvement of MAE from 26.4% to 99.5%). 
During forecasting, we used predicted weather data as input, and despite inherent biases in some variables, the TL models successfully adapted using 9-36 days of new data, significantly improving predictive performance (MAE improvements ranging from -1.1% to 134.3%). Thus, the TL process is highly recommended as a promising methodology for increasing the generalization capability of DL models in both daily <span><math><mi>E</mi><msub><mrow><mi>T</mi></mrow><mrow><mn>0</mn></mrow></msub></math></span> estimation and forecasting under diverse climatic conditions with limited local data.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"11 ","pages":"Article 100886"},"PeriodicalIF":6.3,"publicationDate":"2025-03-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143637567","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
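The FAO56-PM model that abstract refers to is the standard daily Penman-Monteith reference-evapotranspiration equation, which indeed demands many inputs. A sketch of the daily form (the forcing values in the example call are illustrative, not data from the study):

```python
import math

# Sketch of the daily FAO56 Penman-Monteith reference evapotranspiration
# equation. Inputs in the example call (temperature, net radiation, wind,
# humidity, elevation) are illustrative values only.

def svp(t):
    """Saturation vapour pressure (kPa) at air temperature t (deg C)."""
    return 0.6108 * math.exp(17.27 * t / (t + 237.3))

def et0_fao56pm(t_mean, rn, g, u2, rh_mean, elev):
    """Daily ET0 (mm/day): t_mean in deg C, rn and g in MJ/m2/day,
    u2 = wind speed at 2 m (m/s), rh_mean in %, elev in m."""
    delta = 4098.0 * svp(t_mean) / (t_mean + 237.3) ** 2       # SVP curve slope
    pressure = 101.3 * ((293.0 - 0.0065 * elev) / 293.0) ** 5.26
    gamma = 0.000665 * pressure                                 # psychrometric constant
    es = svp(t_mean)
    ea = es * rh_mean / 100.0                                   # actual vapour pressure
    num = 0.408 * delta * (rn - g) + gamma * (900.0 / (t_mean + 273.0)) * u2 * (es - ea)
    return num / (delta + gamma * (1.0 + 0.34 * u2))

# e.g. a warm, dry day at 100 m elevation -> roughly 6 mm/day
et0 = et0_fao56pm(t_mean=25.0, rn=15.0, g=0.0, u2=2.0, rh_mean=50.0, elev=100.0)
```

The number of distinct meteorological inputs here is exactly why the study explores ML/TL surrogates trained on subsets of these variables.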
{"title":"A new, low-cost ground-based NDVI sensor for manual and automated crop monitoring","authors":"Reena Macagga , Geoffroy Sossa , Yvonne Ayaribil , Rinan Bayot , Pearl Sanchez , Jürgen Augustin , Sonoko Dorothea Bellingrath-Kimura , Mathias Hoffmann","doi":"10.1016/j.atech.2025.100892","DOIUrl":"10.1016/j.atech.2025.100892","url":null,"abstract":"<div><div>Ground-based normalized difference vegetation index (NDVI) sensors are vital for accurate, localized crop condition and growth assessments, but their high cost and labor-intensive operation limit accessibility. To close this gap, this study presents a low-cost NDVI sensor priced under €250, offering an affordable yet high-accuracy crop monitoring tool. The device has dual functionality, operating in both manual (handheld) and automatic (standalone) modes, enabling continuous crop monitoring with higher temporal resolution and reduced labor costs. This study also identified and corrected the underestimation of measurements at higher NDVI values through sensor calibration. Subsequent field validation confirmed the accuracy of the low-cost sensor, showing good overall agreement with results obtained with the reference sensor (<em>r²</em> = 0.99) after applying the derived calibration function. Extended field trials in Benin and the Philippines demonstrated the reliability of the device in monitoring treatment differences in crop development and biomass accumulation. Further customization into automatic mode enabled continuous, high-frequency NDVI measurements, showing its ability to monitor crop phenological changes, such as senescence, in additional field testing in Germany. Overall, this study demonstrates that the developed NDVI sensor device, made from affordable, off-the-shelf components, can be adapted into a scientifically usable NDVI sensor that is accurate, reliable, and cost-effective.
It offers a viable alternative to expensive in-field monitoring systems and promotes accessibility to ground-based crop monitoring solutions, especially for research in the Global South.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"11 ","pages":"Article 100892"},"PeriodicalIF":6.3,"publicationDate":"2025-03-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143620829","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
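The two processing steps that abstract describes, computing NDVI from red and NIR readings and then correcting high-NDVI underestimation against a reference sensor, can be sketched as follows (the calibration coefficients are hypothetical placeholders, not the paper's derived function):

```python
# Sketch of a ground-based NDVI pipeline: index computation from band
# readings, then a linear calibration correction. The gain/offset values
# are hypothetical, NOT the calibration function derived in the study.

def ndvi(red, nir):
    """Normalized difference vegetation index from band reflectances."""
    return (nir - red) / (nir + red)

def calibrate(raw_ndvi, gain=1.12, offset=-0.02):
    """Linear correction for underestimation at high NDVI (hypothetical
    coefficients); result clamped to the valid NDVI range [-1, 1]."""
    return min(1.0, max(-1.0, gain * raw_ndvi + offset))

raw = ndvi(red=0.08, nir=0.52)   # healthy canopy: low red, high NIR
corrected = calibrate(raw)
```

In automatic (standalone) mode, the same two functions would simply run on a timer over logged band readings to produce the continuous NDVI series described above.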