Plant Methods Pub Date: 2025-06-04 DOI: 10.1186/s13007-025-01400-w
Michelle Robin, Flavia Machado Durgante, Caroline Lorenci Mallmann, Hilana Louise Hadlich, Christine Römermann, Lucas de Souza Falcão, Caroline Dutra Lacerda, Sérgio Duvoisin, Florian Wittmann, Maria Teresa Fernandez Piedade, Jochen Schöngart, Eliane Gomes Alves
{"title":"Leaf spectroscopy as a tool for predicting the presence of isoprene emissions and terpene storage in central Amazon forest trees.","authors":"Michelle Robin, Flavia Machado Durgante, Caroline Lorenci Mallmann, Hilana Louise Hadlich, Christine Römermann, Lucas de Souza Falcão, Caroline Dutra Lacerda, Sérgio Duvoisin, Florian Wittmann, Maria Teresa Fernandez Piedade, Jochen Schöngart, Eliane Gomes Alves","doi":"10.1186/s13007-025-01400-w","DOIUrl":"10.1186/s13007-025-01400-w","url":null,"abstract":"<p><strong>Background: </strong>Volatile isoprenoids (VIs), such as isoprene, monoterpenes, and sesquiterpenes, participate in various forest-atmosphere processes ranging from plant cell regulation to atmospheric particle formation. The Amazon Forest is the greatest and most diverse source of VI emissions, but the lack of leaf-level studies and the logistical challenges of measuring in such remote and highly biodiverse sites bring high levels of uncertainty to modeled emission estimates. Studies indicate that leaf spectroscopy is an effective tool for estimating leaf morphological, physiological, and chemical traits, being a promising tool for more easily assessing VI emissions from vegetation. In this study, we tested the ability of leaf reflectance spectroscopy to predict the presence of VI emissions and storage in central Amazon Forest trees. We measured leaf-level isoprene emission capacity (E<sub>c</sub>; emission measured at standard conditions: light of 1000 µmol m<sup>- 2</sup> s<sup>- 1</sup> photosynthetically active radiation and leaf temperature of 30 ˚C), stored monoterpene and sesquiterpene contents, and hyperspectral visible to short-wave infrared (VSWIR) reflectance from dry and fresh leaves of 175 trees from 124 species of angiosperms.</p><p><strong>Results: </strong>We found that dry leaf hyperspectral reflectance data, and fresh leaf reflectance measured at selected wavelengths (616, 694, and 1155 nm), predicted the presence of isoprene emissions with accuracies of 0.67 and 0.72, respectively. Meanwhile, fresh leaf hyperspectral reflectance data predicted monoterpene and sesquiterpene storage with accuracies of 0.65 and 0.67, respectively.</p><p><strong>Conclusions: </strong>Our results indicate the possibility of using spectral readings from botanical collections or field inventories to orient sampling efforts toward potential isoprene-emitting or terpene-storing trees, or to identify key spectral features (most informative selected wavelengths) for potential future incorporation into remote sensing models. The use of spectral tools for detecting potential isoprene-emitting and terpene-storing species can help to improve current VI emission datasets, reduce modeling emission uncertainties, and contribute to a better understanding of the roles of VIs within forest-atmosphere interactions, atmospheric chemistry, and the carbon cycle.</p>","PeriodicalId":20100,"journal":{"name":"Plant Methods","volume":"21 1","pages":"78"},"PeriodicalIF":4.7,"publicationDate":"2025-06-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12135534/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144216579","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"生物学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"In situ nondestructive identification of citrus fruit ripeness via hyperspectral imaging technology.","authors":"Qi Wang, Jinzhu Lu, Yuanhong Wang, Fajun Miao, Senping Liu, Qiyang Shui, Junfeng Gao, Yingwang Gao","doi":"10.1186/s13007-025-01354-z","DOIUrl":"10.1186/s13007-025-01354-z","url":null,"abstract":"<p><p>Rapid and accurate assessment of the citrus ripening stage in the field is important for determining harvest timing and improving industrial economic efficiency; however, the lack of effective nondestructive detection methods in the current orchard leads to flaws in ripening stage assessment, which affects harvesting decisions. To solve this problem, this study utilized hyperspectral technology to collect data from 22 fruit trees in an orchard (in the range of 400-1000 nm) and explored the effectiveness of five regions of interest selection methods (x-axis, y-axis, four-quadrant, threshold segmentation, and raw) for the delineation of the citrus ripening stage. The data quality was enhanced via wavelet transform (WT)-multiple scattering correction (MSC) preprocessing, and the effective wavelengths were extracted via the successive projections algorithm (SPA). On the basis of these wavelengths, backpropagation neural network (BP) and convolutional neural network (CNN) models were built for maturity prediction. The results show that the x-axis region of interest selection method outperforms the other methods, and the SPA-BP model based on this method performs best. An accuracy of 99.19% for the correction set and 100% for the prediction set was achieved when only 0.03% of the wavelength was used. This groundbreaking study highlights the significant potential of hyperspectral technology for in situ assessment of citrus ripening stages. Furthermore, it offers crucial technical support and serves as a valuable reference for the advancement of precision agriculture.</p>","PeriodicalId":20100,"journal":{"name":"Plant Methods","volume":"21 1","pages":"77"},"PeriodicalIF":4.7,"publicationDate":"2025-06-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12131661/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144216578","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"生物学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"OpenPheno: an open-access, user-friendly, and smartphone-based software platform for instant plant phenotyping.","authors":"Tianqi Hu, Peng Shen, Yongshuai Zhang, Jiafei Zhang, Xin Li, Chuanzhen Xia, Ping Liu, Hao Lu, Tingting Wu, Zhiguo Han","doi":"10.1186/s13007-025-01395-4","DOIUrl":"10.1186/s13007-025-01395-4","url":null,"abstract":"<p><strong>Background: </strong>Plant phenotyping has become increasingly important for advancing plant science, agriculture, and biotechnology. Classic manual methods are labor-intensive and time-consuming, while existing computational tools often require advanced coding skills, high-performance hardware, or PC-based environments, making them inaccessible to non-experts, to resource-constrained users, and to field technicians.</p><p><strong>Results: </strong>To respond to these challenges, we introduce OpenPheno, an open-access, user-friendly, and smartphone-based platform encapsulated within a WeChat Mini-Program for instant plant phenotyping. The platform is designed for ease of use, enabling users to phenotype plant traits quickly and efficiently with only a smartphone at hand. We currently instantiate the use of the platform with tools such as SeedPheno, WheatHeadPheno, LeafAnglePheno, SpikeletPheno, CanopyPheno, TomatoPheno, and CornPheno; each offering specific functionalities such as seed size and count analysis, wheat head detection, leaf angle measurement, spikelet counting, canopy structure analysis, and tomato fruit measurement. In particular, OpenPheno allows developers to contribute new algorithmic tools, further expanding its capabilities to continuously facilitate the plant phenotyping community.</p><p><strong>Conclusions: </strong>By leveraging cloud computing and a widely accessible interface, OpenPheno democratizes plant phenotyping, making advanced tools available to a broader audience, including plant scientists, breeders, and even amateurs. It can function as a role in AI-driven breeding by providing the necessary data for genotype-phenotype analysis, thereby accelerating breeding programs. Its integration with smartphones also positions OpenPheno as a powerful tool in the growing field of mobile-based agricultural technologies, paving the way for more efficient, scalable, and accessible agricultural research and breeding.</p>","PeriodicalId":20100,"journal":{"name":"Plant Methods","volume":"21 1","pages":"76"},"PeriodicalIF":4.7,"publicationDate":"2025-06-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12131570/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144209157","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"生物学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"CMSAF-Net: integrative network design with enhanced decoder for precision segmentation of pear leaf diseases.","authors":"Jie Ding, Wenwen Xu, Xin Shu, Wenyu Wang, Shuxia Chen, Yunzhi Wu","doi":"10.1186/s13007-025-01392-7","DOIUrl":"10.1186/s13007-025-01392-7","url":null,"abstract":"<p><p>Pear leaf diseases represent one of the major challenges in agriculture, significantly affecting fruit quality and reducing overall yield. With the advancement of precision agriculture, accurate identification and segmentation of diseased areas are critical for targeted disease management and optimizing crop production. To address these issues, this study proposes a novel segmentation model, CMSAF-Net, for pear leaf diseases. CMSAF-Net integrates a Multi-scale Convolutional Attention Module (MBCA), a Self-adaptive Attention-augmented Upsampling Module (SAUP), and a Cross-layer Feature Alignment Module (CGAG) to enhance feature extraction, preserve edge information in complex disease regions, and optimize cross-layer information fusion. Additionally, CMSAF-Net incorporates pre-trained weights to leverage prior knowledge, accelerating convergence and improving segmentation accuracy. On a self-constructed dataset containing three types of pear leaf diseases, experimental results demonstrate that CMSAF-Net achieves 88.65%, 93.36%, and 93.86% in key metrics of MIoU, MPA, and Dice, respectively. Compared with mainstream models such as Unet++, DeepLabv3+, U <math><mmultiscripts><mrow></mrow> <mrow></mrow> <mn>2</mn></mmultiscripts> </math> -Net, and TransUNet, CMSAF-Net exhibits significant performance improvements, with MIoU increases of 2.45%, 3.86%, 2.21%, and 8.28%, respectively. This study highlights CMSAF-Net's potential for large-scale disease monitoring in intelligent agriculture, providing an efficient segmentation solution with substantial theoretical and practical implications.</p>","PeriodicalId":20100,"journal":{"name":"Plant Methods","volume":"21 1","pages":"74"},"PeriodicalIF":4.7,"publicationDate":"2025-05-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12124004/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144187572","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"生物学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Plant Methods Pub Date: 2025-05-30 DOI: 10.1186/s13007-025-01398-1
Haixia Li, Qian Li, Chunlai Yu, Shanjun Luo
{"title":"Unified estimation of rice canopy leaf area index over multiple periods based on UAV multispectral imagery and deep learning.","authors":"Haixia Li, Qian Li, Chunlai Yu, Shanjun Luo","doi":"10.1186/s13007-025-01398-1","DOIUrl":"10.1186/s13007-025-01398-1","url":null,"abstract":"<p><strong>Background: </strong>Rice is one of the major food crops in the world, and the monitoring of its growth condition is of great significance for guaranteeing food security and promoting sustainable agricultural development. Leaf area index (LAI) is a key indicator for assessing the growth condition and yield potential of rice, and the traditional methods for obtaining LAI have problems such as low efficiency and large error. With the development of remote sensing technology, unmanned aerial multispectral remote sensing combined with deep learning technology provides a new way for efficient and accurate estimation of LAI in rice.</p><p><strong>Results: </strong>In this study, a multispectral camera mounted on a UAV was utilized to acquire rice canopy image data, and rice LAI was uniformly estimated over multiple periods by the multilayer perceptron (MLP) and convolutional neural network (CNN) models in deep learning. The results showed that the CNN model based on five-band reflectance images (490, 550, 670, 720, and 850 nm) as input after feature screening exhibited high estimation accuracy at different growth stages. Compared with the traditional MLP model with multiple vegetation indices as inputs, the CNN model could better process the original multispectral image data, effectively avoiding the problem of vegetation index saturation, and improving the accuracies by 4.89, 5.76, 10.96, 1.84 and 6.01% in the rice tillering, jointing, booting, and heading periods, respectively, and the overall accuracy was improved by 6.01%. Moreover, the model accuracies (MLP and CNN) before and after variable screening showed noticeable changes. Conducting variable screening contributed to a substantial improvement in the accuracy of rice LAI estimation.</p><p><strong>Conclusions: </strong>UAV multispectral remote sensing combined with CNN technology provides an efficient and accurate method for the unified multi-period estimation of rice LAI. Moreover, the generalization ability and adaptability of the model were further improved by rational variable screening and data enhancement techniques. This study can provide a technical support for precision agriculture and a more accurate solution for rice growth monitoring. More feature extraction and variable screening methods can be further explored in future studies by optimizing the model structure to improve the accuracy and stability of the model.</p>","PeriodicalId":20100,"journal":{"name":"Plant Methods","volume":"21 1","pages":"73"},"PeriodicalIF":4.7,"publicationDate":"2025-05-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12123809/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144182631","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"生物学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Plant Methods Pub Date: 2025-05-30 DOI: 10.1186/s13007-025-01393-6
Guangming Li, Hongyi Ge, Yuying Jiang, Yuan Zhang, Xi Jin
{"title":"Non-destructive detection of early wheat germination via deep learning-optimized terahertz imaging.","authors":"Guangming Li, Hongyi Ge, Yuying Jiang, Yuan Zhang, Xi Jin","doi":"10.1186/s13007-025-01393-6","DOIUrl":"10.1186/s13007-025-01393-6","url":null,"abstract":"<p><p>Wheat, a major global cereal crop, is prone to quality degradation from early sprouting when stored improperly, resulting in substantial economic losses. Traditional methods for detecting early sprouting are labor-intensive and destructive, underscoring the need for rapid, non-destructive alternatives. Terahertz (THz) technology provides a promising solution due to its ability to perform non-invasive imaging of internal structures. However, current THz imaging techniques are limited by low image resolution, which restricts their practical application. We address these challenges by proposing an advanced deep learning framework for THz image classification of early sprouting wheat. We first develop an Enhanced Super-Resolution Generative Adversarial Network (AESRGAN) to improve the resolution of THz images, integrating an attention mechanism to focus on critical image regions. This model achieves a 0.76 dB improvement in Peak Signal-to-Noise Ratio (PSNR). Subsequently, we introduce the EfficientViT-based YOLO V8 classification model, incorporating a Depthwise Separable Attention (C2F-DSA) module, and further optimize the model using the Gazelle Optimization Algorithm (GOA). Experimental results demonstrate the GOA-EViTDSA-YOLO model achieves an accuracy of 97.5% and a mean Average Precision (mAP) of 0.962. The model is efficient and significantly enhances the classification of early sprouting wheat compared to other deep learning models.</p>","PeriodicalId":20100,"journal":{"name":"Plant Methods","volume":"21 1","pages":"75"},"PeriodicalIF":4.7,"publicationDate":"2025-05-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12125745/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144187573","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"生物学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Mobile based deep CNN model for maize leaf disease detection and classification.","authors":"Getnet Tigabie Askale, Achenef Behulu Yibel, Belayneh Matebie Taye, Gashaw Desalegn Wubneh","doi":"10.1186/s13007-025-01386-5","DOIUrl":"10.1186/s13007-025-01386-5","url":null,"abstract":"<p><p>Maize is the most produced crop in the world, exceeding wheat and rice production. However, its yield is often affected by various leaf diseases. Early identification of maize leaf disease through easily accessible tool is required to increase the yield of maize. Recently, researchers have attempted to detect and classify maize leaf diseases using Deep Learning algorithms. However, to the best of the researcher's knowledge, nearly all the studies are concentrated on developing an offline model that can detect maize diseases. But, those models are not easily accessible to individuals and don't provide immediate feedback and monitoring. Thus, in this study, we developed a novel real-time, user-friendly maize leaf disease detection and classification mobile application. The VGG16, AlexNet, and ResNet50 models were implemented and compared their performance on maize disease detection and classification. A total of 4188 images of blight, common_rust, grey_leaf_spot, and healthy were used to train each model. Data augmentation techniques were applied to the dataset to increase the size of the dataset, which can also reduce model overfitting. Weighted cross-entropy loss was also employed to mitigate class-imbalance problems. After training, VGG16 achieved 95% of testing accuracy, AlexNet achieved 91%, and ResNet50 achieved 72% of testing accuracy. The VGG16 model outperformed the other models in terms of accuracy. Consequently, we deployed the VGG16 model into a mobile application to provide real-time disease detection and classification tool for farmers, extension officers, agribusiness managers, and policy-makers. The developed application will enhance early disease detection, decision making, and contribute to better crop management and food security.</p>","PeriodicalId":20100,"journal":{"name":"Plant Methods","volume":"21 1","pages":"72"},"PeriodicalIF":4.7,"publicationDate":"2025-05-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12121153/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144182252","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"生物学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Plant Methods Pub Date: 2025-05-27 DOI: 10.1186/s13007-025-01390-9
Zhi-Yong Wang, Cui-Ping Zhang
{"title":"An improved chilli pepper flower detection approach based on YOLOv8.","authors":"Zhi-Yong Wang, Cui-Ping Zhang","doi":"10.1186/s13007-025-01390-9","DOIUrl":"10.1186/s13007-025-01390-9","url":null,"abstract":"<p><p>Artificial pollination can considerably improve pollination success and boost chilli pepper fruit set and quality when grown in enclosed environments (e.g., greenhouses). Artificial pollination, on the other hand, raises production costs while also necessitating specific operating abilities. The precise and efficient identification of pepper blossoms is a critical step in the development of robotic pollinators or pollination drones. In this paper, we propose a pepper flower detection method based on YOLOv8 that incorporates multi-scale, attention, and conditional information. To begin, the CBAM structure that incorporates edge information is integrated into Backbone to expand the feature extraction receptive field and facilitate the learning of long-distance dependency. The BERT model is then used to encode conditional information, which is integrated into the backbone via the ELAN layer to assist the training and inference processes. Finally, an improved MPDIoU is applied to increase detection accuracy while increasing flexibility. The experimental results show that the modification enhances the network depth and reduces the number of parameters from 4M to 2.85M, while improving the mean average accuracy (mAP) by 3.1% over the baseline approach. The study's findings can help in crop object detection. The chilli pepper flower dataset: https://drive.google.com/file/d/1cKNie_iAzx-K4iPLQRVdyiOKV1d9zHrF/view?usp=drive_link The source code is available in https://drive.google.com/drive/folders/1ubNnKu7PWYAdUXvbs4Z2OBAVcSAQ3WLd?usp=drive_link .</p>","PeriodicalId":20100,"journal":{"name":"Plant Methods","volume":"21 1","pages":"71"},"PeriodicalIF":4.7,"publicationDate":"2025-05-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12107810/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144161292","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"生物学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Plant Methods Pub Date: 2025-05-26 DOI: 10.1186/s13007-025-01387-4
Xuewei Wang, Jun Liu, Qian Chen
{"title":"An advanced deep learning method for pepper diseases and pests detection.","authors":"Xuewei Wang, Jun Liu, Qian Chen","doi":"10.1186/s13007-025-01387-4","DOIUrl":"10.1186/s13007-025-01387-4","url":null,"abstract":"<p><p>Despite the significant progress in deep learning-based object detection, existing models struggle to perform optimally in complex agricultural environments. To address these challenges, this study introduces YOLO-Pepper, an enhanced model designed specifically for greenhouse pepper disease and pest detection, overcoming three key obstacles: small target recognition, multi-scale feature extraction under occlusion, and real-time processing demands. Built upon YOLOv10n, YOLO-Pepper incorporates four major innovations: (1) an Adaptive Multi-Scale Feature Extraction (AMSFE) module that improves feature capture through multi-branch convolutions; (2) a Dynamic Feature Pyramid Network (DFPN) enabling context-aware feature fusion; (3) a specialized Small Detection Head (SDH) tailored for minute targets; and (4) an Inner-CIoU loss function that enhances localization accuracy by 18% compared to standard CIoU. Evaluated on a diverse dataset of 8046 annotated images, YOLO-Pepper achieves state-of-the-art performance, with 94.26% mAP@0.5 at 115.26 FPS, marking an 11.88 percentage point improvement over YOLOv10n (82.38% mAP@0.5) while maintaining a lightweight structure (2.51 M parameters, 5.15 MB model size) optimized for edge deployment. Comparative experiments highlight YOLO-Pepper's superiority over nine benchmark models, particularly in detecting small and occluded targets. By addressing computational inefficiencies and refining small object detection capabilities, YOLO-Pepper provides robust technical support for intelligent agricultural monitoring systems, making it a highly effective tool for early disease detection and integrated pest management in commercial greenhouse operations.</p>","PeriodicalId":20100,"journal":{"name":"Plant Methods","volume":"21 1","pages":"70"},"PeriodicalIF":4.7,"publicationDate":"2025-05-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12107738/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144151444","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"生物学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Improved estimation of forage nitrogen in alpine grassland by integrating Sentinel-2 and SIF data.","authors":"Yongkang Zhang, Jinlong Gao, Dongmei Zhang, Tiangang Liang, Zhiwei Wang, Xuanfan Zhang, Zhanping Ma, Jinhuan Yang","doi":"10.1186/s13007-025-01389-2","DOIUrl":"10.1186/s13007-025-01389-2","url":null,"abstract":"<p><p>Nitrogen is an essential element for the growth and reproduction of vegetation in alpine grasslands and plays a vital role in determining the nutrient-carrying capacity of plants and maintaining the balance of forage nutrition supply and demand. In recent years, the widespread application of high-resolution multispectral satellites (i.e., Sentinel-2) equipped with multiple red-edge bands has proven an effective approach for estimating forage nitrogen content in alpine grassland habitats. In addition, solar-induced chlorophyll fluorescence (SIF), as a direct probe of vegetation photosynthesis, has become an effective indicator for estimating key growth parameters of green vegetation in recent years. However, it currently unknown whether integrating SIF and Sentinel-2 satellite data can further enhance the mapping accuracy of forage nitrogen content in alpine grassland. In this study, we integrates SIF products from TanSat and Orbiting Carbon Observatory-2 (OCO-2) satellites, Sentinel-2 Multi-Spectral Instrument (MSI) data with derived vegetation indices, and field observations across phenological stages (green-up stage, vigorous growth stage, and senescence stage) in northeastern Tibetan Plateau alpine grasslands to develop support vector machine (SVM), gaussian process regression (GPR), and artificial neural network (ANN) models for regional-scale forage nitrogen estimation. The results indicated that both the Sentinel-2 (V-R<sup>2</sup> of 0.68-0.71, CVRMSE of 17.73-18.65%) and SIF data (V-R<sup>2</sup> of 0.59-0.73, CVRMSE of 17.05-21.40%) individually yielded relatively accurate estimates of the forage nitrogen. The integrated model constructed using both spectral data types explained 69-74% of the variation in forage nitrogen content, with a CVRMSE ranging from 16.89 to 17.85%, which indicates that the synergy between Sentinel-2 and SIF data can slightly enhance the model's estimation capability of forage nitrogen content. Thus, integrating Sentinel-2 and SIF data presents a potential solution for precisely measuring spatial distribution of forage nitrogen in alpine grassland at the regional scale. The proposed method provides a feasible framework for the spatiotemporal prediction of the key forage growth parameters of forage and offers a theoretical basis for determining the rational utilization of grassland resources and studying the nutritional balance between grassland and livestock.</p>","PeriodicalId":20100,"journal":{"name":"Plant Methods","volume":"21 1","pages":"69"},"PeriodicalIF":4.7,"publicationDate":"2025-05-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12102836/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144132468","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"生物学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}