Drone photogrammetry reveals contrasting body conditions of dugongs across the Indo-Pacific
Camille Goudalier, David Mouillot, Léa Bernagou, Taha Boksmati, Caulvyn Bristol, Harry Clark, Sekar M.C. Herandarudewi, Régis Hocdé, Anna Koester, Ashlie J. McIvor, Dhivya Nair, Muhammad Rizki Nandika, Louisa Ponnampalam, Achmad Sahri, Evan Trotzuk, Nur Abidah Zaaba, Laura Mannocci
Remote Sensing in Ecology and Conservation | DOI: 10.1002/rse2.70016 | Published 2025-06-23

Abstract: Monitoring body condition, which reflects the state of individuals' energetic reserves, can provide early warning signals of population decline and facilitate prompt conservation action. However, the environmental and anthropogenic drivers of body condition are poorly known for rare and elusive marine mammal species across their entire ranges. We assessed the global patterns and drivers of body condition for the endangered dugong (Dugong dugon) across its Indo-Pacific range. To do so, we applied the body condition index (BCI) developed for the related manatee, based on the ratio of umbilical girth (approximated as maximum width times π) to straight body length measured in drone images. To cover the dugong's entire range, we took advantage of drone footage published on social media. Combined with footage from scientific surveys, social media footage provided body condition estimates for 272 individual dugongs across 18 countries. Despite small sample sizes relative to local population sizes, we found that dugong BCI was better (that is, individuals were 'plumper') in New Caledonia, the United Arab Emirates, Australia and Qatar, where populations are the largest globally. Dugong BCI was comparatively poorer in countries hosting very small dugong populations, such as Mozambique, suggesting a link between body condition and population size. Using statistical models, we then investigated potential environmental and anthropogenic drivers of dugong BCI, while controlling for seasonal and individual effects. BCI decreased with human gravity, a variable integrating human pressures on tropical reefs, but increased with GDP per capita, indicating that economic wealth positively affects dugong energetic state. BCI also showed a dome-shaped relationship with marine protected area coverage, suggesting that extensive spatial protection alone is not sufficient to maintain dugongs in good condition. Our study provides the first assessment of dugong body condition through drone photogrammetry, underlining the value of this non-invasive, fast and low-cost approach for monitoring elusive marine mammals.
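The manatee-derived index described above is a simple ratio; a minimal sketch of the computation, assuming width and length are measured in the same units from scaled drone images (the function and the example numbers are illustrative, not taken from the paper):

```python
import math

def body_condition_index(max_width_m: float, body_length_m: float) -> float:
    """Manatee-style body condition index (BCI): umbilical girth
    (approximated as maximum width * pi) divided by straight body length,
    both measured from scaled drone images."""
    girth = max_width_m * math.pi      # girth approximated from max width
    return girth / body_length_m       # 'plumper' animals -> higher BCI

# Illustrative measurements only (not data from the study):
bci = body_condition_index(max_width_m=0.80, body_length_m=2.7)
```

Because both measurements come from the same image, the ratio is insensitive to the exact ground sampling distance, which is what makes opportunistic social media footage usable.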
Interannual spectral consistency and spatial uncertainties in UAV-based detection of boreal and subarctic mire plant communities
Franziska Wolff, Tiina H. M. Kolari, Aleksi Räsänen, Teemu Tahvanainen, Pasi Korpelainen, Miguel Villoslada, Mariana Verdonen, Eliisa Lotsari, Yuwen Pang, Timo Kumpula
Remote Sensing in Ecology and Conservation | DOI: 10.1002/rse2.70017 | Published 2025-06-23

Abstract: Unoccupied aerial vehicle (UAV) imagery is widely used for detailed vegetation modeling and ecosystem monitoring in peatlands. Despite the high resolution of these data, the spatial complexity and heterogeneity of vegetation, along with temporal fluctuations in spectral reflectance, complicate the assessment of spatial patterns in these ecosystems. We used interannual multispectral UAV data, collected at the same time of year, from two aapa and two palsa mires in Finland. We applied Random Forest classification to map plant communities and assessed spectral, temporal and spatial consistency, class relationships and area estimates. Further, we used the class membership probabilities from the classification to derive a secondary classification map, representing the second most likely class label per pixel, and an alternative map to account for spatial uncertainty in area estimates. The accuracies of the primary classifications varied between 66% and 85%. The best results were achieved using interannual data, which improved accuracy by up to 14 percentage points compared to single-year imagery, particularly benefiting classes with lower accuracies. Spectral and temporal inconsistencies in UAV data collected in different years led to variations in the classifications, notably for the Rubus chamaemorus community in palsa mires, likely due to weather fluctuations and phenology. The transformations from primary to secondary classifications in areas of high uncertainty aligned well with the class relationships in the confusion matrix, supporting the model's reliability. Confidence-interval-based adjusted area estimates aligned largely with the unadjusted area estimates of the alternative map. Our findings support incorporating class membership probabilities and alternative maps to capture spatially explicit uncertainty, especially when spatial variability is high or key plant communities are involved. The presented approach is particularly beneficial for upscaling ecological processes, such as carbon fluxes, where spatial variability is driven by plant community distribution and where informed decision-making requires detailed spatial assessments.
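The secondary map described above takes the second most likely class per pixel from the classifier's class membership probabilities; a minimal numpy sketch, assuming a probability cube of shape (rows, cols, n_classes) such as scikit-learn's `predict_proba` output reshaped to the image grid (array values here are toy numbers, not the study's data):

```python
import numpy as np

# Toy probability cube: 2 x 2 pixels, 3 classes (each pixel sums to 1).
proba = np.array([[[0.60, 0.30, 0.10], [0.20, 0.50, 0.30]],
                  [[0.10, 0.20, 0.70], [0.45, 0.35, 0.20]]])

# Class indices sorted by descending probability along the last axis.
order = np.argsort(proba, axis=-1)[..., ::-1]

primary = order[..., 0]    # most likely class label per pixel
secondary = order[..., 1]  # second most likely class label per pixel

# The margin between the two top probabilities flags uncertain pixels:
top_p = np.take_along_axis(proba, order[..., :2], axis=-1)
margin = top_p[..., 0] - top_p[..., 1]
```

Pixels with a small margin are the ones where swapping the primary label for the secondary one (as in the alternative map) is most defensible.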
Consistent and scalable monitoring of birds and habitats along a coffee production intensity gradient
Marius Somveille, Joe Grainger-Hull, Nicole Ferguson, Sarab S. Sethi, Fernando González-García, Valentine Chassagnon, Cansu Oktem, Mathias Disney, Gustavo López Bautista, John Vandermeer, Ivette Perfecto
Remote Sensing in Ecology and Conservation | DOI: 10.1002/rse2.70015 | Published 2025-06-21

Abstract: Land use change associated with agricultural intensification is a leading driver of biodiversity loss in the tropics. Birds are commonly used as indicators to evaluate the habitat–biodiversity relationship in production systems of tropical agricultural commodities. However, a consistent, reliable and scalable methodological approach for monitoring tropical avian communities and habitat quality is largely lacking. In this study, we examined whether automated analysis of audio data collected by passive acoustic monitoring, together with analysis of remote sensing data, can be used to efficiently monitor avian biodiversity along the gradient of habitat degradation associated with the intensification of coffee production. Coffee is an important crop produced in tropical forested regions, its production is expanding and intensifying, and coffee production systems form a gradient of ecological complexity ranging from forest-like shaded polyculture to dense sun-exposed monoculture. We used LiDAR to survey the habitat, together with autonomous recording units and a vocalization classifier to assess bird community composition, in a coffee landscape in southern Mexico comprising a shade-grown coffee farm, a sun coffee farm and a forest remnant. We found that LiDAR can capture relevant variation in vegetation across the habitat gradient in coffee systems, specifically matching the generally observed pattern that intensification of coffee production is associated with a decrease in vegetation density and complexity. We also found that bioacoustics can capture known functional signatures of avian communities across this habitat degradation gradient. Thus, these technologies can be used in a robust way to monitor how biodiversity responds to land use intensification in the tropics. A major advantage of this approach is its potential to be deployed cost-effectively at large scales to help design and certify biodiversity-friendly productive landscapes.
Eigenfeature-enhanced deep learning: advancing tree species classification in mixed conifer forests with lidar
Ryan C. Blackburn, Robert Buscaglia, Andrew J. Sánchez Meador, Margaret M. Moore, Temuulen Sankey, Steven E. Sesnie
Remote Sensing in Ecology and Conservation | DOI: 10.1002/rse2.70014 | Published 2025-06-09

Abstract: Accurately classifying tree species using remotely sensed data remains a significant challenge, yet it is essential for forest monitoring and understanding ecosystem dynamics over large spatial extents. While light detection and ranging (lidar) has shown promise for species classification, its accuracy typically decreases in complex forests or with lower lidar point densities. Recent advancements in lidar processing and machine learning offer new opportunities to leverage previously unavailable structural information. In this study, we present an automated machine learning pipeline that reduces practitioner burden by combining canonical deep learning with improved input layers derived from eigenfeatures. These eigenfeatures were used as inputs to a 2D convolutional neural network (CNN) to classify seven tree species in the Mogollon Rim Ranger District of the Coconino National Forest, Arizona, USA. We compared eigenfeature images derived from unoccupied aerial vehicle laser scanning (UAV-LS) and airborne laser scanning (ALS) individual tree segmentation algorithms against raw intensity and colorless control images. Remarkably, mean overall accuracies for classifying seven species reached 94.8% for ALS and 93.4% for UAV-LS. White (colorless control) image types underperformed for both ALS and UAV-LS compared to eigenfeature images, while ALS and UAV-LS image types showed marginal differences in model performance. These results demonstrate that lower point density ALS data can achieve high classification accuracy when paired with eigenfeatures in an automated pipeline. This study advances the field by addressing species classification at scales ranging from individual trees to landscapes, offering a scalable and efficient approach for understanding tree composition in complex forests.
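Eigenfeatures of the kind used as CNN inputs above are typically derived from the eigenvalues of the local 3D covariance of lidar points; the paper's exact derivation may differ, but a minimal sketch of common eigenvalue features (linearity, planarity, sphericity) for one point neighbourhood looks like this:

```python
import numpy as np

def eigenfeatures(points: np.ndarray) -> dict:
    """Eigenvalue-based shape features for an (n, 3) lidar neighbourhood.
    Eigenvalues l1 >= l2 >= l3 of the 3x3 covariance matrix summarize
    local geometry: stems are linear, canopy surfaces planar, foliage
    clumps roughly spherical."""
    cov = np.cov(points.T)                              # 3 x 3 covariance
    l1, l2, l3 = sorted(np.linalg.eigvalsh(cov), reverse=True)
    return {
        "linearity":  (l1 - l2) / l1,
        "planarity":  (l2 - l3) / l1,
        "sphericity": l3 / l1,
    }

# Synthetic check: points scattered along a line score high on linearity.
rng = np.random.default_rng(0)
t = rng.uniform(0, 10, 200)
noise = 0.01 * rng.standard_normal((200, 2))
line = np.column_stack([t, noise[:, 0], noise[:, 1]])
feats = eigenfeatures(line)
```

Rasterizing such per-point features per tree crown is one way to obtain multi-channel "eigenfeature images" of the sort fed to the 2D CNN.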
Hyperspectral imagery, LiDAR point clouds, and environmental DNA to assess land-water linkage of biodiversity across aquatic functional feeding groups
Heng Zhang, Carmen Meiller, Andreas Hueni, Rosetta C. Blackman, Felix Morsdorf, Isabelle S. Helfenstein, Michael E. Schaepman, Florian Altermatt
Remote Sensing in Ecology and Conservation | DOI: 10.1002/rse2.70010 | Published 2025-06-02

Abstract: Different organismal functional feeding groups (FFGs) are key components of aquatic food webs and are important for sustaining ecosystem functioning in riverine ecosystems. Their distribution and diversity are tightly associated with the surrounding terrestrial landscape through land-water linkages. Nevertheless, the spatial extent and magnitude of these cross-ecosystem linkages within major FFGs remain unclear. Here, we conducted an airborne imaging spectroscopy campaign and systematic environmental DNA (eDNA) field sampling of river water in a 740 km² mountainous catchment, combined with light detection and ranging (LiDAR) point clouds, to obtain the spectral and morphological diversity of the terrestrial landscape and the diversity of major FFGs in rivers. We identified the scale of these linkages, ranging from a few hundred meters to more than 10 km: collectors and filterers, shredders, and small invertebrate predators showed local-scale associations, while invertebrate-eating fish, grazers, and scrapers showed more landscape-scale associations. Among all major FFGs, shredders, grazers, and scrapers in the streams had the strongest association with surrounding terrestrial vegetation. Our research reveals the reference spatial scales at which major FFGs are linked to the surrounding terrestrial landscape, providing spatially explicit evidence of the cross-ecosystem linkages needed for conservation design and management.
Hyperspectral imaging has a limited ability to remotely sense the onset of beech bark disease
Guillaume Tougas, Christine I. B. Wallis, Etienne Laliberté, Mark Vellend
Remote Sensing in Ecology and Conservation | DOI: 10.1002/rse2.70013 | Published 2025-05-30

Abstract: Insect and pathogen outbreaks have a major impact on northern forest ecosystems. Even for pathogens that have been present in a region for decades, such as beech bark disease (BBD), new waves of tree mortality are expected. Hence, there is a need for innovative approaches to monitor disease advancement in real time. Here, we test whether airborne hyperspectral imaging – involving data from 344 wavelengths in the visible, near infrared (NIR) and short-wave infrared (SWIR) – can be used to assess beech bark disease severity in southern Quebec, Canada. Field data on disease severity were linked to airborne hyperspectral data for individual beech crowns. Partial least-squares regression (PLSR) models using airborne imaging spectroscopy data predicted a small proportion of the variance in disease severity: the best model had an R² of only 0.09. The wavelengths with the strongest contributions were from the red-edge region (~715 nm) and the SWIR (~1287 nm), which may suggest mediation by canopy greenness, water content and canopy architecture. Similar models using hyperspectral data taken directly on individual leaves had no explanatory power (R² = 0). In addition, the airborne and leaf-level hyperspectral datasets were uncorrelated. The failure of the leaf-level models suggests that canopy structure was likely responsible for the limited predictive ability of the airborne model. Somewhat better performance in predicting disease severity was obtained using common band ratios for canopy greenness assessment (e.g., the Green Normalized Difference Vegetation Index, gNDVI, and the Normalized Phaeophytinization Index, NPQI); these variables explained up to 19% of the variation in disease severity. Overall, we argue that the complexity of hyperspectral data is not necessary for assessing BBD spread, and that spectral data in general may not provide an efficient means of improving BBD monitoring at larger scales.
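The band ratios named above are simple normalized differences; a minimal sketch using the standard formulas for gNDVI (NIR vs. green reflectance) and NPQI (reflectance at 415 and 435 nm) — the reflectance values below are illustrative, not measurements from the study:

```python
def gndvi(nir: float, green: float) -> float:
    """Green Normalized Difference Vegetation Index."""
    return (nir - green) / (nir + green)

def npqi(r415: float, r435: float) -> float:
    """Normalized Phaeophytinization Index (chlorophyll degradation)."""
    return (r415 - r435) / (r415 + r435)

# Illustrative reflectances: a vigorous crown reflects strongly in the
# NIR and weakly in the green, pushing gNDVI toward 1.
healthy = gndvi(nir=0.45, green=0.08)
stressed = gndvi(nir=0.30, green=0.10)
```

Both indices are bounded in [-1, 1] by construction, which makes them comparable across crowns regardless of overall brightness.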
Increasing citizen scientist accuracy with artificial intelligence on UK camera-trap data
C. R. Sharpe, R. A. Hill, H. M. Chappell, S. E. Green, K. Holden, P. Fergus, C. Chalmers, P. A. Stephens
Remote Sensing in Ecology and Conservation | DOI: 10.1002/rse2.70012 | Published 2025-05-19

Abstract: As camera traps have become more widely used, extracting information from images at the pace they are acquired has become challenging, resulting in backlogs that delay the communication of results and the use of data for conservation and management. To ameliorate this, artificial intelligence (AI), crowdsourcing to citizen scientists and combined approaches have emerged as solutions. Using data from the UK mammal monitoring initiative MammalWeb, we assessed the accuracies of classifications from registered citizen scientists, anonymous participants and a convolutional neural network (CNN). The engagement of anonymous volunteers was facilitated by the strategic placement of MammalWeb interfaces in a natural history museum with high footfall related to the 'Dippy on Tour' exhibition. The accuracy of anonymous volunteer classifications gathered through public interfaces has not been reported previously, and here we consider this form of citizen science in the context of alternative forms of data acquisition. While AI models have performed well at species identification in bespoke settings, here we report model performance on a dataset on which the model in question was not explicitly trained. We also combine AI output with that of human volunteers to demonstrate combined workflows that produce high-accuracy predictions. We find that the consensus of registered users has greater overall accuracy (97%) than the consensus of anonymous contributors (71%), with AI accuracy in between (78%). A combined approach between registered citizen scientists and AI output provides an overall accuracy of 96%. Further, when the contributions of anonymous citizen scientists are concordant with AI output, 98% accuracy can be achieved. The generality of this last finding merits further investigation, given the potential to gather classifications much more rapidly if public displays are placed in areas of high footfall. We suggest that combined approaches to image classification are optimal when the minimisation of classification errors is desired.
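One way to realise the concordance rule described above is to accept a label only when the human and AI classifications agree, deferring disagreements for further review; a minimal sketch (the function and labels are illustrative, not MammalWeb's actual pipeline):

```python
def combine(human_label: str, ai_label: str) -> tuple:
    """Return (label, accepted). Accept the label only when the human
    and AI sources agree; otherwise defer to expert review, trading
    coverage for the higher accuracy of concordant classifications."""
    if human_label == ai_label:
        return human_label, True
    return None, False

accepted_case = combine("roe deer", "roe deer")   # agreement -> accepted
deferred_case = combine("badger", "fox")          # disagreement -> review
```

The trade-off is explicit: concordant pairs reach very high accuracy (98% in the study), but discordant images still need a registered user or expert to resolve them.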
Night lights from space: potential of SDGSAT-1 for ecological applications
Dominique Weber, Janine Bolliger, Klaus Ecker, Claude Fischer, Christian Ginzler, Martin M. Gossner, Laurent Huber, Martin K. Obrist, Florian Zellweger, Noam Levin
Remote Sensing in Ecology and Conservation | DOI: 10.1002/rse2.70011 | Published 2025-05-16

Abstract: Light pollution affects biodiversity at all levels, from genes to ecosystems, and improved monitoring and research are needed to better assess its various ecological impacts. Here, we review the current contribution of night-time satellites to ecological applications and elaborate on the potential value of the Glimmer sensor onboard the Chinese Sustainable Development Goals Science Satellite 1 (SDGSAT-1), a novel medium-resolution, multispectral sensor, for quantifying artificial light at night (ALAN). Due to their coarse spatial, spectral or temporal resolution, most currently used space-borne sensors are limited in their contribution to assessments of light pollution at multiple scales and of the ecological and conservation-relevant effects of ALAN. SDGSAT-1 now offers new opportunities to map variability in light intensity and spectra at finer spatial resolution, providing the means to disentangle and characterize different sources of ALAN, and to relate ALAN to local environmental parameters, in situ measurements and surveys. Monitoring direct light emissions at 10–40 m spatial resolution enables scientists to better understand the origins and impacts of light pollution on sensitive species and ecosystems, and assists practitioners in implementing local conservation measures. We demonstrate key ecological applications of SDGSAT-1, such as quantifying the exposure of protected areas to light pollution, assessing wildlife corridors and dark refuges in urban areas, and modelling the visibility of light sources to animals. We conclude that SDGSAT-1, and possibly similar future satellite missions, will significantly advance ecological light pollution research, improving understanding of the environmental impacts of light pollution and informing strategies to mitigate them.
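Quantifying protected-area exposure as described above reduces to zonal statistics over a night-time radiance raster; a minimal numpy sketch with a toy radiance grid and protected-area mask (a real workflow would read SDGSAT-1 Glimmer imagery and protected-area polygons with libraries such as rasterio and geopandas; the values and the 0.15 threshold here are illustrative):

```python
import numpy as np

# Toy night-time radiance grid (arbitrary units) and a boolean mask
# marking the pixels that fall inside a protected area.
radiance = np.array([[0.0, 0.2, 5.0],
                     [0.1, 3.5, 8.0],
                     [0.0, 0.0, 0.3]])
protected = np.array([[True,  True,  False],
                      [True,  False, False],
                      [True,  True,  False]])

# Exposure summaries for the protected area:
mean_radiance = radiance[protected].mean()
lit_fraction = (radiance[protected] > 0.15).mean()  # share of lit pixels
```

Repeating these summaries per protected-area polygon yields a ranking of sites by light-pollution exposure, the kind of product the review proposes for conservation planning.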
A scalable transfer learning workflow for extracting biological and behavioural insights from forest elephant vocalizations
Alastair Pickering, Santiago Martinez Balvanera, Kate E. Jones, Daniela Hedwig
Remote Sensing in Ecology and Conservation | DOI: 10.1002/rse2.70008 | Published 2025-04-25

Abstract: Animal vocalizations encode rich biological information – such as age, sex, behavioural context and emotional state – making bioacoustic analysis a promising non-invasive method for assessing welfare and population demography. However, traditional bioacoustic approaches, which rely on manually defined acoustic features, are time-consuming, require specialized expertise and may introduce subjective bias. These constraints reduce the feasibility of analysing the increasingly large datasets generated by passive acoustic monitoring (PAM). Transfer learning with convolutional neural networks (CNNs) offers a scalable alternative by enabling automatic acoustic feature extraction without predefined criteria. Here, we applied four pre-trained CNNs – two general-purpose models (VGGish and YAMNet) and two avian bioacoustic models (Perch and BirdNET) – to African forest elephant (Loxodonta cyclotis) recordings. We used a dimensionality reduction algorithm (UMAP) to represent the extracted acoustic features in two dimensions and evaluated these representations across three key tasks: (1) call-type classification (rumble, roar and trumpet), (2) rumble sub-type identification and (3) behavioural and demographic analysis. A Random Forest classifier trained on these features achieved near-perfect accuracy for rumbles, with Perch attaining the highest average accuracy (0.85) across all call types. Clustering the reduced features identified biologically meaningful rumble sub-types – such as adult female calls linked to logistics – and provided clearer groupings than manual classification. Statistical analyses showed that factors including age and behavioural context significantly influenced call variation (P < 0.001), with additional comparisons revealing clear differences among contexts (e.g. nursing, competition, separation), sexes and multiple age classes. Perch and BirdNET consistently outperformed the general-purpose models when dealing with complex or ambiguous calls. These findings demonstrate that transfer learning enables scalable, reproducible bioacoustic workflows capable of detecting biologically meaningful acoustic variation. Integrating this approach into PAM pipelines can enhance the non-invasive assessment of population dynamics, behaviour and welfare in acoustically active species.
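The workflow above (pre-trained CNN embeddings, then dimensionality reduction, then a Random Forest) can be sketched end to end; to keep this sketch dependency-light, synthetic Gaussian clusters stand in for the CNN embeddings and scikit-learn's PCA stands in for UMAP (the paper uses real VGGish/YAMNet/Perch/BirdNET features and the umap-learn reducer):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic 'embeddings': three call types as well-separated Gaussian
# clusters in 128-D, standing in for pre-trained CNN feature vectors.
n_per_class, dim = 100, 128
X = np.vstack([rng.normal(loc=c * 3.0, scale=1.0, size=(n_per_class, dim))
               for c in range(3)])
y = np.repeat(["rumble", "roar", "trumpet"], n_per_class)

# Reduce to 2-D (PCA here as a stand-in for UMAP), then classify.
X2 = PCA(n_components=2).fit_transform(X)
Xtr, Xte, ytr, yte = train_test_split(X2, y, test_size=0.3,
                                      random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
accuracy = clf.score(Xte, yte)
```

The appeal of the design is that no acoustic feature is hand-defined: the pre-trained network supplies the representation, and only the lightweight reducer and classifier are fit to the elephant data.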
Advancing the mapping of vegetation structure in savannas using Sentinel-1 imagery
Vera Thijssen, Marianthi Tangili, Ruth A. Howison, Han Olff
Remote Sensing in Ecology and Conservation | DOI: 10.1002/rse2.70006 | Published 2025-04-22

Abstract: Vegetation structure monitoring is important for the understanding and conservation of savanna ecosystems. Optical satellite imagery can be used to estimate canopy cover, but it provides limited information about the structure of savannas and is restricted to daytime, clear-sky captures. Active remote sensing can potentially overcome this. We explore the utility of C-band synthetic aperture radar imagery for mapping both grassland and woody vegetation structure in savannas. We calibrated Sentinel-1 VH and VV backscatter coefficients and their ratio to ground-based estimates of grass biomass, woody canopy volume (<50 000 m³/ha) and tree basal area (<15 m²/ha) in the Greater Serengeti-Mara Ecosystem, and simultaneously explored their sensitivity to soil moisture. We show that one of these variables in particular can be used to estimate grass biomass (R² = 0.54, RMSE = 630 kg/ha, 20.6% of the observed range), woody canopy volume (R² = 0.69, RMSE = 4188 m³/ha, 11.8% of range) and tree basal area (R² = 0.44, RMSE = 2.03 m²/ha, 18.6% of range) in the dry season, allowing extrapolation to regional-scale vegetation structure maps. We also introduce new proxies for soil moisture as an option for extending this approach to the wet season, using 90-day preceding bounded running averages of the Climate Hazards Group InfraRed Precipitation with Station data (CHIRPS) and the Integrated Multi-satellitE Retrievals for Global Precipitation Measurement (IMERG) datasets. We discuss the potential of Sentinel-1 imagery for a better understanding of the spatio-temporal dynamics of vegetation structure in savannas.
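The 90-day preceding running average used above as a soil-moisture proxy can be sketched as a trailing-window mean over a daily rainfall series; "bounded" is taken here to mean the window is truncated at the start of the record rather than padded, which is an assumption about the authors' exact definition:

```python
import numpy as np

def trailing_mean(rain: np.ndarray, window: int = 90) -> np.ndarray:
    """Trailing (preceding) running mean of daily rainfall, with the
    window truncated at the start of the series ('bounded')."""
    out = np.empty(len(rain), dtype=float)
    for i in range(len(rain)):
        start = max(0, i - window + 1)
        out[i] = rain[start:i + 1].mean()
    return out

# Toy daily rainfall (mm): a dry spell followed by a wet spell.
rain = np.array([0.0] * 100 + [10.0] * 50)
proxy = trailing_mean(rain, window=90)
```

Because the window looks only backwards in time, the proxy lags the onset of rain, which is the desired behaviour for a slowly responding quantity like soil moisture.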