Breaking down seagrass fragmentation in a marine heatwave impacted World Heritage Area
Michael D. Taylor, Simone Strydom, Matthew W. Fraser, Ana M. M. Sequeira, Gary A. Kendrick
Remote Sensing in Ecology and Conservation, published 2025-10-05. DOI: 10.1002/rse2.70032

Abstract: Marine heatwaves, and other extreme climatic events, are driving mass mortality of habitat-forming species and substantial ecological change worldwide. However, habitat fragmentation is rarely considered, despite its role in structuring seascapes and its potential to exacerbate the negative impacts of habitat loss. Here, we quantify fragmentation of globally significant seagrass meadows within the Shark Bay World Heritage Area before and after an unprecedented marine heatwave that impacted the Western Australian coastline over the austral summer of 2010/11. We use a spatial pattern index to quantify seagrass fragmentation from satellite-derived habitat maps (2002, 2010, 2014 and 2016), assess potential predictors of fragmentation and investigate seascape dynamics defined by relationships between seagrass fragmentation and cover change. Our spatiotemporal analysis illustrates widespread fragmentation of seagrass following the marine heatwave, contributing to a dramatic alteration of seascape structure across the World Heritage Area. Fragmentation immediately following the marine heatwave coincided with widespread seagrass loss and was best explained by interactions between a heat stress metric (degree heating weeks) and depth. Based on the relationship between fragmentation and seagrass cover change, we show that the near-ubiquitous fragmentation from 2014 to 2016 represents a mixture of long-term seagrass degradation and evidence of early, patchy recovery. Fragmentation effects are expected to compound the ecological impacts of seagrass mortality following the marine heatwave and to prolong recovery. As sea temperatures and the threat of marine heatwaves continue to rise globally, our results highlight the importance of considering fragmentation effects alongside the negative impacts of habitat loss. Our seascape dynamics framework provides a novel approach to defining the response of habitat-forming species to disturbances, including marine heatwaves, that integrates the processes of fragmentation and cover change. This framework provides the opportunity to consider these important processes across a range of threatened ecosystems and to identify areas of vulnerability, stability and recovery.

Spectral characterization of plant diversity in a biodiversity-enriched oil palm plantation
Vannesa Montoya-Sánchez, Anna K. Schweiger, Michael Schlund, Gustavo Brant Paterno, Stefan Erasmi, Holger Kreft, Dirk Hölscher, Fabian Brambach, Bambang Irawan, Leti Sundawati, Delphine Clara Zemp
Remote Sensing in Ecology and Conservation, published 2025-09-29. DOI: 10.1002/rse2.70034

Abstract: Assessing plant diversity using remote sensing, including airborne imaging spectroscopy, shows promise for large-scale biodiversity monitoring in landscape restoration and conservation. Enriching plantations with native trees is a key restoration strategy to enhance biodiversity and ecosystem functions in agricultural lands. In this study, we tested how well imaging spectroscopy characterizes plant diversity in 37 experimental plots of varying sizes and planted diversity levels in a biodiversity-enriched oil palm plantation in Sumatra, Indonesia. Six years after establishing the plots, we acquired airborne imaging spectroscopy data comprising 160 spectral bands (400–1000 nm, at ~3.7 nm bandwidth) at 0.3 m spatial resolution. We calculated spectral diversity as the variance among image pixels and partitioned spectral diversity into alpha and beta diversity components. After controlling for differences in sampling area through rarefaction, we found no significant relationship between spectral and plant alpha diversity. Further, the relationships between the local contribution of spectral beta diversity and plant beta diversity revealed no significant trends. Spectral variability within plots was substantially higher than among plots (spectral alpha diversity ~82–87%, spectral beta diversity ~11–18%). These discrepancies are likely due to the structural dominance of oil palm crowns, which absorbed most of the light, while most of the plant diversity occurring below the oil palm canopy was not detectable by airborne spectroscopy. Our study highlights that remote sensing of plant diversity in ecosystems with strong vertical stratification and high understory diversity, such as agroforests, would benefit from combining data from passive sensors with data from active sensors, such as LiDAR, to capture structural diversity.

UAVs unveil the role of small scale vegetation structure on wader nest survival
Miguel Silva-Monteiro, Miguel Villoslada, Thaisa Bergamo, Triin Kaasiku, Camilo Carneiro, David Kleijn
Remote Sensing in Ecology and Conservation, published 2025-09-27. DOI: 10.1002/rse2.70033

Abstract: Several ground-nesting wader species rely on Baltic coastal meadows for breeding. A drastic reduction in the area of this habitat at the end of the 20th century has been followed by habitat restoration activities over the last 20 years. However, wader populations are not responding as hoped to the current conservation effort. Identifying which grassland characteristics are essential for waders when selecting their nesting location, and which enhance clutch survival probability, is therefore vital to implementing efficient conservation plans. Yet many vegetation structural characteristics, such as sward height or heterogeneity, are logistically complex to measure with traditional methods over relatively large areas, especially at the high resolution needed. Here, we assessed several sward characteristics together with other key landscape features by combining very high-resolution images from unmanned aerial vehicle (UAV) surveys with nest survival monitoring in five key Estonian coastal grasslands for waders. We found that the four main wader species, Northern Lapwing (Vanellus vanellus), Common Redshank (Tringa totanus), Common Ringed Plover (Charadrius hiaticula) and Baltic Dunlin (Calidris alpina schinzii), do not significantly differ in their nest-site selection in terms of vegetation height, growth rates or sward heterogeneity. However, vegetation sward height and heterogeneity surrounding nest sites within a 2 m buffer increased daily nest survival probability from 0.883 to 0.979 along the observed gradients, while the distance between the nest location and flooded areas (≥20 m²) was negatively correlated with survival; all variables affected the wader community similarly. Our results signal the need for a higher diversity of sward structures and underline the importance of constantly flooded areas in Estonian coastal meadows. Moreover, our study highlights the value of integrating UAV remote sensing techniques into animal conservation research to unveil ecological patterns that may remain hidden using more traditional methods.

HOWLish: a CNN for automated wolf howl detection
Rafael Campos, Miha Krofel, Helena Rio-Maior, Francesco Renna
Remote Sensing in Ecology and Conservation, published 2025-09-23. DOI: 10.1002/rse2.70024

Abstract: Automated sound-event detection is crucial for large-scale passive acoustic monitoring of wildlife, but ready-to-use tools exist for only a narrow range of taxa. Machine learning is currently the state-of-the-art framework for developing sound-event detection tools tailored to specific wildlife calls. Gray wolves (Canis lupus), a species with complex management needs, howl spontaneously for long-distance intra- and inter-pack communication, which makes them a prime target for passive acoustic monitoring. Yet there is currently no pre-trained, open-access tool for reliable automated detection of wolf howls in recorded soundscapes. We collected 50,137 h of soundscape data, in which we manually labeled 841 unique howling events. We used this dataset to fine-tune VGGish, a convolutional neural network trained for audio classification, effectively retraining it for wolf howl detection. HOWLish correctly classified 77% of the wolf howling examples in our test set, with a false positive rate of 1.74%; still, precision was low (0.006) given the extreme class imbalance (7124:1). During field tests, HOWLish retrieved 81.3% of the observed howling events while offering a 15-fold reduction in operator time compared to fully manual detection. This work establishes the baseline for open-access automated wolf howl detection. HOWLish facilitates remote sensing of wild wolf populations, offering new opportunities in non-invasive large-scale monitoring and communication research of wolves. The knowledge gap addressed here spans many soniferous taxa, to which our approach can also be extended.

DuckNet: an open-source deep learning tool for waterfowl species identification in UAV imagery
Zack Loken, Kevin M. Ringelman, Anne Mini, J. Dale James, Mike Mitchell
Remote Sensing in Ecology and Conservation, published 2025-09-18. DOI: 10.1002/rse2.70028

Abstract: Understanding how waterfowl respond to habitat restoration and management activities is crucial for evaluating and refining conservation delivery programs. However, site-specific waterfowl monitoring is challenging, especially in heavily forested systems such as the Mississippi Alluvial Valley (MAV), a primary wintering region for waterfowl in North America. We hypothesized that uncrewed aerial vehicles (UAVs) coupled with deep learning-based object detection would provide an efficient and effective means of surveying non-breeding waterfowl on difficult-to-access restored wetland sites. Accordingly, during the winters of 2021 and 2022, we surveyed wetland restoration easements in the MAV using a UAV equipped with a dual thermal-RGB high-resolution sensor, collecting 2360 digital images of non-breeding waterfowl. We then developed, optimized and trained a RetinaNet object detection model with a ResNet-50 backbone to locate and identify drakes and hens of seven waterfowl species and one waterbird species in the UAV imagery. The final model achieved an average precision of 88.1% (class range 68.8–99.6%) and an average recall of 89.0% (class range 70.0–100%) at an intersection-over-union of 0.5. This study demonstrates successful surveying of non-breeding waterfowl in structurally complex, difficult-to-access habitats using UAVs and provides a functional, open-source, deep learning-based object detection framework (DuckNet) for automated detection of waterfowl in UAV imagery. DuckNet offers a user-friendly interface for running inference on custom images using the model developed here and additionally allows users to fine-tune the model on custom datasets to expand the number of species classes it can detect. This framework provides managers with an efficient and cost-effective means of counting waterfowl on project sites, improving their capacity to evaluate waterfowl responses to wetland restoration efforts.

Investigating boreal forest successional stages in Alaska and Northwest Canada using UAV-LiDAR and RGB and a community detection network
Léa Enguehard, Birgit Heim, Ulrike Herzschuh, Viktor Dinkel, Glenn Juday, Santosh Panda, Nicola Falco, Jacob Schladebach, Jakob Broers, Stefan Kruse
Remote Sensing in Ecology and Conservation, published 2025-09-09. DOI: 10.1002/rse2.70029

Abstract: Boreal forests are a key component of the global carbon cycle, forming North America's most extensive biome. Different successional stages in boreal forests have varying levels of ecological value and biodiversity, which in turn affect their functions. A knowledge gap remains concerning the present successional stages, their geographic patterns and possible successions. This study develops a novel application of UAV-LiDAR and Red Green Blue (RGB) data and network analysis to enhance our understanding of boreal forest succession. Between 2022 and 2024, we collected UAV-LiDAR and RGB data from 48 forested sites in Alaska and Northwest Canada to (i) identify present successional stages and (ii) deepen our understanding of successional trajectories. We first applied UAV-derived spectral and structural tree attributes to classify individual trees into plant functional types representative of boreal forest succession, namely evergreen and deciduous. Second, we built a forest-patch network to characterize successional stages and their interactions and assessed future stage transitions. Finally, we applied a simplified forward model to predict future dynamics and highlight different successional trajectories. Our results indicate that tree height and spectral variables are the most influential predictors of plant functional type in random forest algorithms, and high overall accuracies were attained. The network-based community detection algorithm reveals five interconnected successional stages, interpretable as ranging from early to late successional plus a disturbed stage. We find that disturbed sites are mainly located in Interior and Southcentral Alaska, while late successional sites predominate among the southern Canadian sites. Transitional stages are mainly located near the tundra–taiga boundary. These findings highlight the critical role of disturbances, such as fire or insect outbreaks, in shaping forest succession in Alaska and Northwest Canada.

Camera traps and deep learning enable efficient large-scale density estimation of wildlife in temperate forest ecosystems
Maik Henrich, Christian Fiderer, Alisa Klamm, Anja Schneider, Axel Ballmann, Jürgen Stein, Raffael Kratzer, Rudolf Reiner, Sina Greiner, Sönke Twietmeyer, Tobias Rönitz, Volker Spicher, Simon Chamaillé-Jammes, Vincent Miele, Gaspard Dussert, Marco Heurich
Remote Sensing in Ecology and Conservation, published 2025-09-09. DOI: 10.1002/rse2.70030

Abstract: Automated detectors such as camera traps allow the efficient collection of large amounts of data for monitoring animal populations, but data processing and classification are a major bottleneck. Deep learning algorithms have gained increasing attention in this context, as they have the potential to dramatically decrease the time and effort required to obtain population density estimates. However, the robustness of such an approach has not yet been evaluated across a wide range of species and study areas. This study evaluated the application of DeepFaune, an open-source deep learning algorithm for the classification of European animal species, together with camera trap distance sampling (CTDS), to a year-round dataset containing 895,019 manually classified photos from 10 protected areas across Germany. For all wild animal species and higher taxonomic groups on which DeepFaune was trained, the algorithm achieved an overall accuracy of 90%. The 95% confidence interval (CI) of the difference between CTDS estimates based on manual and automated image classification contained zero for all but two species-season combinations with a minimum sample size of 20 independent observations per study area. Meta-regression revealed an average difference between the classification methods of −0.005 (95% CI: −0.205 to 0.196) animals/km². Classification success correlated with the divergence of the population density estimates, but false negative and false positive detections had complex effects on the density estimates via different CTDS parameters. Metrics of classification performance alone are therefore insufficient to assess the effect of deep learning classifiers on population density estimation, which should instead be followed through entirely for proper validation. In general, however, our results demonstrate that readily available deep learning algorithms can be used in largely unsupervised workflows for estimating population densities from camera trap data.

Inferring camera trap detection zones for rare species using species- and camera-specific traits: a meta-level analysis
Johannes N. Wiegers, Kathryn E. Barry, Marijke van Kuijk
Remote Sensing in Ecology and Conservation, published 2025-09-05. DOI: 10.1002/rse2.70027

Abstract: Camera trapping is a vital tool for wildlife monitoring. Accurately estimating a camera's detection zone, the area where animals are detected, is essential, particularly for calculating population densities of unmarked species. However, obtaining enough detection events to estimate detection zones accurately remains difficult, particularly for rare species. Given that detection zones are influenced by species- and camera-specific traits, it may be possible to infer detection zones from these traits when data are scarce. We conducted a meta-level analysis to assess how the number of detection events, species traits and site-specific variables influence the estimation of the effective camera trap detection distance and angle. We reviewed published studies on detection zones, performed a power analysis to estimate the sample sizes required for accurate and precise estimates and used mixed-effects models to test whether detection zones can be predicted from biological and technical traits. Our results show that c. 50 detection events are needed to achieve error rates below 10%. The mixed-effects models explained 81% and 85% of the variation in effective detection distance and angle, respectively. Key predictors of detection distance included body mass, right-truncation distance and camera brand, while angle was predicted by camera brand and installation height. Importantly, we demonstrate that combining model-based predictions with limited empirical data (fewer than 25 detections) can reduce estimation error to below 15% for rare species. This study highlights that detection zones can be predicted not only within, but also across, studies using shared traits and that the right-truncation distance is a useful metric to account for habitat-specific visibility. These findings enhance the utility of detection zones in ecological studies and support better study design, especially for rare or understudied species.

Applying computer vision to accelerate monitoring and analysis of bird incubation behaviors: a case study using common eider nest camera footage
Lindsay Veazey, Christopher Latty, Zoey Chapman, Tuula E. Hollmen
Remote Sensing in Ecology and Conservation, published 2025-08-26. DOI: 10.1002/rse2.70022

Abstract: Advances in camera and data storage technology have revolutionized the ability of scientists to acquire large volumes of finely resolved wildlife monitoring data. This is especially valuable for breeding bird research, which often requires continuous nest monitoring extending a month or more. Though high-quality imagery may yield valuable insights, the sheer volume of data can create processing bottlenecks. Furthermore, achieving uniformity across projects and years is difficult given individual-level differences in data processing by manual reviewers. To address this problem, we paired a custom-trained You Only Look Once version 7 (YOLOv7) model with the StrongSORT tracking algorithm to analyze videos of nesting common eiders (Somateria mollissima) collected from barrier islands along the Beaufort Sea coast in Alaska. Our computer vision pipeline processed footage three times faster than manual review while matching human observer accuracy in recording nest attendance and disturbances. To evaluate the effectiveness of the trained pipeline, we analyzed novel footage from a different year. The automated part of the pipeline performed well when birds were relatively large in the frame. However, performance declined for birds occupying a small frame area, which occurred when the camera was farther from the nest and not zoomed. Birds that are small in the frame are more susceptible to being obscured by rain or fog on the lens, or by other birds positioned in front of them. Detecting them is also more challenging against complex backgrounds, particularly under difficult lighting (e.g., when the sun backlights the bird) or during specific behaviors, such as when birds hunker down to minimize their silhouette in response to perceived threats. To enhance performance, we recommend that researchers position cameras closer to nests whenever feasible or utilize zoom lenses. Importantly, our pipeline is designed to be species-agnostic, allowing easy adaptation to various nesting bird species.

Tourist sightings improve the precision of camera trap-derived density estimates using spatial capture-recapture models
Rachael S. Leeman, Robert S. Davis, Antonio Uzal, Heinrich Neumeyer, Rebecca A. Garbett, Joshua P. Twining, Richard W. Yarnell
Remote Sensing in Ecology and Conservation, published 2025-08-26. DOI: 10.1002/rse2.70025

Abstract: Spatial capture-recapture (SCR) provides the gold standard for robust population estimates where animals are individually identifiable. Sampling for large carnivores is often conducted over short timeframes to meet assumptions of population closure. As large carnivores are often elusive and found at low densities, surveys often result in low numbers of unique individuals captured and limited spatial recaptures, which can lead to convergence and parameter identifiability issues. In areas of high tourism footfall, additional spatial capture information can be provided by tourists. We supplemented individual encounter history data from a camera trap-based monitoring programme for leopards (Panthera pardus) with tourist sighting data within multi-session SCR models, and we evaluated the benefits of combining multiple data sources. Integrating tourist observations improved the precision of estimates (Half Relative Confidence Interval Width: Combined = 23.1%), resulting in an overall density estimate of 7.02 leopards per 100 km² (95% CI: 5.59–8.84 per 100 km²). Tourist-derived methods were 92.5% cheaper than camera trapping, highlighting the cost-efficiency of supplementing camera trap surveys with this source of data in areas with high tourism activity. This study demonstrates that combining structured survey data from camera traps with unstructured tourist-derived images improves resultant density estimates compared to using either method alone. Supplementing structured camera trapping data with tourist images in areas of high tourism activity can offer improvements in scalability by increasing spatial and temporal coverage of sampling, with limited additional costs and improved precision in density estimates. To further enhance the reliability of these methods, we provide recommendations for improving citizen science reporting for integration into SCR frameworks.