Title: Covariates influence optimal camera‐trap survey design for occupancy modelling
Authors: Owain Barton, Brian D. Gerber, Line S. Cordes, John R. Healey, Graeme Shannon
Journal: Remote Sensing in Ecology and Conservation | DOI: 10.1002/rse2.70031 | Published: 2025-10-17
Abstract: Motion‐activated cameras (‘camera‐traps’) have become indispensable for wildlife monitoring. Data from camera‐trap surveys can be used to make inferences about animal behaviour, space use and population dynamics. Occupancy modelling is a statistical framework commonly used to analyse camera‐trap data, which estimates species occurrence while accounting for imperfect detection. Including covariates in models enables the investigation of relationships between occupancy and the environment. Survey design studies help practitioners decide the number of cameras to deploy, deployment duration and camera positioning. However, existing assessments have generally assumed constant occupancy and detectability (i.e. no covariates were considered), which is unrealistic for most real‐world scenarios. We investigated the effects of covariates on the relationship between survey effort and the combination of accuracy and precision (i.e. error) of occupancy models. Camera‐trap data for a ‘virtual’ species were simulated as a function of randomly generated, site‐ and survey‐specific covariates (e.g. habitat type/quality and temperature, respectively). We then assessed how varying survey design and total effort influenced estimation error with and without covariate information. Increasing the number of cameras consistently reduced error, while longer deployments were only beneficial when the covariate influenced occupancy. When both parameters were affected by covariates, omitting effects on detectability had limited impact on model performance. However, failing to account for effects on occupancy significantly increased error, and none of the predefined thresholds (root mean squared error = 0.15, 0.10 and 0.075) were achievable, even with the maximum survey effort of 9000 camera‐days. These results suggest that increasing survey effort is unlikely to improve model performance unless site‐level conditions are appropriately modelled. Thus, robust study design should consider total effort and the monitoring of covariates across sites to ensure efficient use of time and financial resources.

Title: Assessing group size and the demographic composition of a canopy‐dwelling primate, the northern muriqui (Brachyteles hypoxanthus), using arboreal camera trapping and genetic tagging
Authors: Mariane C. Kaizer, Naiara G. Sales, Thiago H. G. Alvim, Karen B. Strier, Fabiano R. de Melo, Jean P. Boubli, Robert J. Young, Allan D. McDevitt
Journal: Remote Sensing in Ecology and Conservation | DOI: 10.1002/rse2.70035 | Published: 2025-10-17
Abstract: Obtaining accurate population measures of endangered species is critical for effective conservation and management actions and to evaluate their success over time. However, determining the population size and demographic composition of most canopy forest‐dwelling species has proven to be challenging. Here, we apply two non‐invasive biomonitoring methods, arboreal camera trapping and genetic tagging of fecal samples, to estimate the population size of a critically endangered primate, the northern muriqui (Brachyteles hypoxanthus), in the Caparaó National Park, Brazil. When comparing group sizes between camera trapping and genetic tagging, the genetic tagging survey estimated fewer individuals for one of the muriqui groups studied but showed slightly higher population size estimates for the other group. In terms of the cost‐efficiency of both methods, arboreal camera trapping had high initial costs but was more cost‐effective in the long term. Genetic tagging, on the other hand, did not require expensive equipment for data collection but had higher associated expenses for laboratory consumables and data processing. We recommend the use of both methods for northern muriqui monitoring and provide suggestions for improving the implementation of these non‐invasive methods for future routine monitoring. Our findings also highlight the potential of arboreal camera trapping and genetic tagging for other arboreal mammals in tropical forests.

{"title":"A UAV‐based deep learning pipeline for intertidal macrobenthos monitoring: Behavioral and age classification in Tachypleus tridentatus","authors":"Xiaohai Chen, Yuyuan Bao, Ziwei Ying, Mujiao Xie, Ting Li, Jixing Zou, Jun Shi, Xiaoyong Xie","doi":"10.1002/rse2.70036","DOIUrl":"https://doi.org/10.1002/rse2.70036","url":null,"abstract":"Intertidal macrobenthos are vital bioindicators of coastal ecosystem health due to their ecological roles, limited mobility, and sensitivity to environmental disturbances. However, traditional field‐based monitoring methods are time‐consuming, spatially restricted, and unsuitable for large‐scale ecological surveillance. Integrating unmanned aerial vehicles (UAVs) with deep learning offers a promising alternative for high‐resolution, cost‐effective monitoring. Yet, species‐specific object detection frameworks for mobile macrobenthic fauna remain underdeveloped. <jats:italic>Tachypleus tridentatus</jats:italic>, an endangered “living fossil” with over 430 million years of evolutionary history, serves as a flagship species for intertidal conservation due to its ecological significance and biomedical value. This study develops a customized deep learning pipeline for monitoring <jats:italic>T. tridentatus</jats:italic>, combining UAV‐based image acquisition, automated detection, and ecological trait inference. We constructed the first UAV‐derived dataset of juvenile <jats:italic>T. tridentatus</jats:italic> (<jats:italic>n</jats:italic> = 761) and implemented a convolutional autoencoder for unsupervised behavioral classification, achieving 96% accuracy in distinguishing buried from exposed individuals. A YOLO‐based detection model was optimized using lightweight pruning and a high–low frequency fusion module (HLFM), improving detection accuracy (mAP@50 increased by 1.74%) and computational efficiency. Additionally, we established robust regression models linking crawling trace width to prosomal width (<jats:italic>R</jats:italic><jats:sup>2</jats:sup> = 0.99) and prosomal width to instar stage (<jats:italic>R</jats:italic><jats:sup>2</jats:sup> = 0.91). The inferred instar stages showed no significant deviation across datasets, validating their use as indicators of age structure. By bridging species‐level detection with population‐level ecological inference, this study provides a scalable, field‐deployable framework for monitoring <jats:italic>T. tridentatus</jats:italic> and other intertidal macrobenthic taxa. The approach supports data‐driven conservation strategies and enhances our capacity to assess the status of endangered coastal species in complex intertidal environments.","PeriodicalId":21132,"journal":{"name":"Remote Sensing in Ecology and Conservation","volume":"117 1","pages":""},"PeriodicalIF":5.5,"publicationDate":"2025-10-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145277482","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"环境科学与生态学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Title: Breaking down seagrass fragmentation in a marine heatwave impacted World Heritage Area
Authors: Michael D. Taylor, Simone Strydom, Matthew W. Fraser, Ana M. M. Sequeira, Gary A. Kendrick
Journal: Remote Sensing in Ecology and Conservation | DOI: 10.1002/rse2.70032 | Published: 2025-10-05
Abstract: Marine heatwaves, and other extreme climatic events, are driving mass mortality of habitat‐forming species and substantial ecological change worldwide. However, habitat fragmentation is rarely considered despite its role in structuring seascapes and potential to exacerbate the negative impacts of habitat loss. Here, we quantify fragmentation of globally significant seagrass meadows within the Shark Bay World Heritage Area before and after an unprecedented marine heatwave impacting the Western Australian coastline over the austral summer of 2010/11. We use a spatial pattern index to quantify seagrass fragmentation from satellite‐derived habitat maps (2002, 2010, 2014 and 2016), assess potential predictors of fragmentation and investigate seascape dynamics defined by relationships between seagrass fragmentation and cover change. Our spatiotemporal analysis illustrates widespread fragmentation of seagrass following the marine heatwave, contributing to a dramatic alteration of seascape structure across the World Heritage Area. Fragmentation immediately following the marine heatwave coincided with widespread seagrass loss and was best explained by interactions between a heat stress metric (i.e. degree heating weeks) and depth. Based on the relationship between fragmentation and seagrass cover change, we revealed that near‐ubiquitous fragmentation from 2014 to 2016 represents a mixture of long‐term seagrass degradation and evidence of early, patchy recovery. Fragmentation effects are expected to compound the ecological impacts of seagrass mortality following the marine heatwave and prolong recovery. As sea temperatures and the threat of marine heatwaves continue to rise globally, our results highlight the importance of considering fragmentation effects alongside the negative impacts of habitat loss. Our seascape dynamic framework provides a novel approach to define the response of habitat‐forming species to disturbances, including marine heatwaves, that integrates the processes of fragmentation and cover change. This framework provides the opportunity to consider these important processes across a range of threatened ecosystems and identify areas of vulnerability, stability and recovery.

Title: Spectral characterization of plant diversity in a biodiversity‐enriched oil palm plantation
Authors: Vannesa Montoya‐Sánchez, Anna K. Schweiger, Michael Schlund, Gustavo Brant Paterno, Stefan Erasmi, Holger Kreft, Dirk Hölscher, Fabian Brambach, Bambang Irawan, Leti Sundawati, Delphine Clara Zemp
Journal: Remote Sensing in Ecology and Conservation | DOI: 10.1002/rse2.70034 | Published: 2025-09-29
Abstract: Assessing plant diversity using remote sensing, including airborne imaging spectroscopy, shows promise for large‐scale biodiversity monitoring in landscape restoration and conservation. Enriching plantations with native trees is a key restoration strategy to enhance biodiversity and ecosystem functions in agricultural lands. In this study, we tested how well imaging spectroscopy characterizes plant diversity in 37 experimental plots of varying sizes and planted diversity levels in a biodiversity‐enriched oil palm plantation in Sumatra, Indonesia. Six years after establishing the plots, we acquired airborne imaging spectroscopy data comprising 160 spectral bands (400–1000 nm, at ~3.7 nm bandwidth) at 0.3 m spatial resolution. We calculated spectral diversity as the variance among image pixels and partitioned spectral diversity into alpha and beta diversity components. After controlling for differences in sampling area through rarefaction, we found no significant relationship between spectral and plant alpha diversity. Further, the relationships between the local contribution of spectral beta diversity and plant beta diversity revealed no significant trends. Spectral variability within plots was substantially higher than among plots (spectral alpha diversity ~82%–87%, spectral beta diversity ~11%–18%). These discrepancies are likely due to the structural dominance of oil palm crowns, which absorbed most of the light, while most of the plant diversity occurring below the oil palm canopy was not detectable by airborne spectroscopy. Our study highlights that remote sensing of plant diversity in ecosystems with strong vertical stratification and high understory diversity, such as agroforests, would benefit from combining data from passive with data from active sensors, such as LiDAR, to capture structural diversity.

Title: UAVs unveil the role of small scale vegetation structure on wader nest survival
Authors: Miguel Silva‐Monteiro, Miguel Villoslada, Thaisa Bergamo, Triin Kaasiku, Camilo Carneiro, David Kleijn
Journal: Remote Sensing in Ecology and Conservation | DOI: 10.1002/rse2.70033 | Published: 2025-09-27
Abstract: Several ground‐nesting wader species rely on Baltic coastal meadows for breeding. A drastic reduction in the area of this habitat at the end of the 20th century has been followed by habitat restoration activities over the last 20 years. However, wader populations are not responding as hoped to the current conservation effort. Therefore, identifying which grassland characteristics are essential for waders to select their nesting location and which ones enhance their clutch survival probability is vital to implementing efficient conservation plans. However, many vegetation structural characteristics, such as sward height or heterogeneity, can be logistically complex to measure using traditional methods in relatively large areas, especially considering the highly accurate resolution needed. Here, we assessed several sward characteristics together with other key landscape features by combining very high‐resolution images from unmanned aerial vehicle (UAV) surveys with nest survival monitoring in five Estonian coastal grasslands of key importance for waders. We found that the main four wader species, Northern Lapwing (Vanellus vanellus), Common Redshank (Tringa totanus), Common Ringed Plover (Charadrius hiaticula) and the Baltic Dunlin (Calidris alpina schinzii), do not significantly differ in their nest‐site selection in terms of vegetation height, growth rates, or sward heterogeneity. Yet, we found that vegetation sward height and heterogeneity surrounding the nest sites within a 2‐meter buffer increased the daily nest survival probability from 0.883 to 0.979 along the gradients observed. Additionally, the distance between the nest location and flooded areas (≥20 m²) was negatively correlated with daily nest survival, and all variables affected the wader community similarly. Our results signal the need for a higher diversity of sward structures and the importance of constantly flooded areas in Estonian coastal meadows. Moreover, our study highlights the importance of integrating UAV remote sensing techniques within the animal conservation research field to unveil ecological patterns that may remain hidden using more traditional methods.

Title: HOWLish: a CNN for automated wolf howl detection
Authors: Rafael Campos, Miha Krofel, Helena Rio‐Maior, Francesco Renna
Journal: Remote Sensing in Ecology and Conservation | DOI: 10.1002/rse2.70024 | Published: 2025-09-23
Abstract: Automated sound‐event detection is crucial for large‐scale passive acoustic monitoring of wildlife, but the availability of ready‐to‐use tools is narrow across taxa. Machine learning is currently the state‐of‐the‐art framework for developing sound‐event detection tools tailored to specific wildlife calls. Gray wolves (Canis lupus), a species with complex management needs, howl spontaneously for long‐distance intra‐ and inter‐pack communication, which makes them a prime target for passive acoustic monitoring. Yet, there is currently no pre‐trained, open‐access tool that allows reliable automated detection of wolf howls in recorded soundscapes. We collected 50 137 h of soundscape data, in which we manually labeled 841 unique howling events. We used this dataset to fine‐tune VGGish, a convolutional neural network trained for audio classification, effectively retraining it for wolf howl detection. HOWLish correctly classified 77% of the wolf howling examples present in our test set, with a false positive rate of 1.74%; still, precision was low (0.006) given the extreme class imbalance (7124:1). During field tests, HOWLish retrieved 81.3% of the observed howling events while offering a 15‐fold reduction in operator time when compared to fully manual detection. This work establishes the baseline for open‐access automated wolf howl detection. HOWLish facilitates remote sensing of wild wolf populations, offering new opportunities in non‐invasive large‐scale monitoring and communication research of wolves. The knowledge gap addressed here spans many soniferous taxa, to which our approach can also be applied.

Title: DuckNet: an open‐source deep learning tool for waterfowl species identification in UAV imagery
Authors: Zack Loken, Kevin M. Ringelman, Anne Mini, J. Dale James, Mike Mitchell
Journal: Remote Sensing in Ecology and Conservation | DOI: 10.1002/rse2.70028 | Published: 2025-09-18
Abstract: Understanding how waterfowl respond to habitat restoration and management activities is crucial for evaluating and refining conservation delivery programs. However, site‐specific waterfowl monitoring is challenging, especially in heavily forested systems such as the Mississippi Alluvial Valley (MAV), a primary wintering region for waterfowl in North America. We hypothesized that using uncrewed aerial vehicles (UAVs) coupled with deep learning‐based methods for object detection would provide an efficient and effective means for surveying non‐breeding waterfowl on difficult‐to‐access restored wetland sites. Accordingly, during the winters of 2021 and 2022, we surveyed wetland restoration easements in the MAV using a UAV equipped with a dual thermal‐RGB high‐resolution sensor to collect 2360 digital images of non‐breeding waterfowl. We then developed, optimized, and trained a RetinaNet object detection model with a ResNet‐50 backbone to locate and identify drakes and hens of seven waterfowl species, plus one waterbird species, in the UAV imagery. The final model achieved an average precision and average recall of 88.1% (class ranges from 68.8 to 99.6%) and 89.0% (class ranges from 70.0 to 100%), respectively, at an intersection‐over‐union of 0.5. This study demonstrates successful surveys of non‐breeding waterfowl in structurally complex and difficult‐to‐access habitats using UAVs and, furthermore, provides a functional, open‐source, deep learning‐based object detection framework (DuckNet) for automated detection of waterfowl in UAV imagery. DuckNet provides a user‐friendly interface for running inference on custom images using the model developed here and, additionally, allows users to fine‐tune the model on custom datasets to expand the number of species classes the model can detect. This framework provides managers with an efficient and cost‐effective means to count waterfowl on project sites, thereby improving their capacity to evaluate waterfowl response to wetland restoration efforts.

Title: Investigating boreal forest successional stages in Alaska and Northwest Canada using UAV‐LiDAR and RGB and a community detection network
Authors: Léa Enguehard, Birgit Heim, Ulrike Herzschuh, Viktor Dinkel, Glenn Juday, Santosh Panda, Nicola Falco, Jacob Schladebach, Jakob Broers, Stefan Kruse
Journal: Remote Sensing in Ecology and Conservation | DOI: 10.1002/rse2.70029 | Published: 2025-09-09
Abstract: Boreal forests are a key component of the global carbon cycle, forming North America's most extensive biome. Different successional stages in boreal forests have varying levels of ecological values and biodiversity, which in turn affect their functions. A knowledge gap remains concerning the present successional stages, their geographic patterns and possible successions. This study develops a novel application of UAV‐LiDAR and Red Green Blue (RGB) data and network analysis to enhance our understanding of boreal forest succession. Between 2022 and 2024, we collected UAV‐LiDAR and RGB data from 48 forested sites in Alaska and Northwest Canada to (i) identify present successional stages and (ii) deepen our understanding of successional trajectories. We first applied UAV‐derived spectral and structural tree attributes to classify individual trees into plant functional types representative of boreal forest succession, namely evergreen and deciduous. Second, we built a forest‐patch network to characterize successional stages and their interactions and assessed future stage transitions. Finally, we applied a simplified forward model to predict future dynamics and highlight different successional trajectories. Our results indicate that tree height and spectral variables are the most influential predictors of plant functional type in random forest algorithms, and high overall accuracies were attained. The network‐based community detection algorithm reveals five interconnected successional stages that could be interpreted as ranging from early to late successional and a disturbed stage. We find that disturbed sites are mainly located in Interior and Southcentral Alaska, while late successional sites are predominant in the southern Canadian sites. Transitional stages are mainly located near the tundra‐taiga boundary. These findings highlight the critical role of disturbances, such as fire or insect outbreaks, in shaping forest succession in Alaska and Northwest Canada.

Title: Camera traps and deep learning enable efficient large‐scale density estimation of wildlife in temperate forest ecosystems
Authors: Maik Henrich, Christian Fiderer, Alisa Klamm, Anja Schneider, Axel Ballmann, Jürgen Stein, Raffael Kratzer, Rudolf Reiner, Sina Greiner, Sönke Twietmeyer, Tobias Rönitz, Volker Spicher, Simon Chamaillé‐Jammes, Vincent Miele, Gaspard Dussert, Marco Heurich
Journal: Remote Sensing in Ecology and Conservation | DOI: 10.1002/rse2.70030 | Published: 2025-09-09
Abstract: Automated detectors such as camera traps allow the efficient collection of large amounts of data for the monitoring of animal populations, but data processing and classification are a major bottleneck. Deep learning algorithms have gained increasing attention in this context, as they have the potential to dramatically decrease the time and effort required to obtain population density estimates. However, the robustness of such an approach has not yet been evaluated across a wide range of species and study areas. This study evaluated the application of DeepFaune, an open‐source deep learning algorithm for the classification of European animal species, and camera trap distance sampling (CTDS) to a year‐round dataset containing 895,019 manually classified photos from 10 protected areas across Germany. For all wild animal species and higher taxonomic groups on which DeepFaune was trained, the algorithm achieved an overall accuracy of 90%. The 95% confidence interval (CI) of the difference between the CTDS estimates based on manual and automated image classification contained zero for all species and seasons with a minimum sample size of 20 independent observations per study area, except for two cases. Meta‐regression revealed an average difference between the classification methods of −0.005 (95% CI: −0.205 to 0.196) animals/km². Classification success correlated with the divergence of the population density estimates, but false negative and false positive detections had complex effects on the density estimates via different CTDS parameters. Therefore, metrics of classification performance alone are insufficient to assess the effect of deep learning classifiers on the population density estimation process, which should instead be followed through entirely for proper validation. In general, however, our results demonstrate that readily available deep learning algorithms can be used in largely unsupervised workflows for estimating population densities from camera trap data.
