Remote Sensing in Ecology and Conservation: Latest Articles

A scalable transfer learning workflow for extracting biological and behavioural insights from forest elephant vocalizations
IF 5.5, Q2 (Environmental Science & Ecology)
Remote Sensing in Ecology and Conservation Pub Date: 2025-04-25 DOI: 10.1002/rse2.70008
Alastair Pickering, Santiago Martinez Balvanera, Kate E. Jones, Daniela Hedwig
Abstract: Animal vocalizations encode rich biological information—such as age, sex, behavioural context and emotional state—making bioacoustic analysis a promising non-invasive method for assessing welfare and population demography. However, traditional bioacoustic approaches, which rely on manually defined acoustic features, are time-consuming, require specialized expertise and may introduce subjective bias. These constraints reduce the feasibility of analysing the increasingly large datasets generated by passive acoustic monitoring (PAM). Transfer learning with convolutional neural networks (CNNs) offers a scalable alternative by enabling automatic acoustic feature extraction without predefined criteria. Here, we applied four pre-trained CNNs—two general-purpose models (VGGish and YAMNet) and two avian bioacoustic models (Perch and BirdNET)—to African forest elephant (Loxodonta cyclotis) recordings. We used a dimensionality reduction algorithm (UMAP) to represent the extracted acoustic features in two dimensions and evaluated these representations across three key tasks: (1) call-type classification (rumble, roar and trumpet), (2) rumble sub-type identification and (3) behavioural and demographic analysis. A Random Forest classifier trained on these features achieved near-perfect accuracy for rumbles, with Perch attaining the highest average accuracy (0.85) across all call types. Clustering the reduced features identified biologically meaningful rumble sub-types—such as adult female calls linked to logistics—and provided clearer groupings than manual classification. Statistical analyses showed that factors including age and behavioural context significantly influenced call variation (P < 0.001), with additional comparisons revealing clear differences among contexts (e.g. nursing, competition, separation), sexes and multiple age classes. Perch and BirdNET consistently outperformed general-purpose models on complex or ambiguous calls. These findings demonstrate that transfer learning enables scalable, reproducible bioacoustic workflows capable of detecting biologically meaningful acoustic variation. Integrating this approach into PAM pipelines can enhance the non-invasive assessment of population dynamics, behaviour and welfare in acoustically active species.
Citations: 0
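The workflow described above (pre-trained CNN embeddings, two-dimensional reduction, Random Forest classification) can be sketched as follows. This is a minimal sketch, not the paper's code: the "embeddings" are synthetic stand-ins for real call features, and PCA is substituted for UMAP so the example needs only scikit-learn.

```python
# Sketch: CNN-style embeddings -> 2-D reduction -> Random Forest call-type classifier.
# Data are synthetic; PCA stands in for the paper's UMAP step.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Fake "embeddings": 300 calls x 128 features, three call types
n_per_class, n_features = 100, 128
X = np.vstack([rng.normal(loc=c, scale=1.0, size=(n_per_class, n_features)) for c in range(3)])
y = np.repeat(["rumble", "roar", "trumpet"], n_per_class)

# Reduce the embeddings to two dimensions (the paper uses UMAP here)
X2d = PCA(n_components=2, random_state=0).fit_transform(X)

# Train and evaluate a Random Forest on the reduced features
X_tr, X_te, y_tr, y_te = train_test_split(X2d, y, test_size=0.3, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
print(f"call-type accuracy: {acc:.2f}")
```

Because the synthetic classes are well separated, accuracy here is near-perfect; on real recordings, performance depends on the embedding model, as the comparison of Perch and BirdNET against general-purpose models shows.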
Advancing the mapping of vegetation structure in savannas using Sentinel‐1 imagery
IF 5.5, Q2 (Environmental Science & Ecology)
Remote Sensing in Ecology and Conservation Pub Date: 2025-04-22 DOI: 10.1002/rse2.70006
Vera Thijssen, Marianthi Tangili, Ruth A. Howison, Han Olff
Abstract: Vegetation structure monitoring is important for understanding and conserving savanna ecosystems. Optical satellite imagery can be used to estimate canopy cover, but it provides limited information about the structure of savannas and is restricted to daytime, clear-sky captures. Active remote sensing can potentially overcome this. We explore the utility of C-band synthetic aperture radar imagery for mapping both grassland and woody vegetation structure in savannas. We calibrated Sentinel-1 VH and VV backscatter coefficients and their ratio to ground-based estimates of grass biomass, woody canopy volume (<50 000 m³/ha) and tree basal area (<15 m²/ha) in the Greater Serengeti-Mara Ecosystem, and simultaneously explored their sensitivity to soil moisture. We show that backscatter can be used to estimate grass biomass (R² = 0.54, RMSE = 630 kg/ha, %range = 20.6), woody canopy volume (R² = 0.69, RMSE = 4188 m³/ha, %range = 11.8) and tree basal area (R² = 0.44, RMSE = 2.03 m²/ha, %range = 18.6) in the dry season, allowing for extrapolation to regional-scale vegetation structure maps. We also introduce new proxies for soil moisture as an option for extending this approach to the wet season, using 90-day preceding bounded running averages of the Climate Hazards Group InfraRed Precipitation with Station (CHIRPS) and the Multi-satellitE Retrievals for Global Precipitation Measurement (IMERG) datasets. We discuss the potential of Sentinel-1 imagery for better understanding the spatio-temporal dynamics of vegetation structure in savannas.
Citations: 0
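The fit statistics reported above (R², RMSE, and RMSE as a percentage of the observed range, "%range") relate to each other as in this small sketch; the numbers are invented, not the study's data.

```python
# How R^2, RMSE, and %range (RMSE as a share of the observed range) are computed.
# Values are illustrative; the paper calibrates backscatter against field biomass in kg/ha.
import numpy as np

observed = np.array([800, 1500, 2100, 2600, 3100, 3900], dtype=float)   # field biomass, kg/ha
predicted = np.array([950, 1400, 2300, 2500, 3300, 3600], dtype=float)  # model estimates

residuals = observed - predicted
rmse = np.sqrt(np.mean(residuals ** 2))
ss_res = np.sum(residuals ** 2)
ss_tot = np.sum((observed - observed.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
pct_range = 100.0 * rmse / (observed.max() - observed.min())

print(f"R2 = {r2:.2f}, RMSE = {rmse:.0f} kg/ha, %range = {pct_range:.1f}")
```

Reporting %range alongside RMSE, as the abstract does, makes errors comparable across variables with very different units (kg/ha, m³/ha, m²/ha).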
Object detection‐assisted workflow facilitates cryptic snake monitoring
IF 5.5, Q2 (Environmental Science & Ecology)
Remote Sensing in Ecology and Conservation Pub Date: 2025-04-21 DOI: 10.1002/rse2.70009
Storm Miller, Michael Kirkland, Kristen M. Hart, Robert A. McCleery
Abstract: Camera traps are an important tool for studying rare and cryptic animals, including snakes. Time-lapse photography can be particularly useful for studying snakes, which often fail to trigger a camera's infrared motion sensor due to their ectothermic nature. However, the large datasets produced by time-lapse photography require labor-intensive classification, limiting their use in large-scale studies. While many artificial intelligence-based object detection models are effective at identifying mammals in images, their ability to detect snakes is unproven. Here, we used camera data to evaluate the efficacy of an object detection model for rapidly and accurately detecting snakes. We classified images manually to the species level and compared this with a hybrid review workflow in which the model removed blank images, followed by a manual review. Using a ≥0.05 model confidence threshold, our hybrid review workflow correctly identified 94.5% of blank images, completed image classification 6× faster, and detected large (>66 cm) snakes as well as manual review did. Conversely, the hybrid review method often failed to detect all instances of a snake in a string of images and detected fewer small (<66 cm) snakes than manual review. However, most relevant ecological information requires only a single detection in a sequence of images, and changes to study design could likely improve the detection of smaller snakes. Our findings suggest that an object detection-assisted hybrid workflow can greatly reduce the time spent manually classifying data-heavy time-lapse snake studies and facilitate ecological monitoring for large snakes.
Citations: 0
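The blank-filtering step of the hybrid workflow amounts to thresholding detector confidence. A minimal sketch, with invented detections and the paper's ≥0.05 threshold (the detector itself is not named here):

```python
# Hybrid-review filtering: images whose best detection falls below a confidence
# threshold are treated as blank and dropped before manual review.
CONF_THRESHOLD = 0.05  # the >=0.05 threshold used in the paper

# Hypothetical per-image detection scores from an object detection model
images = {
    "img_001.jpg": [0.91],        # clear snake detection
    "img_002.jpg": [],            # nothing detected -> blank
    "img_003.jpg": [0.04, 0.02],  # low-confidence noise -> treated as blank
    "img_004.jpg": [0.07],        # borderline detection -> kept for manual review
}

def needs_review(scores, threshold=CONF_THRESHOLD):
    """Keep an image for manual review if any detection meets the threshold."""
    return any(s >= threshold for s in scores)

to_review = [name for name, scores in images.items() if needs_review(scores)]
blanks = [name for name in images if name not in to_review]
print(f"review: {to_review}, removed as blank: {blanks}")
```

Lowering the threshold trades more manual review time for fewer missed animals, which is why small snakes, with weaker detections, were the main casualty of the hybrid workflow.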
Towards edge processing of images from insect camera traps
IF 5.5, Q2 (Environmental Science & Ecology)
Remote Sensing in Ecology and Conservation Pub Date: 2025-04-17 DOI: 10.1002/rse2.70007
Kim Bjerge, Henrik Karstoft, Toke T. Høye
Abstract: Insects represent nearly half of all known multicellular species, but knowledge about them lags behind that of most vertebrate species. In part for this reason, they are often neglected in biodiversity conservation policies and practice. Computer vision tools for automated monitoring, such as insect camera traps, have the potential to revolutionize insect study and conservation. To further advance insect camera trapping and the analysis of its image data, effective image processing pipelines are needed. In this paper, we present a flexible and fast processing pipeline designed to analyse these recordings by detecting, tracking and classifying nocturnal insects in a broad taxonomy of 15 insect classes, with resolution to individual moth species. A classifier with anomaly detection is proposed to filter out dark, blurred or partially visible insects that would be uncertain to classify correctly. A simple track-by-detection algorithm is proposed to track classified insects by incorporating feature embeddings, distance and area costs. We evaluated the computational speed and power performance of different edge computing devices (Raspberry Pi and NVIDIA Jetson Nano) and compared various time-lapse (TL) strategies with tracking. The smallest difference in detections was found for 2-min TL intervals compared to tracking at 0.5 frames per second; however, for insects with fewer than one detection per night, the Pearson correlation decreases. Shifting from tracking to TL monitoring would reduce the number of recorded images and would allow for edge processing of images in real time on a camera trap with a Raspberry Pi. The Jetson Nano is the most energy-efficient solution, capable of real-time tracking at nearly 0.5 fps. Our processing pipeline was applied to more than 5.7 million images recorded at 0.5 frames per second by 12 light camera traps over two full seasons in diverse habitats, including bogs, heaths and forests. Our results thus show the scalability of insect camera traps.
Citations: 0
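The track-by-detection idea above (matching detections across frames by a combined embedding, distance and area cost) can be sketched as below. The cost weights, data and greedy matching strategy are invented for illustration; the paper's actual algorithm and parameters are not reproduced here.

```python
# Sketch of track-by-detection matching: detections in consecutive frames are paired
# by a weighted sum of embedding distance, pixel distance, and relative area change.
import numpy as np

def match_cost(det_a, det_b, w_feat=1.0, w_dist=0.01, w_area=1.0):
    """Combined cost between two detections (smaller is better). Weights are illustrative."""
    feat = np.linalg.norm(det_a["emb"] - det_b["emb"])        # feature-embedding distance
    dist = np.linalg.norm(det_a["center"] - det_b["center"])  # pixel distance
    area = abs(det_a["area"] - det_b["area"]) / max(det_a["area"], det_b["area"])
    return w_feat * feat + w_dist * dist + w_area * area

prev_frame = [
    {"emb": np.array([0.1, 0.9]), "center": np.array([100.0, 50.0]), "area": 400.0},
    {"emb": np.array([0.8, 0.2]), "center": np.array([300.0, 60.0]), "area": 900.0},
]
next_frame = [
    {"emb": np.array([0.78, 0.22]), "center": np.array([310.0, 65.0]), "area": 880.0},
    {"emb": np.array([0.12, 0.88]), "center": np.array([105.0, 52.0]), "area": 410.0},
]

# Greedy assignment: each previous detection takes its cheapest remaining match
assignments = {}
taken = set()
for i, a in enumerate(prev_frame):
    cost, j = min((match_cost(a, b), j) for j, b in enumerate(next_frame) if j not in taken)
    assignments[i] = j
    taken.add(j)
print(assignments)
```

A production tracker would typically solve the assignment globally (e.g. Hungarian algorithm) and handle track births and deaths, but the cost structure is the part the abstract describes.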
Application of computer vision for off‐highway vehicle route detection: A case study in Mojave desert tortoise habitat
IF 5.5, Q2 (Environmental Science & Ecology)
Remote Sensing in Ecology and Conservation Pub Date: 2025-04-07 DOI: 10.1002/rse2.70004
Alexander J. Robillard, Madeline Standen, Noah Giebink, Mark Spangler, Amy C. Collins, Brian Folt, Andrew Maguire, Elissa M. Olimpi, Brett G. Dickson
Abstract: Driving off-highway vehicles (OHVs) is a common recreational activity in the United States and other parts of the world, particularly in desert environments with fragile ecosystems, and it contributes to habitat degradation and fragmentation. Although habitat degradation and mortality from the expansion of OHV networks are thought to have major impacts on desert species, OHV route networks and their changes over time remain poorly mapped. To better understand how OHV route networks have evolved in the Mojave Desert ecoregion, we developed a computer vision approach to estimate OHV route location and density across the range of the Mojave desert tortoise (Gopherus agassizii). We defined OHV routes as non-paved, linear features, including designated routes and washes in the presence of non-paved routes. Using contemporary (n = 1499) and historical (n = 1148) aerial images, we trained and validated three convolutional neural network (CNN) models. We cross-examined each model on sets of independently curated data and selected the highest-performing model to generate predictions across the tortoise's range. When evaluated against a 'hybrid' test set (n = 1807 images), the final hybrid model achieved an accuracy of 77%. We then applied our model to remotely sensed imagery from across the tortoise's range and generated spatial layers of OHV route density for the 1970s, 1980s, 2010s and 2020s. We examined OHV route density within tortoise conservation areas (TCAs) and recovery units (RUs) within the range of the species. Results showed an increase in OHV route density in both TCAs (8.45%) and RUs (7.85%) from 1980 to 2020. Ordinal logistic regression indicated a strong correlation (OR = 1.01, P < 0.001) between model outputs and ground-truthed OHV maps from the study region. Our computer vision approach and mapped results can inform conservation strategies and management aimed at mitigating the adverse impacts of OHV activity on sensitive ecosystems.
Citations: 0
Woody cover and geology as regional‐scale determinants of semi‐arid savanna stability
IF 5.5, Q2 (Environmental Science & Ecology)
Remote Sensing in Ecology and Conservation Pub Date: 2025-03-28 DOI: 10.1002/rse2.70005
Liezl Mari Vermeulen, Koenraad Van Meerbeek, Paulo Negri Bernardino, Jasper Slingsby, Bruno Verbist, Ben Somers
Abstract: Savannas, defined by a balance of woody and herbaceous vegetation, are vital for global biodiversity and carbon sequestration. Yet their stability is increasingly at risk from climate change and human impacts. The responses of these ecosystems to extreme drought events remain poorly understood, especially in relation to regional variations in soil, terrain, climate history and disturbance legacy. This study analysed time series of a remote sensing-derived vegetation index to quantify ecosystem stability metrics, i.e. resistance and resilience, in response to a major drought event in the semi-arid savanna of Kruger National Park, South Africa. Using Bayesian generalized linear models, we assessed the influence of ecosystem traits, past extreme climate events, fire history and herbivory on regional patterns of drought resistance and resilience. Our results show that tree-dominated landscapes on sandier granite soils have higher drought resistance, supported by deep-rooted access to water. In contrast, grassier savanna landscapes on basalt soils proved more drought resilient, with rapid vegetation recovery post-drought. The effects of woody cover on ecosystem drought response are mediated by differences in historical fire regimes, elephant presence and climate legacy, underscoring the complex, context-dependent nature of savanna landscape responses to drought. This research deepens our understanding of savanna stability by clarifying the role of regional drivers, like fire and climate, alongside long-term factors, like soil composition and woody cover. With droughts projected to increase in frequency and severity in arid and semi-arid savannas, it also highlights remote sensing as a robust tool for regional-scale analysis of drought responses, offering a valuable complement to field-based experiments that can guide effective management and adaptive strategies.
Citations: 0
How to achieve accurate wildlife detection by using vehicle‐mounted mobile monitoring images and deep learning?
IF 5.5, Q2 (Environmental Science & Ecology)
Remote Sensing in Ecology and Conservation Pub Date: 2025-03-14 DOI: 10.1002/rse2.70003
Leilei Shi, Jixi Gao, Fei Cao, Wenming Shen, Yue Wu, Kai Liu, Zheng Zhang
Abstract: With the advancement of artificial intelligence (AI) technologies, vehicle-mounted mobile monitoring systems have become increasingly integrated into wildlife monitoring practices. However, images captured by these systems often present challenges such as low resolution, small target sizes and partial occlusions. Consequently, detecting animal targets with conventional deep-learning networks is challenging. To address these challenges, this paper presents an enhanced YOLOv7 model, referred to as YOLOv7(sr-sm), which incorporates a super-resolution (SR) reconstruction module and a small-object optimization module. The SR reconstruction module leverages generative adversarial networks (GANs) to reconstruct high-resolution details from blurry animal images. Additionally, an attention mechanism is integrated into the neck and head of YOLOv7 to form the small-object optimization module, which enhances the model's ability to detect and locate densely packed small targets. Using a vehicle-mounted mobile monitoring system, images of four wildlife taxa—sheep, birds, deer and antelope—were captured on the Tibetan Plateau. These images were combined with publicly available high-resolution wildlife photographs to create a wildlife test dataset. Experiments on this dataset compared YOLOv7(sr-sm) with eight popular object detection models. The results demonstrate significant improvements in precision, recall and mean average precision (mAP), with YOLOv7(sr-sm) achieving 93.9%, 92.1% and 92.3%, respectively. Furthermore, compared to the newly released YOLOv8l model, YOLOv7(sr-sm) outperforms it by 9.3%, 2.1% and 4.5% on these three metrics while exhibiting superior parameter efficiency and higher inference speeds. The YOLOv7(sr-sm) architecture can accurately locate and identify blurry animal targets in vehicle-mounted monitoring images, serving as a reliable tool for animal identification and counting in mobile monitoring systems. These findings provide significant technological support for applying intelligent monitoring techniques in biodiversity conservation efforts.
Citations: 0
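For context, the precision and recall figures reported above follow from true-positive, false-positive and false-negative counts in the standard way; the counts in this sketch are invented, and in practice a detection counts as a true positive when its overlap (IoU) with a ground-truth box exceeds a threshold, commonly 0.5.

```python
# Standard detection metrics from match counts: precision = TP/(TP+FP), recall = TP/(TP+FN).
def precision_recall(tp, fp, fn):
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return precision, recall

# Hypothetical counts: 921 correct detections, 60 spurious boxes, 79 missed animals
p, r = precision_recall(tp=921, fp=60, fn=79)
print(f"precision = {p:.3f}, recall = {r:.3f}")
```

mAP extends this by averaging precision over recall levels and, typically, over classes, which is why it is reported as a third, summary metric.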
Bridging the gap in deep seafloor management: Ultra fine‐scale ecological habitat characterization of large seascapes
IF 5.5, Q2 (Environmental Science & Ecology)
Remote Sensing in Ecology and Conservation Pub Date: 2025-03-12 DOI: 10.1002/rse2.70002
Ole Johannes Ringnander Sørensen, Itai van Rijn, Shai Einbinder, Hagai Nativ, Aviad Scheinin, Ziv Zemah‐Shamir, Eyal Bigal, Leigh Livne, Anat Tsemel, Or M. Bialik, Gleb Papeer, Dan Tchernov, Yizhaq Makovsky
Abstract: The United Nations' sustainable development goal of designating 30% of the oceans as marine protected areas by 2030 requires practical management tools and, in turn, ecologically meaningful mapping of the seafloor. Particularly challenging is the mesophotic zone, a critical component of the marine system, a biodiversity hotspot and a potential refuge. Here, we introduce a novel seafloor habitat management workflow integrating cm-scale synthetic aperture sonar (SAS) and multibeam bathymetry surveying with efficient ecotope characterization. In merely 6 h, we mapped ~5 km² of a complex mesophotic reef at sub-metric resolution. Applying a deep learning classifier to the SAS imagery, we classified four habitats with an accuracy of 84% and defined relevant fine-scale ecotones. Visual censuses with precise in situ sampling, guided by SAS images for navigation, were used for ecological characterization of the mapped units. Our preliminary fish surveys indicate the ecological importance of highly complex areas and rock/sand ecotones. These less abundant habitats would be largely underrepresented if the area were surveyed without prior consideration. Thus, our approach is demonstrated to generate scalable habitat maps at resolutions pertinent to relevant biotas, previously inaccessible in the mesophotic zone, advancing ecological modeling and management of large seascapes.
Citations: 0
Automated extraction of right whale morphometric data from drone aerial photographs
IF 5.5, Q2 (Environmental Science & Ecology)
Remote Sensing in Ecology and Conservation Pub Date: 2025-03-12 DOI: 10.1002/rse2.70001
Chhandak Bagchi, Josh Medina, Duncan J. Irschick, Subhransu Maji, Fredrik Christiansen
Abstract: Aerial photogrammetry is a popular non-invasive tool for measuring the size, body morphometrics and body condition of wild animals. While the method can generate large datasets quickly, the lack of efficient processing tools can create bottlenecks that delay management actions. We developed a machine learning algorithm to automatically measure body morphometrics (body length and widths) of southern right whales (Eubalaena australis, SRWs) from aerial photographs (n = 8958) collected by unmanned aerial vehicles in Australia. Our approach utilizes two Mask R-CNN detection models to (i) generate masks for each whale and (ii) estimate points along the whale's axis. We annotated a dataset of 468 images containing 638 whales to train our models. To evaluate the accuracy of our machine learning approach, we compared the model-generated body morphometrics to manual measurements. The influence of picture quality (whale posture and water clarity) was also assessed. The model-generated body length estimates were slightly negatively biased (median error of −1.3%), whereas the body volume estimates had a small positive bias (median error of 6.5%). After correcting both biases, the resulting model-generated body length and volume estimates had mean absolute errors of 0.85% (SD = 0.75) and 6.88% (SD = 6.57), respectively. The magnitude of the errors decreased as picture quality increased. When using the model-generated data to quantify intra-seasonal changes in the body condition of SRW females, we obtained a similar slope parameter (−0.001843, SE = 0.000095) to that derived from manual measurements (−0.001565, SE = 0.000079). This indicates that our approach was able to accurately capture temporal trends in body condition at a population level.
Citations: 0
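The bias correction implied above (recentering model estimates by the median relative error against manual measurements) can be sketched as follows. The measurements are illustrative, not the paper's data, and the exact correction formula used in the study is not specified here.

```python
# Median-bias correction sketch: if model-generated lengths run systematically short of
# manual measurements, dividing by (1 + median relative error) recenters them.
import statistics

manual = [13.2, 14.0, 12.5, 15.1, 13.8]    # manual photogrammetry lengths (m), invented
model = [13.0, 13.85, 12.3, 14.9, 13.65]   # model-generated lengths (m), invented

rel_errors = [(m_hat - m) / m for m_hat, m in zip(model, manual)]
median_bias = statistics.median(rel_errors)  # negative -> model underestimates

corrected = [m_hat / (1 + median_bias) for m_hat in model]
mae_pct = statistics.mean(abs((c - m) / m) * 100 for c, m in zip(corrected, manual))
print(f"median bias = {median_bias:.3%}, corrected MAE = {mae_pct:.2f}%")
```

Using the median rather than the mean makes the correction robust to the occasional badly measured whale, which matters when picture quality varies with posture and water clarity.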
Quantifying nocturnal bird migration using acoustics: opportunities and challenges
IF 5.5, Q2 (Environmental Science & Ecology)
Remote Sensing in Ecology and Conservation Pub Date: 2025-03-11 DOI: 10.1002/rse2.433
Siméon Béasse, Louis Sallé, Paul Coiffard, Birgen Haest
Abstract: Acoustic recording has emerged as a promising tool for monitoring nocturnal bird migration, as it can uniquely provide species-level detection of migratory movements under the darkness of the night sky. This study explores the use of acoustics to quantify nocturnal bird migration across Europe, a region where research on the topic remains relatively sparse. We examine three migration intensity measures derived from acoustic recordings—nocturnal flight call rates, nocturnal flight passage rates and species diversity—in the French Pyrenees in 2021 and 2022. To assess the effectiveness of these acoustic measurements, we compare them with migratory traffic rates estimated by a dedicated bird radar at three taxonomic levels: all birds, passerines and thrushes. We also test whether weather conditions influence these relationships and whether combining acoustic data from multiple simultaneous sites improves predictive performance. Nocturnal flight passage rates, i.e. the number of estimated passing birds independent of call abundance, outperformed predictions using species diversity or nocturnal flight call rates. The predictive accuracy of the acoustic data increased with taxonomic detail: predicting thrush migration using acoustics was far more accurate (R² = 63%) than for passerines (R² = 29%) or birds in general (R² = 27%). Prediction using simultaneous acoustic measurements from several sites strongly reduced the uncertainty of the quantification. We did not find any evidence that weather conditions affected the predictive performance of the acoustic data. Accurate, automated monitoring of migratory flows is crucial as many bird species face steep population declines. Acoustic monitoring offers valuable species-specific insights, making it a powerful tool for nocturnal bird migration studies. This study advances the integration of acoustic methods into bird monitoring by testing their benefits and limitations, and provides recommendations and guidelines to enhance the effectiveness of future studies using acoustic data.
Citations: 0