Remote Sensing in Ecology and Conservation: Latest Articles

Eigenfeature‐enhanced deep learning: advancing tree species classification in mixed conifer forests with lidar
IF 5.5 · CAS Q2 · Environmental Science & Ecology
Remote Sensing in Ecology and Conservation · Pub Date: 2025-06-09 · DOI: 10.1002/rse2.70014
Ryan C. Blackburn, Robert Buscaglia, Andrew J. Sánchez Meador, Margaret M. Moore, Temuulen Sankey, Steven E. Sesnie
Abstract: Accurately classifying tree species using remotely sensed data remains a significant challenge, yet it is essential for forest monitoring and understanding ecosystem dynamics over large spatial extents. While light detection and ranging (lidar) has shown promise for species classification, its accuracy typically decreases in complex forests or with lower lidar point densities. Recent advancements in lidar processing and machine learning offer new opportunities to leverage previously unavailable structural information. In this study, we present an automated machine learning pipeline that reduces practitioner burden by utilizing canonical deep learning and improved input layers through the derivation of eigenfeatures. These eigenfeatures were used as inputs for a 2D convolutional neural network (CNN) to classify seven tree species in the Mogollon Rim Ranger District of the Coconino National Forest, AZ, US. We compared eigenfeature images derived from unoccupied aerial vehicle laser scanning (UAV‐LS) and airborne laser scanning (ALS) individual tree segmentation algorithms against raw intensity and colorless control images. Remarkably, mean overall accuracies for classifying seven species reached 94.8% for ALS and 93.4% for UAV‐LS. White (colorless control) image types underperformed for both ALS and UAV‐LS compared to eigenfeature images, while ALS and UAV‐LS image types showed marginal differences in model performance. These results demonstrate that lower point density ALS data can achieve high classification accuracy when paired with eigenfeatures in an automated pipeline. This study advances the field by addressing species classification at scales ranging from individual trees to landscapes, offering a scalable and efficient approach for understanding tree composition in complex forests.
Citations: 0
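The paper's pipeline is not reproduced here, but the eigenfeature idea is standard point-cloud geometry: eigen-decompose the covariance of each point's local neighbourhood and derive shape descriptors from the sorted eigenvalues. Below is a minimal sketch assuming the common linearity/planarity/sphericity triplet and a k-nearest-neighbour neighbourhood; the paper's exact feature set and neighbourhood rule may differ.

```python
# Sketch only: per-point eigenfeatures of the kind used as CNN inputs.
import numpy as np
from scipy.spatial import cKDTree

def eigenfeatures(points, k=20):
    """Linearity, planarity, sphericity from k-NN covariance eigenvalues."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)                  # neighbour indices
    feats = np.empty((len(points), 3))
    for i, nbrs in enumerate(idx):
        cov = np.cov(points[nbrs].T)                  # 3x3 local covariance
        lam = np.sort(np.linalg.eigvalsh(cov))[::-1]  # eigenvalues, descending
        lam = np.maximum(lam, 1e-12)                  # numerical guard
        feats[i] = [(lam[0] - lam[1]) / lam[0],       # linearity
                    (lam[1] - lam[2]) / lam[0],       # planarity
                    lam[2] / lam[0]]                  # sphericity
    return feats

pts = np.random.rand(1000, 3)                         # synthetic demo cloud
print(eigenfeatures(pts, k=15)[:3])
```

Rasterizing such per-point values over each segmented crown yields image-like layers that a 2D CNN can consume, which is the pattern the abstract describes.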
Hyperspectral imagery, LiDAR point clouds, and environmental DNA to assess land‐water linkage of biodiversity across aquatic functional feeding groups
IF 5.5 · CAS Q2 · Environmental Science & Ecology
Remote Sensing in Ecology and Conservation · Pub Date: 2025-06-02 · DOI: 10.1002/rse2.70010
Heng Zhang, Carmen Meiller, Andreas Hueni, Rosetta C. Blackman, Felix Morsdorf, Isabelle S. Helfenstein, Michael E. Schaepman, Florian Altermatt
Abstract: Different organismal functional feeding groups (FFGs) are key components of aquatic food webs and are important for sustaining ecosystem functioning in riverine ecosystems. Their distribution and diversity are tightly associated with the surrounding terrestrial landscape through land‐water linkages. Nevertheless, knowledge about the spatial extent and magnitude of these cross‐ecosystem linkages within major FFGs remains unclear. Here, we conducted an airborne imaging spectroscopy campaign and a systematic environmental DNA (eDNA) field sampling of river water in a 740‐km² mountainous catchment, combined with light detection and ranging (LiDAR) point clouds, to obtain the spectral and morphological diversity of the terrestrial landscape and the diversity of major FFGs in rivers. We identified the scale of these linkages, ranging from a few hundred meters to more than 10 km, with collectors and filterers, shredders, and small invertebrate predators having local‐scale associations, while invertebrate‐eating fish, grazers, and scrapers have more landscape‐scale associations. Among all major FFGs, shredders, grazers, and scrapers in the streams had the strongest association with surrounding terrestrial vegetation. Our research reveals the reference spatial scales at which major FFGs are linked to the surrounding terrestrial landscape, providing spatially explicit evidence of the cross‐ecosystem linkages needed for conservation design and management.
Citations: 0
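The "scale of linkage" concept can be illustrated with a toy buffer analysis: aggregate terrestrial diversity within increasing radii around each sampling site and take the radius with the strongest correlation to in-stream diversity as the characteristic scale. This is a hypothetical sketch, not the paper's method; the radius grid and all inputs below are synthetic.

```python
# Toy sketch of a scale-of-linkage analysis; all inputs are synthetic.
import numpy as np

def characteristic_scale(site_xy, stream_div, land_xy, land_div,
                         radii=(0.25, 0.5, 1, 2, 5, 10)):  # km, assumed grid
    best = (None, -np.inf)
    for r in radii:
        agg = []
        for x, y in site_xy:
            sel = np.hypot(land_xy[:, 0] - x, land_xy[:, 1] - y) <= r
            agg.append(land_div[sel].mean() if sel.any() else np.nan)
        agg = np.asarray(agg)
        ok = np.isfinite(agg)
        rho = np.corrcoef(agg[ok], stream_div[ok])[0, 1]
        if rho > best[1]:
            best = (r, rho)                  # radius with strongest link
    return best

rng = np.random.default_rng(0)
land_xy = rng.uniform(0, 20, (5000, 2))      # terrestrial pixels (km)
land_div = rng.random(5000)                  # e.g. spectral diversity
sites = rng.uniform(5, 15, (30, 2))          # eDNA sampling sites
stream_div = rng.random(30)                  # FFG diversity per site
print(characteristic_scale(sites, stream_div, land_xy, land_div))
```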
Hyperspectral imaging has a limited ability to remotely sense the onset of beech bark disease
IF 5.5 · CAS Q2 · Environmental Science & Ecology
Remote Sensing in Ecology and Conservation · Pub Date: 2025-05-30 · DOI: 10.1002/rse2.70013
Guillaume Tougas, Christine I. B. Wallis, Etienne Laliberté, Mark Vellend
Abstract: Insect and pathogen outbreaks have a major impact on northern forest ecosystems. Even for pathogens that have been present in a region for decades, such as beech bark disease (BBD), new waves of tree mortality are expected. Hence, there is a need for innovative approaches to monitor disease advancement in real time. Here, we test whether airborne hyperspectral imaging – involving data from 344 wavelengths in the visible, near infrared (NIR) and short‐wave infrared (SWIR) – can be used to assess beech bark disease severity in southern Quebec, Canada. Field data on disease severity were linked to airborne hyperspectral data for individual beech crowns. Partial least‐squares regression (PLSR) models using airborne imaging spectroscopy data predicted a small proportion of the variance in beech bark disease severity: the best model had an R² of only 0.09. Wavelengths with the strongest contributions were from the red‐edge region (~715 nm) and the SWIR (~1287 nm), which may suggest mediation by canopy greenness, water content, and canopy architecture. Similar models using hyperspectral data taken directly on individual leaves had no explanatory power (R² = 0). In addition, airborne and leaf‐level hyperspectral datasets were uncorrelated. The failure of leaf‐level models suggests that canopy structure was likely responsible for the limited predictive ability of the airborne model. Somewhat better performance in predicting disease severity was found using common band ratios for canopy greenness assessment (e.g., the Green Normalized Difference Vegetation Index, gNDVI, and the Normalized Phaeophytinization Index, NPQI); these variables explained up to 19% of the variation in disease severity. Overall, we argue that the complexity of hyperspectral data is not necessary for assessing BBD spread and that spectral data in general may not provide an efficient means of improving BBD monitoring on a larger scale.
Citations: 0
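The two indices the paper found most useful are standard band ratios: gNDVI = (NIR − green)/(NIR + green) and NPQI = (R415 − R435)/(R415 + R435). The sketch below pairs them with a PLSR baseline like the paper's; band centres, the column-to-wavelength mapping, and all data are placeholder assumptions.

```python
# Hedged sketch: gNDVI and NPQI band ratios plus a PLSR baseline.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

def gndvi(nir, green):
    return (nir - green) / (nir + green)

def npqi(r415, r435):
    return (r415 - r435) / (r415 + r435)

rng = np.random.default_rng(0)
X = rng.uniform(0.01, 0.6, (120, 344))        # crown spectra (reflectance)
y = rng.random(120)                           # disease-severity scores

wl = np.linspace(400, 2400, 344)              # assumed band centres (nm)
band = lambda nm: int(np.argmin(np.abs(wl - nm)))
g = gndvi(X[:, band(800)], X[:, band(550)])   # greenness index
n = npqi(X[:, band(415)], X[:, band(435)])    # phaeophytinization index

pls = PLSRegression(n_components=5)
print("PLSR CV R^2:", cross_val_score(pls, X, y, cv=5, scoring="r2").mean())
print("gNDVI-severity r:", np.corrcoef(g, y)[0, 1])
print("NPQI-severity r:", np.corrcoef(n, y)[0, 1])
```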
Increasing citizen scientist accuracy with artificial intelligence on UK camera‐trap data
IF 5.5 · CAS Q2 · Environmental Science & Ecology
Remote Sensing in Ecology and Conservation · Pub Date: 2025-05-19 · DOI: 10.1002/rse2.70012
C. R. Sharpe, R. A. Hill, H. M. Chappell, S. E. Green, K. Holden, P. Fergus, C. Chalmers, P. A. Stephens
Abstract: As camera traps have become more widely used, extracting information from images at the pace they are acquired has become challenging, resulting in backlogs that delay the communication of results and the use of data for conservation and management. To ameliorate this, artificial intelligence (AI), crowdsourcing to citizen scientists and combined approaches have surfaced as solutions. Using data from the UK mammal monitoring initiative MammalWeb, we assess the accuracies of classifications from registered citizen scientists, anonymous participants and a convolutional neural network (CNN). The engagement of anonymous volunteers was facilitated by the strategic placement of MammalWeb interfaces in a natural history museum with high footfall related to the 'Dippy on Tour' exhibition. The accuracy of anonymous volunteer classifications gathered through public interfaces has not been reported previously, and here we consider this form of citizen science in the context of alternative forms of data acquisition. While AI models have performed well at species identification in bespoke settings, here we report model performance on a dataset for which the model in question was not explicitly trained. We also consider combining AI output with that of human volunteers to demonstrate combined workflows that produce high accuracy predictions. We find the consensus of registered users has greater overall accuracy (97%) than the consensus from anonymous contributors (71%); AI accuracy lies in between (78%). A combined approach between registered citizen scientists and AI output provides an overall accuracy of 96%. Further, when the contributions of anonymous citizen scientists are concordant with AI output, 98% accuracy can be achieved. The generality of this last finding merits further investigation, given the potential to gather classifications much more rapidly if public displays are placed in areas of high footfall. We suggest that combined approaches to image classification are optimal when the minimisation of classification errors is desired.
Citations: 0
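A combined workflow of the kind the abstract reports can be expressed as a simple decision rule. This is a minimal sketch consistent with the stated results, not the MammalWeb implementation: accept a label when anonymous volunteers and the CNN agree (~98% accurate in the study), otherwise escalate to registered citizen scientists.

```python
# Minimal sketch of a concordance-based combined classification rule.
def combined_label(anon_label, ai_label, registered_label=None):
    if anon_label == ai_label:
        return anon_label, "auto-accepted (concordant)"
    if registered_label is not None:
        return registered_label, "escalated to registered users"
    return None, "awaiting registered-user consensus"

print(combined_label("roe deer", "roe deer"))
print(combined_label("badger", "fox", registered_label="fox"))
```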
Night lights from space: potential of SDGSAT‐1 for ecological applications
IF 5.5 · CAS Q2 · Environmental Science & Ecology
Remote Sensing in Ecology and Conservation · Pub Date: 2025-05-16 · DOI: 10.1002/rse2.70011
Dominique Weber, Janine Bolliger, Klaus Ecker, Claude Fischer, Christian Ginzler, Martin M. Gossner, Laurent Huber, Martin K. Obrist, Florian Zellweger, Noam Levin
Abstract: Light pollution affects biodiversity at all levels, from genes to ecosystems, and improved monitoring and research is needed to better assess its various ecological impacts. Here, we review the current contribution of night‐time satellites to ecological applications and elaborate on the potential value of the Glimmer sensor onboard the Chinese Sustainable Development Goals Science Satellite 1 (SDGSAT‐1), a novel medium‐resolution and multispectral sensor, for quantifying artificial light at night (ALAN). Due to their coarse spatial, spectral or temporal resolution, most of the currently used space‐borne sensors are limited in their contribution to assessments of light pollution at multiple scales and of the ecological and conservation‐relevant effects of ALAN. SDGSAT‐1 now offers new opportunities to map the variability in light intensity and spectra at finer spatial resolution, providing the means to disentangle and characterize different sources of ALAN, and to relate ALAN to local environmental parameters, in situ measurements and surveys. Monitoring direct light emissions at 10–40 m spatial resolution enables scientists to better understand the origins and impacts of light pollution on sensitive species and ecosystems, and assists practitioners in implementing local conservation measures. We demonstrate some key ecological applications of SDGSAT‐1, such as quantifying the exposure of protected areas to light pollution, assessing wildlife corridors and dark refuges in urban areas, and modelling the visibility of light sources to animals. We conclude that SDGSAT‐1, and possibly similar future satellite missions, will significantly advance ecological light pollution research to better understand the environmental impacts of light pollution and to devise strategies to mitigate them.
Citations: 0
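Purely as an illustration of one application the review names (quantifying protected-area exposure to light pollution), here is a zonal-statistics sketch. The file names, layer contents, and the choice of mean radiance as the exposure metric are all assumptions.

```python
# Illustration only: mean night-light radiance per protected area as a
# simple exposure-to-ALAN metric.
import numpy as np
import geopandas as gpd
import rasterio
from rasterio.mask import mask

parks = gpd.read_file("protected_areas.gpkg")               # hypothetical
with rasterio.open("sdgsat1_glimmer_radiance.tif") as src:  # hypothetical
    parks = parks.to_crs(src.crs)
    exposure = []
    for geom in parks.geometry:
        img, _ = mask(src, [geom], crop=True, filled=False)
        exposure.append(float(np.ma.mean(img)))             # mean radiance
parks["alan_exposure"] = exposure
print(parks[["alan_exposure"]].describe())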
A scalable transfer learning workflow for extracting biological and behavioural insights from forest elephant vocalizations
IF 5.5 · CAS Q2 · Environmental Science & Ecology
Remote Sensing in Ecology and Conservation · Pub Date: 2025-04-25 · DOI: 10.1002/rse2.70008
Alastair Pickering, Santiago Martinez Balvanera, Kate E. Jones, Daniela Hedwig
Abstract: Animal vocalizations encode rich biological information—such as age, sex, behavioural context and emotional state—making bioacoustic analysis a promising non‐invasive method for assessing welfare and population demography. However, traditional bioacoustic approaches, which rely on manually defined acoustic features, are time‐consuming, require specialized expertise and may introduce subjective bias. These constraints reduce the feasibility of analysing increasingly large datasets generated by passive acoustic monitoring (PAM). Transfer learning with Convolutional Neural Networks (CNNs) offers a scalable alternative by enabling automatic acoustic feature extraction without predefined criteria. Here, we applied four pre‐trained CNNs—two general purpose models (VGGish and YAMNet) and two avian bioacoustic models (Perch and BirdNET)—to African forest elephant (Loxodonta cyclotis) recordings. We used a dimensionality reduction algorithm (UMAP) to represent the extracted acoustic features in two dimensions and evaluated these representations across three key tasks: (1) call‐type classification (rumble, roar and trumpet), (2) rumble sub‐type identification and (3) behavioural and demographic analysis. A Random Forest classifier trained on these features achieved near‐perfect accuracy for rumbles, with Perch attaining the highest average accuracy (0.85) across all call types. Clustering the reduced features identified biologically meaningful rumble sub‐types—such as adult female calls linked to logistics—and provided clearer groupings than manual classification. Statistical analyses showed that factors including age and behavioural context significantly influenced call variation (P < 0.001), with additional comparisons revealing clear differences among contexts (e.g. nursing, competition, separation), sexes and multiple age classes. Perch and BirdNET consistently outperformed general purpose models when dealing with complex or ambiguous calls. These findings demonstrate that transfer learning enables scalable, reproducible bioacoustic workflows capable of detecting biologically meaningful acoustic variation. Integrating this approach into PAM pipelines can enhance the non‐invasive assessment of population dynamics, behaviour and welfare in acoustically active species.
Citations: 0
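The workflow (pretrained-CNN embeddings, UMAP reduction, Random Forest classification) can be sketched end to end. YAMNet stands in here for the four models compared; the waveforms, labels, and pooling choice are placeholder assumptions rather than the paper's configuration.

```python
# Hedged sketch: pretrained-CNN embeddings -> UMAP -> Random Forest.
import numpy as np
import tensorflow_hub as hub
import umap
from sklearn.ensemble import RandomForestClassifier

yamnet = hub.load("https://tfhub.dev/google/yamnet/1")

def embed(waveform_16khz):
    """Mean-pool YAMNet's per-frame 1024-d embeddings over one call."""
    _, embeddings, _ = yamnet(waveform_16khz.astype(np.float32))
    return embeddings.numpy().mean(axis=0)

rng = np.random.default_rng(0)
calls = [rng.uniform(-1, 1, 3 * 16000) for _ in range(20)]  # 3-s clips, 16 kHz
labels = ["rumble"] * 10 + ["roar"] * 10                    # placeholder labels

X = np.stack([embed(w) for w in calls])                     # (20, 1024)
X2 = umap.UMAP(n_components=2, random_state=0).fit_transform(X)
clf = RandomForestClassifier(n_estimators=500).fit(X2, labels)
print(clf.score(X2, labels))                                # training fit only
```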
Advancing the mapping of vegetation structure in savannas using Sentinel‐1 imagery
IF 5.5 · CAS Q2 · Environmental Science & Ecology
Remote Sensing in Ecology and Conservation · Pub Date: 2025-04-22 · DOI: 10.1002/rse2.70006
Vera Thijssen, Marianthi Tangili, Ruth A. Howison, Han Olff
Abstract: Vegetation structure monitoring is important for the understanding and conservation of savanna ecosystems. Optical satellite imagery can be used to estimate canopy cover, but provides limited information about the structure of savannas, and is restricted to daytime and clear‐sky captures. Active remote sensing can potentially overcome this. We explore the utility of C‐band synthetic aperture radar imagery for mapping both grassland and woody vegetation structure in savannas. We calibrated Sentinel‐1 VH and VV backscatter coefficients and their ratio to ground‐based estimates of grass biomass, woody canopy volume (<50 000 m³/ha) and tree basal area (<15 m²/ha) in the Greater Serengeti‐Mara Ecosystem, and simultaneously explored their sensitivity to soil moisture. We show that one of these metrics in particular can be used to estimate grass biomass (R² = 0.54, RMSE = 630 kg/ha, %range = 20.6), woody canopy volume (R² = 0.69, RMSE = 4188 m³/ha, %range = 11.8) and tree basal area (R² = 0.44, RMSE = 2.03 m²/ha, %range = 18.6) in the dry season, allowing for the extrapolation to regional scale vegetation structure maps. We also introduce new proxies for soil moisture as an option for extending this approach to the wet season using the 90‐day preceding bounded running averages of the Climate Hazards Group InfraRed Precipitation with Station (CHIRPS) and the Multi‐satellitE Retrievals for Global Precipitation Measurement (IMERG) datasets. We discuss the potential of Sentinel‐1 imagery for better understanding of the spatio‐temporal dynamics of vegetation structure in savannas.
Citations: 0
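The calibration step amounts to a simple regression of field measurements on a backscatter metric. In the scraped abstract the backscatter symbols were lost, so which metric performed best is unclear; the sketch below uses the VH/VV ratio (a dB difference) as an illustrative choice, with fully synthetic data.

```python
# Placeholder sketch of the calibration step: field-measured grass
# biomass regressed on a Sentinel-1 backscatter metric.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
vh = rng.uniform(-25, -15, 80)                # VH backscatter (dB)
vv = rng.uniform(-18, -8, 80)                 # VV backscatter (dB)
ratio = (vh - vv).reshape(-1, 1)              # dB difference = log of ratio
biomass = 2000 + 120 * ratio.ravel() + rng.normal(0, 300, 80)  # synthetic

model = LinearRegression().fit(ratio, biomass)
print("calibration R^2:", round(model.score(ratio, biomass), 2))
```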
Object detection‐assisted workflow facilitates cryptic snake monitoring
IF 5.5 · CAS Q2 · Environmental Science & Ecology
Remote Sensing in Ecology and Conservation · Pub Date: 2025-04-21 · DOI: 10.1002/rse2.70009
Storm Miller, Michael Kirkland, Kristen M. Hart, Robert A. McCleery
Abstract: Camera traps are an important tool used to study rare and cryptic animals, including snakes. Time‐lapse photography can be particularly useful for studying snakes that often fail to trigger a camera's infrared motion sensor due to their ectothermic nature. However, the large datasets produced by time‐lapse photography require labor‐intensive classification, limiting their use in large‐scale studies. While many artificial intelligence‐based object detection models are effective at identifying mammals in images, their ability to detect snakes is unproven. Here, we used camera data to evaluate the efficacy of an object detection model to rapidly and accurately detect snakes. We classified images manually to the species level and compared this with a hybrid review workflow where the model removed blank images followed by a manual review. Using a ≥0.05 model confidence threshold, our hybrid review workflow correctly identified 94.5% of blank images, completed image classification 6× faster, and detected large (>66 cm) snakes as well as manual review. Conversely, the hybrid review method often failed to detect all instances of a snake in a string of images and detected fewer small (<66 cm) snakes than manual review. However, most relevant ecological information requires only a single detection in a sequence of images, and study design changes could likely improve the detection of smaller snakes. Our findings suggest that an object detection‐assisted hybrid workflow can greatly reduce time spent manually classifying data‐heavy time‐lapse snake studies and facilitate ecological monitoring for large snakes.
Citations: 0
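The hybrid workflow reduces to a thresholding step: the detector scores each time-lapse frame, frames with no detection at or above the paper's 0.05 confidence threshold are dropped as blank, and the rest go to manual review. A minimal sketch, with the detector-output format assumed:

```python
# Minimal sketch of the blank-filtering step in a hybrid review workflow.
def split_for_review(detections, threshold=0.05):
    """detections: {image_path: max detector confidence in that image}."""
    review = [p for p, c in detections.items() if c >= threshold]
    blanks = [p for p, c in detections.items() if c < threshold]
    return review, blanks

dets = {"img_0001.jpg": 0.72, "img_0002.jpg": 0.01, "img_0003.jpg": 0.06}
to_review, blank = split_for_review(dets)
print(f"{len(blank)} blanks removed; {len(to_review)} frames to review")
```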
Towards edge processing of images from insect camera traps
IF 5.5 · CAS Q2 · Environmental Science & Ecology
Remote Sensing in Ecology and Conservation · Pub Date: 2025-04-17 · DOI: 10.1002/rse2.70007
Kim Bjerge, Henrik Karstoft, Toke T. Høye
Abstract: Insects represent nearly half of all known multicellular species, but knowledge about them lags behind that of most vertebrate species. In part for this reason, they are often neglected in biodiversity conservation policies and practice. Computer vision tools, such as insect camera traps, for automated monitoring have the potential to revolutionize insect study and conservation. To further advance insect camera trapping and the analysis of their image data, effective image processing pipelines are needed. In this paper, we present a flexible and fast processing pipeline designed to analyse these recordings by detecting, tracking and classifying nocturnal insects in a broad taxonomy of 15 insect classes and resolution of individual moth species. A classifier with anomaly detection is proposed to filter dark, blurred or partially visible insects that will be uncertain to classify correctly. A simple track‐by‐detection algorithm is proposed to track classified insects by incorporating feature embeddings, distance and area cost. We evaluated the computational speed and power performance of different edge computing devices (Raspberry Pi's and NVIDIA Jetson Nano) and compared various time‐lapse (TL) strategies with tracking. Detections differed least between 2‐min TL intervals and tracking at 0.5 frames per second; however, for insects with fewer than one detection per night, the Pearson correlation decreases. Shifting from tracking to TL monitoring would reduce the number of recorded images and would allow for edge processing of images in real‐time on a camera trap with Raspberry Pi. The Jetson Nano is the most energy‐efficient solution, capable of real‐time tracking at nearly 0.5 fps. Our processing pipeline was applied to more than 5.7 million images recorded at 0.5 frames per second from 12 light camera traps during two full seasons located in diverse habitats, including bogs, heaths and forests. Our results thus show the scalability of insect camera traps.
Citations: 0
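A track-by-detection step of the kind described (combining feature-embedding, distance, and area costs) is typically solved as a bipartite assignment between consecutive frames. A hedged sketch follows; the cost weights, normalisation constants, and gating threshold are illustrative, not the paper's values.

```python
# Hedged sketch: cost-based matching of detections between two frames.
import numpy as np
from scipy.optimize import linear_sum_assignment

def match(prev, curr, w_emb=0.5, w_dist=0.3, w_area=0.2, max_cost=0.8):
    """prev/curr: lists of dicts with 'emb' (unit vector), 'xy', 'area'."""
    C = np.zeros((len(prev), len(curr)))
    for i, p in enumerate(prev):
        for j, c in enumerate(curr):
            cos = float(np.dot(p["emb"], c["emb"]))          # appearance
            dist = np.hypot(*np.subtract(p["xy"], c["xy"]))  # centre shift
            darea = abs(p["area"] - c["area"]) / max(p["area"], c["area"])
            C[i, j] = (w_emb * (1 - cos)
                       + w_dist * min(dist / 100.0, 1.0)     # px normalised
                       + w_area * darea)
    rows, cols = linear_sum_assignment(C)                    # Hungarian
    return [(i, j) for i, j in zip(rows, cols) if C[i, j] <= max_cost]

prev = [{"emb": np.array([1.0, 0.0]), "xy": (10.0, 10.0), "area": 50.0}]
curr = [{"emb": np.array([0.98, 0.2]) / np.linalg.norm([0.98, 0.2]),
         "xy": (12.0, 11.0), "area": 55.0}]
print(match(prev, curr))                                     # -> [(0, 0)]
```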
Application of computer vision for off‐highway vehicle route detection: A case study in Mojave desert tortoise habitat
IF 5.5 · CAS Q2 · Environmental Science & Ecology
Remote Sensing in Ecology and Conservation · Pub Date: 2025-04-07 · DOI: 10.1002/rse2.70004
Alexander J. Robillard, Madeline Standen, Noah Giebink, Mark Spangler, Amy C. Collins, Brian Folt, Andrew Maguire, Elissa M. Olimpi, Brett G. Dickson
Abstract: Driving off‐highway vehicles (OHVs), which contributes to habitat degradation and fragmentation, is a common recreational activity in the United States and other parts of the world, particularly in desert environments with fragile ecosystems. Although habitat degradation and mortality from the expansion of OHV networks are thought to have major impacts on desert species, comprehensive maps of OHV route networks and their changes over time are lacking. To better understand how OHV route networks have evolved in the Mojave Desert ecoregion, we developed a computer vision approach to estimate OHV route location and density across the range of the Mojave desert tortoise (Gopherus agassizii). We defined OHV routes as non‐paved, linear features, including designated routes and washes in the presence of non‐paved routes. Using contemporary (n = 1499) and historical (n = 1148) aerial images, we trained and validated three convolutional neural network (CNN) models. We cross‐examined each model on sets of independently curated data and selected the highest performing model to generate predictions across the tortoise's range. When evaluated against a 'hybrid' test set (n = 1807 images), the final hybrid model achieved an accuracy of 77%. We then applied our model to remotely sensed imagery from across the tortoise's range and generated spatial layers of OHV route density for the 1970s, 1980s, 2010s, and 2020s. We examined OHV route density within tortoise conservation areas (TCA) and recovery units (RU) within the range of the species. Results showed an increase in the OHV route density in both TCAs (8.45%) and RUs (7.85%) from 1980 to 2020. Ordinal logistic regression indicated a strong correlation (OR = 1.01, P < 0.001) between model outputs and ground‐truthed OHV maps from the study region. Our computer vision approach and mapped results can inform conservation strategies and management aimed at mitigating the adverse impacts of OHV activity on sensitive ecosystems.
Citations: 0
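The validation step the abstract mentions (ordinal logistic regression of ground-truthed route-density classes against model output, yielding OR = 1.01) can be sketched with statsmodels. The data, class breaks, and single-predictor setup below are placeholder assumptions.

```python
# Hedged sketch: ordinal logistic regression of ordinal ground-truth
# density classes on predicted OHV route density.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(2)
pred = pd.DataFrame({"pred": rng.uniform(0, 100, 300)})  # model density
truth = pd.Series(pd.cut(pred["pred"] + rng.normal(0, 20, 300),
                         bins=[-np.inf, 20, 60, np.inf],
                         labels=["low", "medium", "high"]))  # ordinal truth

res = OrderedModel(truth, pred, distr="logit").fit(method="bfgs", disp=False)
print("odds ratio per unit density:", float(np.exp(res.params["pred"])))
```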