A deep learning pipeline for time-lapse camera monitoring of insects and their floral environments

Impact Factor 5.8 · CAS Tier 2 (Environmental Science & Ecology) · JCR Q1 (Ecology)
Kim Bjerge, Henrik Karstoft, Hjalte M.R. Mann, Toke T. Høye
{"title":"A deep learning pipeline for time-lapse camera monitoring of insects and their floral environments","authors":"Kim Bjerge ,&nbsp;Henrik Karstoft ,&nbsp;Hjalte M.R. Mann ,&nbsp;Toke T. Høye","doi":"10.1016/j.ecoinf.2024.102861","DOIUrl":null,"url":null,"abstract":"<div><div>Arthropods, including insects, represent the most diverse group and contribute significantly to animal biomass. Automatic monitoring of insects and other arthropods enables quick and efficient observation and management of ecologically and economically important targets such as pollinators, natural enemies, disease vectors, and agricultural pests. The integration of cameras and computer vision facilitates innovative monitoring approaches for agriculture, ecology, entomology, evolution, and biodiversity. However, studying insects and their interactions with flowers and vegetation in natural environments remains challenging, even with automated camera monitoring.</div><div>This paper presents a comprehensive methodology to monitor abundance and diversity of arthropods in the wild and to quantify floral cover as a key resource. We apply the methods across more than 10 million images recorded over two years using 48 insect camera traps placed in three main habitat types. The cameras monitor arthropods, including insect visits, on a specific mix of <em>Sedum</em> plant species with white, yellow and red/pink colored of flowers. The proposed deep-learning pipeline estimates flower cover and detects and classifies arthropod taxa from time-lapse recordings. However, the flower cover serves only as an estimate to correlate insect activity with the flowering plants.Color and semantic segmentation with DeepLabv3 are combined to estimate the percent cover of flowers of different colors. Arthropod detection incorporates motion-informed enhanced images and object detection with You-Only-Look-Once (YOLO), followed by filtering stationary objects to minimize double counting of non-moving animals and erroneous background detections. This filtering approach has been demonstrated to significantly decrease the incidence of false positives, since arthropods, occur in less than 3% of the captured images.</div><div>The final step involves grouping arthropods into 19 taxonomic classes. Seven state-of-the-art models were trained and validated, achieving <span><math><mrow><mi>F</mi><mn>1</mn></mrow></math></span>-scores ranging from 0.81 to 0.89 in classification of arthropods. Among these, the final selected model, EfficientNetB4, achieved an 80% average precision on randomly selected samples when applied to the complete pipeline, which includes detection, filtering, and classification of arthropod images collected in 2021. As expected during the beginning and end of the season, reduced flower cover correlates with a noticeable drop in arthropod detections. 
The proposed method offers a cost-effective approach to monitoring diverse arthropod taxa and flower cover in natural environments using time-lapse camera recordings.</div></div>","PeriodicalId":51024,"journal":{"name":"Ecological Informatics","volume":null,"pages":null},"PeriodicalIF":5.8000,"publicationDate":"2024-10-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Ecological Informatics","FirstCategoryId":"93","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1574954124004035","RegionNum":2,"RegionCategory":"环境科学与生态学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ECOLOGY","Score":null,"Total":0}
引用次数: 0

Abstract

Arthropods, including insects, represent the most diverse group of animals and contribute significantly to animal biomass. Automatic monitoring of insects and other arthropods enables quick and efficient observation and management of ecologically and economically important targets such as pollinators, natural enemies, disease vectors, and agricultural pests. The integration of cameras and computer vision facilitates innovative monitoring approaches for agriculture, ecology, entomology, evolution, and biodiversity. However, studying insects and their interactions with flowers and vegetation in natural environments remains challenging, even with automated camera monitoring.
This paper presents a comprehensive methodology to monitor the abundance and diversity of arthropods in the wild and to quantify floral cover as a key resource. We apply the methods to more than 10 million images recorded over two years by 48 insect camera traps placed in three main habitat types. The cameras monitor arthropods, including insect visits, on a specific mix of Sedum plant species with white, yellow, and red/pink flowers. The proposed deep-learning pipeline estimates flower cover and detects and classifies arthropod taxa from the time-lapse recordings. The flower cover serves only as an estimate for correlating insect activity with the flowering plants. Color and semantic segmentation with DeepLabv3 are combined to estimate the percent cover of flowers of different colors. Arthropod detection combines motion-informed image enhancement with You-Only-Look-Once (YOLO) object detection, followed by filtering of stationary objects to minimize double counting of non-moving animals and erroneous background detections. This filtering step substantially reduces false positives, since arthropods occur in less than 3% of the captured images.
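To make these pipeline stages concrete, the following is a minimal, illustrative sketch rather than the authors' implementation. It assumes a DeepLabv3-style segmentation model has already produced a binary flower mask for each frame, and that a YOLO detector has already produced per-frame bounding boxes; the colour thresholds, IoU threshold, frame window, and function names are hypothetical placeholders, not values taken from the paper.

```python
# Illustrative sketch only: flower-cover estimation from a segmentation mask
# plus suppression of stationary detections across consecutive time-lapse frames.
import numpy as np

def percent_flower_cover(flower_mask: np.ndarray, hsv_image: np.ndarray) -> dict:
    """Estimate percent cover per flower colour from a binary segmentation mask.

    flower_mask: HxW boolean array, True where the segmentation model labelled
                 a pixel as flower.
    hsv_image:   HxWx3 uint8 array in HSV colour space (e.g. from OpenCV).
    Hue/saturation ranges below are illustrative, not the paper's values.
    """
    h, s, v = hsv_image[..., 0], hsv_image[..., 1], hsv_image[..., 2]
    total = flower_mask.size
    white = flower_mask & (s < 40) & (v > 180)
    yellow = flower_mask & (h >= 20) & (h <= 35) & (s >= 40)
    red_pink = flower_mask & ((h < 10) | (h > 160)) & (s >= 40)
    return {
        "white": 100.0 * white.sum() / total,
        "yellow": 100.0 * yellow.sum() / total,
        "red_pink": 100.0 * red_pink.sum() / total,
    }

def filter_stationary(detections_per_frame, iou_thresh=0.7, min_frames=3):
    """Drop detections that stay in (almost) the same place across consecutive
    time-lapse frames, to reduce double counting of non-moving objects and
    background false positives. Thresholds are illustrative.

    detections_per_frame: list (one entry per frame) of lists of
                          (x1, y1, x2, y2) bounding boxes.
    """
    def iou(a, b):
        ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
        ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
        inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
        area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
        union = area(a) + area(b) - inter
        return inter / union if union > 0 else 0.0

    kept = []
    for i, boxes in enumerate(detections_per_frame):
        moving = []
        for box in boxes:
            history = detections_per_frame[max(0, i - min_frames):i]
            stationary = len(history) >= min_frames and all(
                any(iou(box, prev) >= iou_thresh for prev in frame)
                for frame in history
            )
            if not stationary:
                moving.append(box)
        kept.append(moving)
    return kept
```

Checking overlap across a short window of preceding frames is one simple way to flag objects that do not move between time-lapse exposures; the paper's actual motion-informed enhancement and filtering may differ in detail.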
The final step groups arthropods into 19 taxonomic classes. Seven state-of-the-art models were trained and validated, achieving F1-scores ranging from 0.81 to 0.89 in arthropod classification. Among these, the finally selected model, EfficientNetB4, achieved an average precision of 80% on randomly selected samples when applied to the complete pipeline of detection, filtering, and classification of arthropod images collected in 2021. As expected, reduced flower cover at the beginning and end of the season correlates with a noticeable drop in arthropod detections. The proposed method offers a cost-effective approach to monitoring diverse arthropod taxa and flower cover in natural environments using time-lapse camera recordings.
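For the final classification stage, a minimal sketch could look as follows. It assumes the detector's crops are classified with an EfficientNet-B4 backbone whose ImageNet head has been replaced by a 19-way classifier; the weights path, class indexing, and preprocessing are assumptions for illustration, not details taken from the paper.

```python
# Minimal sketch of a 19-class arthropod classifier built on EfficientNet-B4.
import torch
from torch import nn
from torchvision import models, transforms

NUM_CLASSES = 19  # taxonomic groups, as in the paper

def build_classifier(weights_path=None) -> nn.Module:
    """Create an EfficientNet-B4 with a 19-way head; optionally load fine-tuned weights."""
    model = models.efficientnet_b4(weights=None)
    in_features = model.classifier[1].in_features
    model.classifier[1] = nn.Linear(in_features, NUM_CLASSES)
    if weights_path is not None:  # hypothetical path to fine-tuned weights
        model.load_state_dict(torch.load(weights_path, map_location="cpu"))
    return model.eval()

preprocess = transforms.Compose([
    transforms.Resize((380, 380)),  # EfficientNet-B4 input resolution
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def classify_crop(model: nn.Module, crop) -> int:
    """Return the predicted class index for one detected arthropod crop (a PIL image)."""
    x = preprocess(crop).unsqueeze(0)  # add batch dimension
    logits = model(x)
    return int(logits.argmax(dim=1).item())
```

In practice, each bounding box that survives the stationary-object filter would be cropped from the full frame and passed through such a classifier, with per-class counts aggregated per camera and recording day.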
Source journal: Ecological Informatics (Environmental Science - Ecology)
CiteScore: 8.30
Self-citation rate: 11.80%
Publication volume: 346
Review time: 46 days
Journal description: The journal Ecological Informatics is devoted to the publication of high quality, peer-reviewed articles on all aspects of computational ecology, data science and biogeography. The scope of the journal takes into account the data-intensive nature of ecology, the growing capacity of information technology to access, harness and leverage complex data as well as the critical need for informing sustainable management in view of global environmental and climate change. The nature of the journal is interdisciplinary at the crossover between ecology and informatics. It focuses on novel concepts and techniques for image- and genome-based monitoring and interpretation, sensor- and multimedia-based data acquisition, internet-based data archiving and sharing, data assimilation, modelling and prediction of ecological data.