Highly precise community science annotations of video camera‐trapped fauna in challenging environments
Mimi Arandjelovic, Colleen R. Stephens, Paula Dieguez, Nuria Maldonado, Gaëlle Bocksberger, Marie‐Lyne Després‐Einspenner, Benjamin Debetencourt, Vittoria Estienne, Ammie K. Kalan, Maureen S. McCarthy, Anne‐Céline Granjon, Veronika Städele, Briana Harder, Lucia Hacker, Anja Landsmann, Laura K. Lynn, Heidi Pfund, Zuzana Ročkaiová, Kristeena Sigler, Jane Widness, Heike Wilken, Antonio Buzharevski, Adeelia S. Goffe, Kristin Havercamp, Lydia L. Luncz, Giulia Sirianni, Erin G. Wessling, Roman M. Wittig, Christophe Boesch, Hjalmar S. Kühl
Remote Sensing in Ecology and Conservation, volume 17 (JCR Q1, Ecology; impact factor 3.9). Published 2024-06-25. DOI: 10.1002/rse2.402 (https://doi.org/10.1002/rse2.402).
Citations: 0
Abstract
As camera trapping grows in popularity and application, some analytical limitations persist, including the processing time and accuracy of data annotation. Camera traps typically record still images, although videos are increasingly being collected even though they require much more time to annotate. To overcome the limitations of image annotation, camera trap studies are increasingly linked to community science (CS) platforms. Here, we extend previous work on CS image annotations to camera trap videos from a challenging environment: a dense tropical forest with low visibility and high occlusion due to thick canopy cover and bushy undergrowth at camera level. Using the CS platform Chimp&See, established for classification of 599 956 video clips from tropical Africa, we assess annotation precision and accuracy by comparing the classification of 13 531 1‐min video clips by a professional ecologist (PE) with the output from 1744 registered, as well as unregistered, Chimp&See community scientists. We considered 29 classification categories, comprising 17 species and 12 higher‐level categories in which phenotypically similar species were grouped. Overall, annotation precision was 95.4%, which increased to 98.2% when phenotypically similar species were aggregated into groups. Our findings demonstrate the competence of community scientists working with camera trap videos from even challenging environments and hold great promise for future studies on animal behaviour, species interaction dynamics and population monitoring.
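The abstract does not spell out how agreement between community and expert annotations is scored, so the sketch below illustrates one plausible approach: take the majority (consensus) label among community annotations for each clip, compare it with the PE label, and recompute agreement after collapsing phenotypically similar species into coarser groups. The category names, the group mapping, and the majority-vote consensus rule are illustrative assumptions, not the study's actual pipeline.

```python
# A minimal sketch (not the authors' pipeline) of scoring per-clip community
# consensus labels against expert (PE) labels, before and after aggregating
# phenotypically similar species into higher-level groups.
from collections import Counter

# Hypothetical species-to-group mapping; labels not listed here are kept
# unchanged when aggregating. These pairings are illustrative only.
SPECIES_TO_GROUP = {
    "red_duiker": "duiker",
    "blue_duiker": "duiker",
    "guenon_sp": "small_monkey",
}

def consensus(labels):
    """Majority label among community annotations for one clip."""
    return Counter(labels).most_common(1)[0][0]

def precision(cs_annotations, expert_labels, aggregate=False):
    """Fraction of clips where the community consensus matches the expert.

    cs_annotations: dict clip_id -> list of community labels
    expert_labels:  dict clip_id -> expert (PE) label
    """
    def norm(label):
        return SPECIES_TO_GROUP.get(label, label) if aggregate else label

    matches = sum(
        norm(consensus(cs_annotations[clip])) == norm(expert)
        for clip, expert in expert_labels.items()
    )
    return matches / len(expert_labels)

# Toy example with two clips: strict species-level agreement is 0.5,
# but rises to 1.0 once the two duiker species are grouped together.
cs = {"clip_1": ["red_duiker", "red_duiker", "blue_duiker"],
      "clip_2": ["guenon_sp", "guenon_sp"]}
pe = {"clip_1": "blue_duiker", "clip_2": "guenon_sp"}
print(precision(cs, pe))                  # species-level agreement
print(precision(cs, pe, aggregate=True))  # agreement after grouping
```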
Journal description:
Remote Sensing in Ecology and Conservation provides a forum for rapid, peer-reviewed publication of novel, multidisciplinary research at the interface between remote sensing science and ecology and conservation. The journal prioritizes findings that advance the scientific basis of ecology and conservation, promoting the development of remote-sensing based methods relevant to the management of land use and biological systems at all levels, from populations and species to ecosystems and biomes. The journal defines remote sensing in its broadest sense, including data acquisition by hand-held and fixed ground-based sensors, such as camera traps and acoustic recorders, and sensors on airplanes and satellites. The journal's intended audience includes ecologists, conservation scientists, policy makers, managers of terrestrial and aquatic systems, remote sensing scientists, and students.
Remote Sensing in Ecology and Conservation is a fully open access journal from Wiley and the Zoological Society of London. Remote sensing has enormous potential to provide information on the state of, and pressures on, biological diversity and ecosystem services, at multiple spatial and temporal scales. This publication provides a forum for multidisciplinary research in remote sensing science, ecological research and conservation science.