{"title":"可视化引出知识来完善公民科学技术设计;频谱图与观鸟者产生共鸣","authors":"Jessica L. Oliver, M. Brereton, D. Watson, P. Roe","doi":"10.1145/3292147.3292171","DOIUrl":null,"url":null,"abstract":"Acoustic sensors offer a promising new tool to detect furtive animals; however, sifting through years of audio data is fraught with challenges. Developing automatic detection software still requires a large dataset of calls that have been accurately annotated by experts. Few studies have explored how people identify species by vocalisations in the wild, and how this skill can be applied to designing technologies for locating and identifying calls in recordings. To explore how birders often find and identify animals by calls and share their observations, we conducted qualitative interviews and a visualization-review activity with nine birders, eliciting insight into their existing practices, knowledge, and visualisation interpretation. We found that visualisations evoked memories demonstrating birder expertise on the natural history, behaviours, and habitats of birds. Birders were curious and learned from exploring the abstract patterns in visualisations of acoustic data, relying on past experiences with nature to interpret acoustic visualisations. Birders often wanted to corroborate findings with other birders by reviewing acoustic recordings and local bird lists. This study demonstrates how qualitative review of visualisations can elicit a nuanced understanding of community practices, knowledge, and sensemaking, which are essential to improve design of future technologies.","PeriodicalId":309502,"journal":{"name":"Proceedings of the 30th Australian Conference on Computer-Human Interaction","volume":"117 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-12-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"10","resultStr":"{\"title\":\"Visualisations elicit knowledge to refine citizen science technology design: spectrograms resonate with birders\",\"authors\":\"Jessica L. Oliver, M. Brereton, D. Watson, P. Roe\",\"doi\":\"10.1145/3292147.3292171\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Acoustic sensors offer a promising new tool to detect furtive animals; however, sifting through years of audio data is fraught with challenges. Developing automatic detection software still requires a large dataset of calls that have been accurately annotated by experts. Few studies have explored how people identify species by vocalisations in the wild, and how this skill can be applied to designing technologies for locating and identifying calls in recordings. To explore how birders often find and identify animals by calls and share their observations, we conducted qualitative interviews and a visualization-review activity with nine birders, eliciting insight into their existing practices, knowledge, and visualisation interpretation. We found that visualisations evoked memories demonstrating birder expertise on the natural history, behaviours, and habitats of birds. Birders were curious and learned from exploring the abstract patterns in visualisations of acoustic data, relying on past experiences with nature to interpret acoustic visualisations. Birders often wanted to corroborate findings with other birders by reviewing acoustic recordings and local bird lists. 
This study demonstrates how qualitative review of visualisations can elicit a nuanced understanding of community practices, knowledge, and sensemaking, which are essential to improve design of future technologies.\",\"PeriodicalId\":309502,\"journal\":{\"name\":\"Proceedings of the 30th Australian Conference on Computer-Human Interaction\",\"volume\":\"117 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-12-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"10\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 30th Australian Conference on Computer-Human Interaction\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3292147.3292171\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 30th Australian Conference on Computer-Human Interaction","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3292147.3292171","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Visualisations elicit knowledge to refine citizen science technology design: spectrograms resonate with birders
Acoustic sensors offer a promising new tool to detect furtive animals; however, sifting through years of audio data is fraught with challenges. Developing automatic detection software still requires a large dataset of calls that have been accurately annotated by experts. Few studies have explored how people identify species by vocalisations in the wild, and how this skill can be applied to designing technologies for locating and identifying calls in recordings. To explore how birders find and identify animals by their calls and share their observations, we conducted qualitative interviews and a visualisation-review activity with nine birders, eliciting insight into their existing practices, knowledge, and visualisation interpretation. We found that visualisations evoked memories demonstrating birder expertise on the natural history, behaviours, and habitats of birds. Birders were curious and learned from exploring the abstract patterns in visualisations of acoustic data, relying on past experiences with nature to interpret acoustic visualisations. Birders often wanted to corroborate findings with other birders by reviewing acoustic recordings and local bird lists. This study demonstrates how qualitative review of visualisations can elicit a nuanced understanding of community practices, knowledge, and sensemaking, which are essential to improving the design of future technologies.
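For context, the kind of acoustic visualisation discussed here, a spectrogram of a field recording, can be produced with standard signal-processing tools. The sketch below is illustrative only and is not code from the paper; the file name "recording.wav" and all parameter choices are assumptions for the example.

```python
# Minimal sketch: render an audio recording as a spectrogram,
# the kind of time-frequency visualisation birders reviewed in this study.
# "recording.wav" is a hypothetical placeholder file name.
import numpy as np
import matplotlib.pyplot as plt
from scipy.io import wavfile
from scipy.signal import spectrogram

sample_rate, audio = wavfile.read("recording.wav")  # load PCM audio
if audio.ndim > 1:
    audio = audio[:, 0]  # keep one channel if the recording is stereo

# freqs: frequency bins (Hz), times: segment centres (s), power: spectral power matrix
freqs, times, power = spectrogram(audio, fs=sample_rate, nperseg=1024)

# Plot power in decibels; the small offset avoids log of zero in silent segments
plt.pcolormesh(times, freqs, 10 * np.log10(power + 1e-12), shading="auto")
plt.xlabel("Time (s)")
plt.ylabel("Frequency (Hz)")
plt.title("Spectrogram of acoustic recording")
plt.show()
```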