Arthur Paté, Gaspard Farge, Benjamin K. Holtzman, Anna C. Barth, Piero Poli, Lapo Boschi, Leif Karlstrom
{"title":"结合声音和视觉显示,突出时间和空间的地震模式","authors":"Arthur Paté, Gaspard Farge, Benjamin K. Holtzman, Anna C. Barth, Piero Poli, Lapo Boschi, Leif Karlstrom","doi":"10.1007/s12193-021-00378-8","DOIUrl":null,"url":null,"abstract":"<p>Data visualization, and to a lesser extent data sonification, are classic tools to the scientific community. However, these two approaches are very rarely combined, although they are highly complementary: our visual system is good at recognizing spatial patterns, whereas our auditory system is better tuned for temporal patterns. In this article, data representation methods are proposed that combine visualization, sonification, and spatial audio techniques, in order to optimize the user’s perception of spatial and temporal patterns in a single display, to increase the feeling of immersion, and to take advantage of multimodal integration mechanisms. Three seismic data sets are used to illustrate the methods, covering different physical phenomena, time scales, spatial distributions, and spatio-temporal dynamics. The methods are adapted to the specificities of each data set, and to the amount of information that the designer wants to display. This leads to further developments, namely the use of audification with two time scales, the switch from pure audification to time-modulated noise, and the switch from pure audification to sonic icons. First user feedback from live demonstrations indicates that the methods presented in this article seem to enhance the perception of spatio-temporal patterns, which is a key parameter to the understanding of seismically active systems, and a step towards apprehending the processes that drive this activity.\n</p>","PeriodicalId":17529,"journal":{"name":"Journal on Multimodal User Interfaces","volume":"45 1","pages":""},"PeriodicalIF":2.2000,"publicationDate":"2021-07-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Combining audio and visual displays to highlight temporal and spatial seismic patterns\",\"authors\":\"Arthur Paté, Gaspard Farge, Benjamin K. Holtzman, Anna C. Barth, Piero Poli, Lapo Boschi, Leif Karlstrom\",\"doi\":\"10.1007/s12193-021-00378-8\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Data visualization, and to a lesser extent data sonification, are classic tools to the scientific community. However, these two approaches are very rarely combined, although they are highly complementary: our visual system is good at recognizing spatial patterns, whereas our auditory system is better tuned for temporal patterns. In this article, data representation methods are proposed that combine visualization, sonification, and spatial audio techniques, in order to optimize the user’s perception of spatial and temporal patterns in a single display, to increase the feeling of immersion, and to take advantage of multimodal integration mechanisms. Three seismic data sets are used to illustrate the methods, covering different physical phenomena, time scales, spatial distributions, and spatio-temporal dynamics. The methods are adapted to the specificities of each data set, and to the amount of information that the designer wants to display. This leads to further developments, namely the use of audification with two time scales, the switch from pure audification to time-modulated noise, and the switch from pure audification to sonic icons. 
First user feedback from live demonstrations indicates that the methods presented in this article seem to enhance the perception of spatio-temporal patterns, which is a key parameter to the understanding of seismically active systems, and a step towards apprehending the processes that drive this activity.\\n</p>\",\"PeriodicalId\":17529,\"journal\":{\"name\":\"Journal on Multimodal User Interfaces\",\"volume\":\"45 1\",\"pages\":\"\"},\"PeriodicalIF\":2.2000,\"publicationDate\":\"2021-07-27\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal on Multimodal User Interfaces\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.1007/s12193-021-00378-8\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal on Multimodal User Interfaces","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1007/s12193-021-00378-8","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Combining audio and visual displays to highlight temporal and spatial seismic patterns
Data visualization, and to a lesser extent data sonification, are classic tools for the scientific community. However, these two approaches are very rarely combined, although they are highly complementary: our visual system is good at recognizing spatial patterns, whereas our auditory system is better tuned to temporal patterns. In this article, data representation methods are proposed that combine visualization, sonification, and spatial audio techniques, in order to optimize the user’s perception of spatial and temporal patterns in a single display, to increase the feeling of immersion, and to take advantage of multimodal integration mechanisms. Three seismic data sets are used to illustrate the methods, covering different physical phenomena, time scales, spatial distributions, and spatio-temporal dynamics. The methods are adapted to the specificities of each data set and to the amount of information that the designer wants to display. This leads to further developments, namely the use of audification with two time scales, the switch from pure audification to time-modulated noise, and the switch from pure audification to sonic icons. First user feedback from live demonstrations indicates that the methods presented in this article seem to enhance the perception of spatio-temporal patterns, which is key to understanding seismically active systems and a step towards apprehending the processes that drive this activity.
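As a concrete illustration (not taken from the paper itself), the sketch below shows, under simple assumptions, two of the rendering strategies the abstract names: plain audification by time compression, which shifts low-frequency ground motion into the audible band, and time-modulated noise, which keeps only the temporal envelope of the activity. The function names, the 100 Hz example sampling rate, and the speed-up factor of 441 are illustrative choices, not the authors' parameters.

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import hilbert

FS_AUDIO = 44100  # standard audio playback rate (Hz)

def audify(trace, fs_seis, speedup=441, out="audification.wav"):
    """Audification by time compression: reinterpret the seismic samples at a
    much higher rate, so that e.g. a 100 Hz trace sped up by 441 maps
    0.1-10 Hz ground motion into the audible 44-4410 Hz band."""
    x = np.asarray(trace, dtype=np.float64)
    x = x / np.max(np.abs(x))                    # normalize (assumes a non-zero trace)
    fs_compressed = fs_seis * speedup            # nominal rate after compression
    t_in = np.arange(len(x)) / fs_compressed     # compressed time axis
    t_out = np.arange(0.0, t_in[-1], 1.0 / FS_AUDIO)
    y = np.interp(t_out, t_in, x)                # resample onto the audio grid
    wavfile.write(out, FS_AUDIO, y.astype(np.float32))

def noise_modulated(trace, fs_seis, duration=10.0, out="noise_modulated.wav"):
    """Time-modulated noise: white noise whose amplitude follows the envelope
    of the trace, stretched to `duration` seconds of audio. The temporal
    pattern of activity is kept; the waveform itself is discarded."""
    x = np.asarray(trace, dtype=np.float64)
    env = np.abs(hilbert(x))                     # amplitude envelope
    env = env / np.max(env)
    n_out = int(duration * FS_AUDIO)
    env_audio = np.interp(np.linspace(0.0, len(env) - 1, n_out),
                          np.arange(len(env)), env)
    y = np.random.randn(n_out) * env_audio
    y = 0.9 * y / np.max(np.abs(y))              # leave a little headroom
    wavfile.write(out, FS_AUDIO, y.astype(np.float32))

if __name__ == "__main__":
    # Toy stand-in for a seismic record: background noise plus one decaying burst.
    fs = 100.0                                   # typical broadband sampling rate
    t = np.arange(0, 600, 1.0 / fs)              # ten minutes of data
    trace = 0.05 * np.random.randn(t.size)
    burst = (t > 200) & (t < 230)
    trace[burst] += np.sin(2 * np.pi * 2.0 * t[burst]) * np.exp(-(t[burst] - 200) / 8.0)
    audify(trace, fs)
    noise_modulated(trace, fs)
```

A spatial-audio layer, for example panning each station's stream according to source or receiver position, could be added on top of either rendering; it is omitted from this sketch.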
About the journal:
The Journal on Multimodal User Interfaces publishes work on the design, implementation, and evaluation of multimodal interfaces. Research on multimodal interaction is by its very nature multidisciplinary, involving several fields including signal processing, human-machine interaction, computer science, cognitive science, and ergonomics. The journal focuses on multimodal interfaces involving advanced modalities, the combination and fusion of several modalities, user-centric design, usability, and architectural considerations. Use cases and descriptions of specific application areas are welcome, including, for example, e-learning, assistance, serious games, affective and social computing, and interaction with avatars and robots.