Proceedings of the 24th International Conference on Auditory Display (ICAD 2018), June 2018

Data-Driven Sonification of CFD Aneurysm Models
D. E. Macdonald, T. Natarajan, Richard C. Windeyer, P. Coppin, D. Steinman. DOI: 10.21785/ICAD2018.010
Abstract: A novel method is presented for inspecting and characterizing turbulent-like hemodynamic structures in intracranial cerebral aneurysms by sonification of data generated using computational fluid dynamics (CFD). The intention of the current research is to intuitively communicate flow complexity by augmenting conventional flow visualizations with data-driven sound, thereby making dense spatiotemporal data easier to interpret through multimodal presentation. The described implementation allows the user to listen to flow fluctuations thought to indicate turbulent-like blood flow patterns that are often visually difficult to discriminate in conventional flow visualizations.

Seeking a Reference Frame for Cartographic Sonification
Megen E. Brittell. DOI: 10.21785/ICAD2018.020
Abstract: Sonification of geospatial data must situate data values in two- (or three-) dimensional space. The need to position data values in space distinguishes geospatial data from other multi-dimensional data sets. While cartographers have extensive experience preparing geospatial data for visual display, the use of sonification is less common. Beyond the availability of tools or visual bias, an incomplete understanding of the implications of parameter mappings that cross conceptual data categories limits the application of sonification to geospatial data. To catalyze the use of audio in cartography, this paper explores existing examples of parameter-mapping sonification through the framework of the geographic data cube. More widespread adoption of auditory displays would diversify map design techniques, enhance the accessibility of geospatial data, and may also provide new perspectives for application to non-geospatial data sets.

Planethesizer: Approaching Exoplanet Sonification
Adrián García Riber. DOI: 10.21785/ICAD2018.008
Abstract: The creation of simulations, sounds and images based on information about an object of investigation is now a practical tool used in many areas to bring the non-specialized public closer to scientific achievements and discoveries. In this context of multimodal representations and simulations developed for educational and informational purposes, this work builds a bridge between virtual musical instrument development and physical models, using the gravitational laws governing the seven planets orbiting the Trappist-1 star. It presents a case study of an interdisciplinary conversion-algorithm design that relates musical software synthesis to exoplanet astronomical data, measured from the observed flux variations in the light curves of the host star, and suggests a systematic and reproducible method useful for any other planetary system or model-based virtual instrument design. As a result, the virtual interactive synthesizer prototype Planethesizer is presented, whose default configurations provide multimodal simulations of the Trappist-1, Kepler-444 and K2-72 planetary systems.

"Musical Exercise" for People with Visual Impairments: A Preliminary Study with the Blindfolded
R. Khan, M. Jeon, Tejin Yoon. DOI: 10.21785/ICAD2018.030
Abstract: Performing independent physical exercise is critical to maintaining good health, but it is especially hard for people with visual impairments. To address this problem, we developed a Musical Exercise platform for people with visual impairments so that they can exercise with consistently good form. We designed six conditions: blindfolded or sighted without audio, and blindfolded or sighted with one of two types of audio feedback (continuous vs. discrete). Eighteen sighted participants took part in the experiment, performing two exercises, squat and wall sit, under all six conditions. The results show that Musical Exercise is a usable exercise assistance system with no adverse effect on exercise completion time or perceived workload. They also show that with a specific sound design (i.e., discrete feedback), participants in the blindfolded condition can exercise as consistently as participants in the non-blindfolded condition. This implies that not all sounds work equally well, and care is therefore required in refining auditory displays. The potential and limitations of Musical Exercise and directions for future work are discussed in light of these results.
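The continuous vs. discrete feedback distinction at the heart of the study can be sketched minimally. The joint-angle range, checkpoint values, tolerance, and frequency mapping below are hypothetical illustrations, not the authors' implementation:

```python
# Hypothetical sketch of the two audio-feedback styles compared in the study:
# continuous feedback tracks the joint angle at every sample, while discrete
# feedback emits a cue only when the movement reaches target checkpoints.
# All angle ranges and the pitch mapping are illustrative assumptions.

def continuous_feedback(angle_deg, lo=90.0, hi=170.0):
    """Map a knee angle to a pitch in 220-440 Hz, clamped to the target range."""
    frac = min(1.0, max(0.0, (angle_deg - lo) / (hi - lo)))
    return 220.0 + 220.0 * frac

def discrete_feedback(angle_deg, checkpoints=(90.0, 130.0, 170.0), tol=5.0):
    """Return the index of a nearby checkpoint (one cue per stage), else None."""
    for i, c in enumerate(checkpoints):
        if abs(angle_deg - c) <= tol:
            return i
    return None

print(continuous_feedback(130.0))  # 330.0, mid-range angle -> mid-range pitch
print(discrete_feedback(128.0))    # 1, within 5 degrees of the 130-degree checkpoint
```

Under this reading, the study's finding that discrete feedback supported more consistent blindfolded exercise suggests sparse, event-like cues can outperform a continuously varying tone for pacing repetitions.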

Sonification and Science Pedagogy: Preliminary Experiences and Assessments of Earth Science Data Presented in an Undergraduate General Education Course
M. Ballora, C. Roman, R. Pockalny, K. Wishner. DOI: 10.21785/ICAD2018.004
Abstract: This paper describes preliminary investigations into how sonifications of scientific graphs are perceived by undergraduate students in an introductory oceanography course at the University of Rhode Island. The goal is to gather data that can help gauge students' levels of engagement with sonification as a component of science education. The results, while preliminary, show promise that sonified graphs improve understanding, especially when they are presented in combination with visual graphs.

Auditory Displays of Electric Power Grids
P. Cowden, L. Dosiek. DOI: 10.21785/ICAD2018.013
Abstract: This paper presents auditory displays of power grid voltage. Because of the constantly changing energy demands on a power system, the voltage varies slightly about nominal, e.g., 120±2 V at 60±0.04 Hz. These variations are small enough that any audible effects, such as transformer hum, appear to have constant volume and pitch. Here, an audification technique is derived that amplifies the voltage variations and shifts the nominal frequency from 60 Hz to a common musical note. Sonification techniques are presented that map the voltage magnitude and frequency to MIDI velocity and pitch, and create a sampler trigger from frequency deviation. Several examples, including audio samples, are given under a variety of power system conditions. These results culminate in a multi-instrument track generated from the sonification of time-synchronized, geographically widespread power grid measurements. In addition, an inexpensive Arduino-based device is detailed that allows real-time sonification of wall outlet voltage.
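The parameter mapping described in the abstract (voltage magnitude to MIDI velocity, frequency deviation to pitch) can be sketched as follows. The base note, semitone span, and clamping behavior are illustrative assumptions, not the authors' implementation; only the nominal values and deviation ranges come from the abstract:

```python
# Hedged sketch of a voltage-to-MIDI parameter mapping (not the paper's code).
# Assumes per-cycle estimates of RMS voltage and frequency are available; the
# base note, semitone span, and clamping choices are illustrative.

NOMINAL_V, DELTA_V = 120.0, 2.0     # 120 +/- 2 V (from the abstract)
NOMINAL_F, DELTA_F = 60.0, 0.04    # 60 +/- 0.04 Hz (from the abstract)
BASE_NOTE = 57                      # MIDI A3, one possible "common musical note"
SEMITONE_SPAN = 12                  # map +/- DELTA_F onto +/- one octave

def clamp(x, lo, hi):
    return max(lo, min(hi, x))

def to_midi(voltage_rms, frequency_hz):
    """Map voltage magnitude to MIDI velocity and frequency deviation to pitch."""
    v_norm = clamp((voltage_rms - NOMINAL_V) / DELTA_V, -1.0, 1.0)   # -1..1
    f_norm = clamp((frequency_hz - NOMINAL_F) / DELTA_F, -1.0, 1.0)  # -1..1
    velocity = round(64 + 63 * v_norm)                 # nominal voltage -> 64
    pitch = round(BASE_NOTE + SEMITONE_SPAN * f_norm)  # nominal 60 Hz -> A3
    return pitch, velocity

print(to_midi(120.0, 60.0))   # (57, 64): nominal conditions
print(to_midi(118.5, 59.97))  # (48, 17): frequency dip lowers pitch, sag softens it
```

The clamp keeps large disturbances within the MIDI ranges; a real display might instead trigger the sampler mentioned in the abstract when the deviation saturates.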

Wave Space Sonification
T. Hermann. DOI: 10.21785/ICAD2018.026
Abstract: This paper introduces Wave Space Sonification (WSS), a novel class of sonification techniques for time- (or space-) indexed data. WSS does not fall into the classes of audification, parameter-mapping sonification or model-based sonification, and thus constitutes a novel class of techniques. It realizes a different link between data and their auditory representation by scanning a scalar field, defined as the wave space, along a data-driven trajectory. This allows both highly controlled definition of the auditory representation for any area of interest, and subtle yet acoustically complex sound variations as the overall pattern changes. To illustrate WSS, we introduce three instances, (i) static canonical WSS, (ii) data-driven localized WSS and (iii) granular wave space sonification (GWSS), and demonstrate the different methods with sonification examples from various data domains. We discuss the technique and its relation to other sonification approaches, and finally outline productive application areas.
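The core mechanism, reading a scalar field (the wave space) along a trajectory steered by the data, can be illustrated with a minimal sketch. The particular field, sampling rate, and trajectory mapping below are illustrative assumptions and do not correspond to any of the paper's three WSS instances:

```python
import numpy as np

# Minimal sketch of the wave-space idea: the output signal is a scalar field
# sampled along a data-driven trajectory. The field and the mapping from data
# to trajectory are illustrative assumptions, not the paper's definitions.

def wave_space(z):
    """A 1-D scalar field: a fast spatial oscillation under a slow envelope."""
    return np.sin(2 * np.pi * 220 * z) * (1 + 0.5 * np.cos(2 * np.pi * 2 * z))

def wss_signal(data, sr=8000, dur_per_point=0.05):
    """Scan the field along a trajectory interpolated through the data."""
    n = int(sr * dur_per_point * len(data))
    t = np.linspace(0.0, 1.0, n)
    # Data-driven trajectory: position in wave space follows the normalized data.
    traj = np.interp(t, np.linspace(0.0, 1.0, len(data)), data)
    traj = (traj - traj.min()) / (np.ptp(traj) or 1.0)   # normalize to [0, 1]
    # Base scan through the field plus a data-driven deflection of the path.
    return wave_space(t + 0.1 * traj)

sig = wss_signal(np.array([0.0, 1.0, 0.3, 0.8, 0.2]))
```

Even in this toy form, the appeal is visible: identical data excursions produce different timbral detail depending on where the trajectory lands in the field, which is what lets WSS assign controlled sound characters to areas of interest.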

Using a Multimodal Immersive Environment to Investigate Perceptions in Augmented Virtual Reality Systems
Samuel Chabot, Wendy Lee, Rebecca Elder, J. Braasch. DOI: 10.21785/ICAD2018.014
Abstract: The Collaborative-Research Augmented Immersive Virtual Environment Laboratory (CRAIVE-Lab) at Rensselaer is a state-of-the-art space that offers users multimodality and immersion. Realistic and abstract data sets can be explored in a variety of ways, even in large group settings. This paper discusses the motivations for the immersive experience and its advantages over smaller-scale, single-modality presentations of data. One experiment focuses on the influence of immersion on the perception of architectural renderings; its findings suggest disparities in participants' judgments when viewing two-dimensional printouts versus the immersive CRAIVE-Lab screen. The advantages of multimodality are examined in an experiment on abstract data exploration, in which various auditory cues for aiding visual data extraction were tested for their effects on participants' speed and accuracy. Finally, artificially generated auralizations are paired with recreations of realistic spaces to analyze the influence of immersive visuals on the perception of sound fields. One method used for creating these sound fields is a geometric ray-tracing model, which calculates the auditory stream of each individual loudspeaker in the lab to create a cohesive sound-field representation of the visual space.

Recognizability and Perceived Urgency of Bicycle Bells
Lisa Frohmann, Marian Weger, Robert Höldrich. DOI: 10.21785/ICAD2018.025
Abstract: Raising awareness of how alarm sounds are perceived and evaluated by individuals in traffic is important both for developing new alarm designs and for improving existing ones. Cyclists and pedestrians in particular can benefit from appropriate bell and horn sounds that contribute to road safety. Primarily, an alarm signal should evoke a precise idea of the source of the warning and the desired reaction to it. Furthermore, it should not be masked by other noises and thereby go undetected. Finally, an appropriate warning signal should convey the urgency of a given situation while not causing other road users and pedestrians to startle. In two listening experiments, we examined the perception of commonly available bicycle bells and horns. A free-identification task investigated average typicality, i.e., recognizability as a bicycle bell among other everyday sounds. In a second experiment, we tested the perceived urgency of the warning sounds in relation to traffic noise. The article further provides a survey of non-verbal alarm design and an analysis of the acoustic properties of common bicycle bells and horns, and a linear regression model relates these properties to perceived urgency. Our intention is to give insight into the often unattended but important issue of how auditory warning sounds are perceived in our everyday acoustic environment.

Sonification: A Prehistory
David Worrall. DOI: 10.21785/ICAD2018.019
Abstract: The idea that sound can convey information predates the modern era, and certainly the computational present. Data sonification can be broadly described as the creation, study and use of non-speech aural representations to convey information. As a field of contemporary enquiry and practice, data sonification is young, interdisciplinary and evolving, existing in parallel with the field of data visualization. Drawing on older practices such as auditing, and the use of information messaging in music, this paper provides a historical understanding of how sound and its representational deployment in communicating information have changed. In doing so, it aims to encourage critical awareness of some of the sociocultural as well as technical assumptions often adopted in sonifying data, especially those developed in the context of Western music of the last half-century or so.