{"title":"CardioSounds: Real-time Auditory Assistance for Supporting Cardiac Diagnostic and Monitoring","authors":"Andrea Lorena Aldana Blanco, S. Grautoff, T. Hermann","doi":"10.1145/3123514.3123542","DOIUrl":"https://doi.org/10.1145/3123514.3123542","url":null,"abstract":"This paper presents a real-time sonification system for Electrocardiography (ECG) monitoring and diagnosis. We introduce two novel sonification designs: (a) Auditory magnification loupe, a method to sonify important beat-to-beat variations during sports activities, and (b) ST-segment water ambience sonification, which aims to assist clinicians in the diagnostic process by building a soundscape that exhibits ECG signal abnormalities as the analysed signal deviates from a healthy ECG. The proposed methods were designed to help users unobtrusively monitor their own (or their patients') heart signal in situations where a visual-only representation is not convenient for carrying out a given task. Using CardioSounds, users receive auditory feedback in order to monitor important heart rhythm disturbances (e.g. arrhythmia) or pathologies due to a blockage of the heart's vessels.","PeriodicalId":282371,"journal":{"name":"Proceedings of the 12th International Audio Mostly Conference on Augmented and Participatory Sound and Music Experiences","volume":"43 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-08-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128038060","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Mood Visualiser: Augmented Music Visualisation Gauging Audience Arousal","authors":"Anand Subramaniam, M. Barthet","doi":"10.1145/3123514.3123517","DOIUrl":"https://doi.org/10.1145/3123514.3123517","url":null,"abstract":"In many Western performing art traditions, audiences have been ascribed the position of receivers without creative roles. This perception of audiences shifts radically in contemporary participatory art, which fosters greater creative involvement and appropriation by audiences. In this paper, we augment the music listening experience through sound-reactive visuals which vary according to listeners' arousal response. Our approach seeks to involve audiences in the live generation of visuals accompanying music, giving participants an indirect, yet creative, role. We present an implementation of our concept, Mood Visualiser, a web application which receives data from a wireless bio-sensing platform used to characterise the arousal response of listeners at the physiological level. Mood Visualiser provides an artistic visual representation of the music and the listening experience derived from both the audio and physiological domains. We describe and evaluate a use case that focuses on a listener's electro-dermal activity (EDA), a correlate of arousal (or excitement). Feedback received in user surveys was overall positive, and we identified further design challenges around the visual expression of emotions perceived in music and the suitability of sensor interfaces during the music listening activity.","PeriodicalId":282371,"journal":{"name":"Proceedings of the 12th International Audio Mostly Conference on Augmented and Participatory Sound and Music Experiences","volume":"168 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-08-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132436185","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Exploring Computational Music Thinking in a Workshop Setting with Primary and Secondary School Children","authors":"D. Hug, Serge Petralito, Sarah Hauser, Anna Lamprou, A. Repenning, Didier Bertschinger, Nadine Stüber, Markus Cslovjecsek","doi":"10.1145/3123514.3123515","DOIUrl":"https://doi.org/10.1145/3123514.3123515","url":null,"abstract":"Motivated by the essential role of music in children's lives, the potential of sound as a sensory modality, and the importance of teaching Computational Thinking, there is great pedagogical potential in the integration of musical and computational thinking into \"Computational Music Thinking\". In this paper we report a pilot study exploring research and design approaches for creating learning environments and tools that sustainably stimulate the interest of children and adolescents in both computer science and music in the context of creative, self-guided activities. For the purposes of the study, two online tools, AgentCubes online, a 3D game design environment, and Ludosonica, an interactive music composition and performance system, were employed in a series of workshops designed for primary and secondary school children. Results from the study generally confirm the pedagogical potential of Computational Music Thinking and point toward promising future research directions.","PeriodicalId":282371,"journal":{"name":"Proceedings of the 12th International Audio Mostly Conference on Augmented and Participatory Sound and Music Experiences","volume":"54 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-08-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132239037","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Aural Fabric: an interactive textile sonic map","authors":"A. Milo, J. Reiss","doi":"10.1145/3123514.3123565","DOIUrl":"https://doi.org/10.1145/3123514.3123565","url":null,"abstract":"The Aural Fabric is an interactive textile sonic map created to promote acoustic awareness of the built environment. It fosters discussions on the aural environment of our cities by allowing users to experience binaural recordings captured during a soundwalk. Touches on the conductive areas embroidered on the surface of the map are sensed by two capacitive boards stitched onto the map. These are externally connected to Bela, an embedded computing platform for audio processing. The recordings can be intuitively mixed together, offering exploratory and performative recall of the collected material.","PeriodicalId":282371,"journal":{"name":"Proceedings of the 12th International Audio Mostly Conference on Augmented and Participatory Sound and Music Experiences","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-08-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126653876","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Distance in audio for VR: constraints and opportunities","authors":"A. McArthur, M. Sandler, R. Stewart","doi":"10.1145/3123514.3123530","DOIUrl":"https://doi.org/10.1145/3123514.3123530","url":null,"abstract":"Spatial audio is enjoying a surge in attention in both scene-based and object-based paradigms, due to the trend for, and accessibility of, immersive experience. This has been enabled by converging advances in computing, component size reduction, and associated price reductions. For the first time, applications such as virtual reality (VR) are consumer technologies. Audio for VR is captured to provide a counterpart to the video or animated image, and can be rendered to combine elements of physical and psychoacoustic modelling, as well as artistic design. Given that distance is an inherent property of spatial audio, that it can augment sound's efficacy in cueing user attention (a problem which practitioners are seeking to solve), and that conventional film sound practices have intentionally exploited its use, the absence of research on its implementation and effects in immersive environments is notable. This paper sets out the case for its importance, from a perspective of research and practice. It focuses on cinematic VR, whose challenges for spatialized audio are clear, and at times stretches beyond the restrictions specific to distance in audio for VR, into more general audio constraints.","PeriodicalId":282371,"journal":{"name":"Proceedings of the 12th International Audio Mostly Conference on Augmented and Participatory Sound and Music Experiences","volume":"49 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-08-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126959681","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}