{"title":"Both Rudimentary Visualization and Prototypical Sonification can Serve as a Benchmark to Evaluate New Sonification Designs","authors":"Tim Ziemer, Holger Schultheis","doi":"10.1145/3561212.3561228","DOIUrl":"https://doi.org/10.1145/3561212.3561228","url":null,"abstract":"Comparing sonification with visualization is like comparing apples and oranges. While visualizations are ubiquitous to the public and have established names, principles, application areas, and sophisticated designs, sonifications tend to be unique, self-made and completely new to users. In this study we developed a rudimentary visualization that is related closely to the principle of the sonification designs that we want to evaluate. In addition, we implemented a prototypical sonification that uses the most common mapping principles. Experiment results show that participants perform similarly well using the rudimentary visualization and the prototypical sonification, which is much better than chance but significantly worse than using our new sonification design. We therefore argue that both rudimentary visualization and prototypical sonifications can serve as a suitable benchmark to evaluate new sonifications designs against.","PeriodicalId":379319,"journal":{"name":"Proceedings of the 17th International Audio Mostly Conference","volume":"57 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116805621","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"How to build pipe organ robots","authors":"Michael J. Krzyzaniak","doi":"10.1145/3561212.3561225","DOIUrl":"https://doi.org/10.1145/3561212.3561225","url":null,"abstract":"In this paper I describe how I built three miniature computer-controlled pipe organs, using accessible digital fabrication techniques. I was motivated to build them as part of my ongoing research on musical robot swarms, which necessitated simplified, easy-to-build organs. Having done that, the goal of this paper is to present the average computer musician with the equations, software, information, and key materials that I used, so that they can build their own pipe organs with a low barrier to entry, minimal assembly time, and using standard digital fabrication equipment. Finally, I describe a few simple algorithms for imbuing the pipe organs with a small amount of self-awareness which will facilitate their use in common computer-music scenarios. The completed pipe organs can be seen in the video below, and in several other videos referenced throughout this paper.","PeriodicalId":379319,"journal":{"name":"Proceedings of the 17th International Audio Mostly Conference","volume":"159 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121052017","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Performance of Low Frequency Sound Zones Based on Truncated Room Impulse Responses","authors":"J. Cadavid, M. Møller, S. Bech, T. Waterschoot, Jan Østergaard","doi":"10.1145/3561212.3561248","DOIUrl":"https://doi.org/10.1145/3561212.3561248","url":null,"abstract":"Using spatially distributed loudspeakers and properly designed control filters, it is possible to generate sound zones that play different audio contents in different regions of the same room. For low frequency content, the design of control filters relies on the room impulse responses (RIRs) between each loudspeaker and the desired listening positions. Estimates of the RIRs can be obtained by distributing wireless microphones within the sound zones and thereby systematically acquiring sufficient knowledge about the acoustical characteristics of the room and loudspeakers. Longer acquisition times would generally lead to better estimates of the RIRs but would also introduce processing delays, which is undesirable in cases where time-varying RIRs are to be compensated. In addition, shorter RIRs may imply lower computational complexity. In this work, control filters were calculated using truncated versions of the RIRs in order to simulate the effect of a reduced acquisition time. The performance was evaluated in terms of the acoustic contrast ratio (ACR) and it was seen that within a certain limit, doubling the acquisition time increases the ACR around 4 dB or more. Moreover, to keep the sound pressure low in the dark zone, only a limited part of the reverberation tail is required.","PeriodicalId":379319,"journal":{"name":"Proceedings of the 17th International Audio Mostly Conference","volume":"27 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126991822","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Towards a Sustainable Internet of Sounds","authors":"L. Gabrielli, L. Turchet","doi":"10.1145/3561212.3561246","DOIUrl":"https://doi.org/10.1145/3561212.3561246","url":null,"abstract":"The Internet of Sounds (IoS) is an emerging research area at the intersection of engineering fields and humanities including computing, communication technology, audio signal processing, acoustic monitoring, music and arts. Although this research field is expected to have beneficial impacts on society through entertainment, creativity, well-being, monitoring and security, it is paramount to be aware of the adverse impact of current technology on the environment in terms of greenhouse gases emissions, pollution and soil consumption. In this study we provide a survey of the environmental issues produced by current information and communication technology (ICT) and relate these to the use cases that the IoS envisions. On the basis of this survey, we identify some key aspects to reduce the footprint of IoS services and products and then we provide suggestions to make advancements in IoS environment-aware.","PeriodicalId":379319,"journal":{"name":"Proceedings of the 17th International Audio Mostly Conference","volume":"2013 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127324767","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Real-time Button Display and Chord Verification – an Interactive Learning App for the Diatonic Accordion","authors":"Celia Moosbrugger, Katharina Groß-Vogt, Marian Weger","doi":"10.1145/3561212.3561218","DOIUrl":"https://doi.org/10.1145/3561212.3561218","url":null,"abstract":"We present the QuetschnApp, a prototype of a digital learning environment for the Austrian diatonic accordion. The app displays the respective buttons of a music piece consecutively, and verifies the played chords using pitch detection. The prototype has been evaluated in two user-studies with 25 participants. It turned out that learning a new music piece with the QuetschnApp works well and is generally well accepted. The prototype is transferable to other instruments of the accordion family.","PeriodicalId":379319,"journal":{"name":"Proceedings of the 17th International Audio Mostly Conference","volume":"197 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134439137","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An Investigation into the State of Mind of Recipients when Listening to a Nature Soundscape in Virtual Reality","authors":"Maria von Hösslin, Cornelius Pöpel","doi":"10.1145/3561212.3561247","DOIUrl":"https://doi.org/10.1145/3561212.3561247","url":null,"abstract":"A study is presented which investigates whether audiovisual nature soundscape recordings in virtual reality can have positive effects on subjective well-being. Twenty people from different occupational and age groups were selected for the study. They were presented with a VR application with a 3D nature soundscape recording. To exclude the placebo effect, the test subjects listened to their favourite music instead of the soundscape recording in a second examination. Subjective well-being was recorded by means of several questionnaires and physical well-being by measuring blood pressure and pulse. The results show that the soundscape recording had a significant positive effect on subjective and physical well-being even compared to the favourite music of the participants.","PeriodicalId":379319,"journal":{"name":"Proceedings of the 17th International Audio Mostly Conference","volume":"41 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132864040","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Matching auditory and visual room size, distance, and source orientation in virtual reality","authors":"M. Frank, Djordje Perinovic","doi":"10.1145/3561212.3561216","DOIUrl":"https://doi.org/10.1145/3561212.3561216","url":null,"abstract":"Matching visual and auditory cues in virtual reality (VR) is important to provide plausibility and create the impression of presence in a scene. This paper presents an experiment in VR, in which participants match acoustic and visual room size, distance, and orientation of a directive sound source in a simulated concert hall. The simulation is fully interactive and allows the participants to move with 6 degrees of freedom. For all three parameters, the experiment was done in both directions: adjusting the acoustic parameters to given visual settings and adjusting the visual parameters to given acoustic settings. The results show that the adjustment generally works in both directions. However, for distance the auditory adjustment works better and does not reveal the typical compression. Regarding room size, results agree with just noticeable differences in reverberation time known from real-world experiments.","PeriodicalId":379319,"journal":{"name":"Proceedings of the 17th International Audio Mostly Conference","volume":"248 2-8","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120931020","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Dance phrase onsets and endings in an interactive dance study","authors":"Andreas Bergsland","doi":"10.1145/3561212.3561242","DOIUrl":"https://doi.org/10.1145/3561212.3561242","url":null,"abstract":"The paper describes a work-in-progress exploring the expressive and creative potential of dance phrase onsets and endings in interactive dance, using an artistic research approach. It briefly delineates the context of the presented work, before describing the technical setup applied, both in terms of hardware and software. The main part of the paper is concerned with the specific mappings of three different sections in the performance that the project resulted in. Subsequently, the process and performance are evaluated, including both the dancer’s feedback and observations by the author. The points from the evaluation are then discussed with reference to relevant research literature. Findings include that the dancer experienced an increased awareness of beginnings and endings in different sections of the performance, and that postural adjustments were necessary to make the interaction more robust.","PeriodicalId":379319,"journal":{"name":"Proceedings of the 17th International Audio Mostly Conference","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129602574","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Exploring profiling and personalisation in sleep music design: towards conceptualising musical sleep aids for hospital use","authors":"S. Monache, Doudou Jia, Daan Kamphuis, E. Özcan","doi":"10.1145/3561212.3561245","DOIUrl":"https://doi.org/10.1145/3561212.3561245","url":null,"abstract":"Music as a low-cost sleep aid is a promising way to improve the sleep quality of people. However, most available sleep music playlists are limited to generic, soothing songs, which do not take in account personalisation. In collaboration with the Neurology Department of the Reinier de Graaf hospital (Delft, The Netherlands), we explored a profile-based personalisation approach to deliver music that fits with people’ sleep and music preferences. Through generative research, we collected people’s preference data and proposed four, evocative sleep music profiles: the Explorer, the Diver, the Hunter, and the Observer. The results of the profiling evaluation suggest that the profile experience is credible, intuitive, and easy to use. Four profiles can reflect people’s preferences, but may not be stable.","PeriodicalId":379319,"journal":{"name":"Proceedings of the 17th International Audio Mostly Conference","volume":"29 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116733271","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}