{"title":"Mechatronics-Driven Musical Expressivity for Robotic Percussionists","authors":"Ning Yang, Richard J. Savery, Raghavasimhan Sankaranarayanan, Lisa Zahray, Gil Weinberg","doi":"10.5281/zenodo.4813274","DOIUrl":"https://doi.org/10.5281/zenodo.4813274","url":null,"abstract":"Musical expressivity is an important aspect of musical performance for humans as well as robotic musicians. We present a novel mechatronics-driven implementation of Brushless Direct Current (BLDC) motors in a robotic marimba player, named Shimon, designed to improve speed, dynamic range (loudness), and ultimately perceived musical expressivity in comparison to state-of-the-art robotic percussionist actuators. In an objective test of dynamic range, we find that our implementation provides wider and more consistent dynamic range response in comparison with solenoid-based robotic percussionists. Our implementation also outperforms both solenoid and human marimba players in striking speed. In a subjective listening test measuring musical expressivity, our system performs significantly better than a solenoid-based system and is statistically indistinguishable from human performers.","PeriodicalId":161317,"journal":{"name":"New Interfaces for Musical Expression","volume":"70 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-07-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127347617","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"What Makes a Good Musical Instrument? A Matter of Processes, Ecologies and Specificities","authors":"M. Rodger, P. Stapleton, M. V. Walstijn, Miguel Ortiz, Laurel Pardue","doi":"10.5281/zenodo.4813438","DOIUrl":"https://doi.org/10.5281/zenodo.4813438","url":null,"abstract":"Understanding the question of what makes a good musical instrument raises several conceptual challenges. Researchers have regularly adopted tools from traditional HCI as a framework to address this issue, in which instrumental musical activities are taken to comprise a device and a user, and should be evaluated as such. We argue that this approach is not equipped to fully address the conceptual issues raised by this question. It is worth reflecting on what exactly an instrument is, and how instruments contribute toward meaningful musical experiences. Based on a theoretical framework that incorporates ideas from ecological psychology, enactivism, and phenomenology, we propose an alternative approach to studying musical instruments. According to this approach, instruments are better understood in terms of processes rather than as devices, while musicians are not users, but rather agents in musical ecologies. A consequence of this reframing is that any evaluations of instruments, if warranted, should align with the specificities of the relevant processes and ecologies concerned. We present an outline of this argument and conclude with a description of a current research project to illustrate how our approach can shape the design and performance of a musical instrument in progress.","PeriodicalId":161317,"journal":{"name":"New Interfaces for Musical Expression","volume":"88 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-07-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128341425","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Design for auditory imagery: altering instruments to explore performer fluency","authors":"A. Guidi, Fabio Morreale, Andrew Mcpherson","doi":"10.5281/zenodo.4813260","DOIUrl":"https://doi.org/10.5281/zenodo.4813260","url":null,"abstract":"In NIME design, thorough attention has been devoted to feedback modalities, including auditory, visual and haptic feedback. How the performer executes the gestures to achieve a sound on an instrument, by contrast, appears to be less examined. Previous research showed that auditory imagery, or the ability to hear or recreate sounds in the mind even when no audible sound is present, is essential to the sensorimotor control involved in playing an instrument. In this paper, we enquire whether auditory imagery can also help to support skill transfer between musical instruments, with possible implications for new instrument design. To answer this question, we performed two experimental studies on pitch accuracy and fluency in which professional violinists were asked to play a modified violin. Results showed that altered or even possibly irrelevant auditory feedback on a modified violin does not appear to be a significant impediment to performance. However, performers need to have coherent imagery of what they want to do, and the sonic outcome needs to be coupled to the motor program used to achieve it. This finding suggests that the design paradigm should shift from a direct feedback model of instrumental playing toward a model where imagery guides the playing process. This result agrees with recent research on skilled sensorimotor control that highlights the value of feedforward anticipation in embodied musical performance. It is also of primary importance for the design of new instruments: new sounds that cannot easily be imagined and that are not coupled to a motor program are not likely to be easily performed on the instrument.","PeriodicalId":161317,"journal":{"name":"New Interfaces for Musical Expression","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-07-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123887527","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A NIME Of The Times: Developing an Outward-Looking Political Agenda For This Community","authors":"Fabio Morreale, S. Bin, Andrew Mcpherson, P. Stapleton, M. Wanderley","doi":"10.5281/zenodo.4813294","DOIUrl":"https://doi.org/10.5281/zenodo.4813294","url":null,"abstract":"So far, NIME research has been mostly inward-looking, dedicated to divulging and studying our own work and having limited engagement with trends outside our community. Though musical instruments as cultural artefacts are inherently political, we have so far not sufficiently engaged with confronting these themes in our own research. In this paper we argue that we should consider how our work is also political, and begin to develop a clear political agenda that includes social, ethical, and cultural considerations through which to consider not only our own musical instruments, but also those not created by us. Failing to do so would result in an unintentional but tacit acceptance and support of such ideologies. We explore one item to be included in this political agenda: the recent trend in music technology of “democratising music”, which carries implicit political ideologies grounded in techno-solutionism. We conclude with a number of recommendations for stimulating community-wide discussion on these themes in the hope that this leads to the development of an outward-facing perspective that fully engages with political topics.","PeriodicalId":161317,"journal":{"name":"New Interfaces for Musical Expression","volume":"56 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-07-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121486269","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Taking Back Control: Taming the Feral Cello","authors":"Tom Davis, L. Reid","doi":"10.5281/zenodo.4813453","DOIUrl":"https://doi.org/10.5281/zenodo.4813453","url":null,"abstract":"Whilst there is a large body of NIME papers that concentrate on the presentation of new technologies, fewer papers have focused on a longitudinal understanding of NIMEs in practice. This paper embodies the more recent acknowledgement of the importance of practice-based methods of evaluation [1,2,3,4] concerning the use of NIMEs within performance, and the recognition that it is only within the situation of practice that the context is available to actually interpret and evaluate the instrument [2]. Within this context, this paper revisits the Feral Cello performance system that was first presented at NIME 2017 [5]. This paper explores what has been learned through the artistic practice of performing and workshopping in this context by drawing heavily on the experiences of the performer/composer who has become an integral part of this project and co-author of this paper. The original philosophical context is also revisited and reflections are made on the tensions between this position and the need to ‘get something to work’. The authors feel the presentation of the semi-structured interview within the paper is the best method of staying truthful to Hayes' understanding of musical improvisation as an enactive framework ‘in its ability to demonstrate the importance of participatory, relational, emergent, and embodied musical activities and processes’ [4].","PeriodicalId":161317,"journal":{"name":"New Interfaces for Musical Expression","volume":"33 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-07-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124296306","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Survey on the Use of 2D Touch Interfaces for Musical Expression","authors":"Diemo Schwarz, A. Liu, Frédéric Bevilacqua","doi":"10.5281/zenodo.4813318","DOIUrl":"https://doi.org/10.5281/zenodo.4813318","url":null,"abstract":"Expressive 2D multi-touch interfaces have in recent years moved from research prototypes to industrial products, from repurposed generic computer input devices to controllers specially designed for musical expression. A host of practitioners use this type of device in many different ways, with different gestures and sound synthesis or transformation methods. In order to get an overview of existing and desired usages, we launched an on-line survey that collected 37 answers from practitioners in and outside of academic and design communities. In the survey we inquired about the participants' devices, their strengths and weaknesses, the layout of control dimensions, the used gestures and mappings, the synthesis software or hardware, and the use of audio descriptors and machine learning. The results can inform the design of future interfaces, gesture analysis and mapping, and give directions for the need and use of machine learning for user adaptation.","PeriodicalId":161317,"journal":{"name":"New Interfaces for Musical Expression","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-07-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129113345","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Adapting & Openness: Dynamics of Collaboration Interfaces for Heterogeneous Digital Orchestras","authors":"Florent Berthaut, Luke Dahl","doi":"10.5281/zenodo.4813241","DOIUrl":"https://doi.org/10.5281/zenodo.4813241","url":null,"abstract":"Advanced musical cooperation, such as concurrent control of musical parameters or sharing data between instruments, has previously been investigated using multiuser instruments or orchestras of identical instruments. In the case of heterogeneous digital orchestras, where the instruments, interfaces, and control gestures can be very different, a number of issues may impede such collaboration opportunities. These include the lack of a standard method for sharing data or control, the incompatibility of parameter types, and limited awareness of other musicians' activity and instrument structure. As a result, most collaborations remain limited to synchronising tempo or applying effects to audio outputs. In this paper we present two interfaces for real-time group collaboration amongst musicians with heterogeneous instruments. We conducted a qualitative study to investigate how these interfaces impact musicians' experience and their musical output: we performed a thematic analysis of interviews, and we analysed logs of interactions. From these results we derive principles and guidelines for the design of advanced collaboration systems for heterogeneous digital orchestras, namely Adapting (to) the System, Support Development, Default to Openness, and Minimise Friction to Support Expressivity.","PeriodicalId":161317,"journal":{"name":"New Interfaces for Musical Expression","volume":"61 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-07-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115062355","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Star Interpolator - A Novel Visualization Paradigm for Graphical Interpolators","authors":"D. Gibson, R. Polfreman","doi":"10.5281/zenodo.4813168","DOIUrl":"https://doi.org/10.5281/zenodo.4813168","url":null,"abstract":"This paper presents a new visualization paradigm for graphical interpolation systems, known as Star Interpolation, that has been specifically created for sound design applications. Through the presented investigation of previous visualizations, it becomes apparent that the existing visuals in this class of system generally relate to the interpolation model that determines the weightings of the presets and not the sonic output. The Star Interpolator looks to resolve this deficiency by providing visual cues that relate to the parameter space. Through comparative exploration it has been found this visualization provides a number of benefits over the previous systems. It is also shown that hybrid visualizations can be generated that combine the benefits of the new visualization with the existing interpolation models. These can then be accessed by using an Interactive Visualization (IV) approach. The results from our exploration of these visualizations are encouraging and they appear to be advantageous when using the interpolators for sound design tasks.","PeriodicalId":161317,"journal":{"name":"New Interfaces for Musical Expression","volume":"104 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-07-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124051806","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Elemental: a Gesturally Controlled System to Perform Meteorological Sounds","authors":"Tiago Brizolara, S. Gibet, Caroline Larboulette","doi":"10.5281/zenodo.4813483","DOIUrl":"https://doi.org/10.5281/zenodo.4813483","url":null,"abstract":"In this paper, we present and evaluate Elemental, a NIME (New Interface for Musical Expression) based on audio synthesis of sounds of meteorological phenomena, namely rain, wind and thunder, intended for application in contemporary music/sound art, performing arts and entertainment. We first describe the system, controlled by the performer’s arms through Inertial Measurement Units and Electromyography sensors. The produced data is analyzed and used through mapping strategies as input to the sound synthesis engine. We conducted user studies to refine the sound synthesis engine, the choice of gestures and the mappings between them, and to finally evaluate this proof of concept. Indeed, users approached the system with their own awareness, ranging from the manipulation of abstract sound to the direct simulation of atmospheric phenomena - in the latter case, even to revive memories or to create novel situations. This suggests that the instrumentalization of sounds with known sources may be a fruitful strategy for constructing expressive interactive sonic systems.","PeriodicalId":161317,"journal":{"name":"New Interfaces for Musical Expression","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-07-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117216998","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Performing Audiences: Composition Strategies for Network Music using Mobile Phones","authors":"Anna Xambó","doi":"10.5281/zenodo.4813192","DOIUrl":"https://doi.org/10.5281/zenodo.4813192","url":null,"abstract":"With the development of web audio standards, it has quickly become technically easy to develop and deploy software for inviting audiences to participate in musical performances using their mobile phones. Thus, a new audience-centric musical genre has emerged, which aligns with artistic manifestations where there is an explicit inclusion of the public (e.g. participatory art, cinema or theatre). Previous research has focused on analysing this new genre from historical, social organisation and technical perspectives. This follow-up paper contributes reflections on technical and aesthetic aspects of composing within this audience-centric approach. We propose a set of 13 composition dimensions that deal with the role of the performer, the role of the audience, the location of sound and the type of feedback, among others. From a reflective approach, four participatory pieces developed by the authors are analysed using the proposed dimensions. Finally, we discuss a set of recommendations and challenges for the composers-developers of this new and promising musical genre. This paper concludes by discussing the implications of this research for the NIME community.","PeriodicalId":161317,"journal":{"name":"New Interfaces for Musical Expression","volume":"46 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-07-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124613319","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}