{"title":"Exploring relationships between effort, motion, and sound in new musical instruments","authors":"Çağrı Erdem, Qichao Lan, A. Jensenius","doi":"10.17011/HT/URN.202011256767","DOIUrl":null,"url":null,"abstract":"We investigated how the action–sound relationships found in electric guitar performance can be used in the design of new instruments. Thirty-one trained guitarists performed a set of basic sound-producing actions (impulsive, sustained, and iterative) and free improvisations on an electric guitar. We performed a statistical analysis of the muscle activation data (EMG) and audio recordings from the experiment. Then we trained a long short-term memory network with nine different configurations to map EMG signal to sound. We found that the preliminary models were able to predict audio energy features of free improvisations on the guitar, based on the dataset of raw EMG from the basic soundproducing actions. The results provide evidence of similarities between body motion and sound in music performance, compatible with embodied music cognition theories. They also show the potential of using machine learning on recorded performance data in the design of new musical instruments.","PeriodicalId":37614,"journal":{"name":"Human Technology","volume":"1 1","pages":"310-347"},"PeriodicalIF":0.0000,"publicationDate":"2020-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Human Technology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.17011/HT/URN.202011256767","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"Social Sciences","Score":null,"Total":0}
Citations: 3
Abstract
We investigated how the action–sound relationships found in electric guitar performance can be used in the design of new instruments. Thirty-one trained guitarists performed a set of basic sound-producing actions (impulsive, sustained, and iterative) and free improvisations on an electric guitar. We performed a statistical analysis of the muscle activation data (EMG) and audio recordings from the experiment. Then we trained a long short-term memory (LSTM) network with nine different configurations to map EMG signals to sound. We found that the preliminary models were able to predict audio energy features of free improvisations on the guitar, based on the dataset of raw EMG from the basic sound-producing actions. The results provide evidence of similarities between body motion and sound in music performance, compatible with embodied music cognition theories. They also show the potential of using machine learning on recorded performance data in the design of new musical instruments.
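The abstract describes training an LSTM to map EMG signals to audio energy features. As an illustration only, the sketch below shows one way such a mapping could be set up in PyTorch; the channel count, window length, hidden size, and RMS-style energy target are assumptions for the example, not the nine configurations used in the study.

```python
# Illustrative sketch only: an LSTM that maps windows of multi-channel EMG
# to a per-frame audio energy envelope. Architecture and hyperparameters are
# assumptions, not the authors' configuration.
import torch
import torch.nn as nn


class EMGToAudioEnergy(nn.Module):
    """Maps a window of multi-channel EMG samples to an audio energy envelope."""

    def __init__(self, n_emg_channels=4, hidden_size=64, n_layers=2):
        super().__init__()
        self.lstm = nn.LSTM(
            input_size=n_emg_channels,
            hidden_size=hidden_size,
            num_layers=n_layers,
            batch_first=True,
        )
        # One energy value (e.g., RMS of the corresponding audio frame) per time step.
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, emg):
        # emg: (batch, time, channels)
        out, _ = self.lstm(emg)             # (batch, time, hidden_size)
        return self.head(out).squeeze(-1)   # (batch, time)


if __name__ == "__main__":
    model = EMGToAudioEnergy()
    emg_batch = torch.randn(8, 200, 4)   # 8 windows, 200 time steps, 4 EMG channels
    target = torch.rand(8, 200)          # hypothetical audio energy envelope
    loss = nn.MSELoss()(model(emg_batch), target)
    loss.backward()
    print(loss.item())
```

In a setup like this, the model is trained on paired EMG and audio-feature sequences from the basic sound-producing actions and then evaluated on the free improvisations, which is consistent with how the abstract describes the prediction task.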
About the journal:
Human Technology is an interdisciplinary, multiscientific journal focusing on the human aspects of our modern technological world. The journal provides a forum for innovative and original research on timely and relevant topics, with the goal of exploring current issues regarding the human dimension of evolving technologies and providing new ideas and effective solutions for addressing the resulting challenges. Focusing on both everyday and professional life, the journal is equally interested in, for example, the social, psychological, educational, cultural, philosophical, cognitive-scientific, and communication aspects of human-centered technology. Special attention is paid to information and communication technology themes that facilitate and support the holistic human dimension in the future information society.