Wearable sensors for real-time musical signal processing
A. Kapur, E. L. Yang, A. Tindale, P. Driessen
PACRIM 2005: IEEE Pacific Rim Conference on Communications, Computers and Signal Processing
Published: 2005-10-17
DOI: 10.1109/PACRIM.2005.1517316 (https://doi.org/10.1109/PACRIM.2005.1517316)
Citations: 17
Abstract
This paper describes the use of wearable sensor technology to control parameters of audio effects for real-time musical signal processing. Traditional instrument performance techniques are preserved while the system modifies the resulting sound based upon the movements of the performer. Gesture data from a performing artist is captured using three-axis accelerometer packages and converted to MIDI (musical instrument digital interface) messages using microcontroller technology. ChucK, a new programming language for on-the-fly audio signal processing and sound synthesis, is used to collect and process synchronized gesture data and audio signals from the traditional instrument being performed. Case studies using the wearable sensors in a variety of locations on the body (head, hands, feet, etc.) with a number of different traditional instruments (tabla, sitar, drumset, turntables, etc.) are presented.
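The abstract describes a pipeline in which three-axis accelerometer readings are turned into MIDI messages by a microcontroller before being routed to the effects engine. The paper does not give the mapping itself, so the following is only a minimal sketch of one plausible scheme: each axis, assumed to report acceleration in g over a fixed range, is linearly scaled to a 7-bit MIDI control-change value on its own controller number. The function name, the base CC number, and the ±2 g range are all illustrative assumptions, not the authors' design.

```python
def accel_to_midi_cc(ax, ay, az, cc_base=20, g_range=2.0):
    """Map three-axis accelerometer readings (in g) to MIDI CC messages.

    Hypothetical mapping: each axis is clamped to [-g_range, +g_range]
    and scaled linearly to the 7-bit MIDI value range [0, 127].
    Returns a list of (status, controller, value) tuples, one per axis,
    using status byte 0xB0 (control change, MIDI channel 1).
    """
    def scale(v):
        # Clamp, then map [-g_range, +g_range] -> [0, 127].
        v = max(-g_range, min(g_range, v))
        return int(round((v + g_range) / (2 * g_range) * 127))

    return [(0xB0, cc_base + i, scale(v)) for i, v in enumerate((ax, ay, az))]
```

At rest (0 g on each axis, gravity ignored for simplicity) every controller sits at the midpoint of the MIDI range, so gestures deflect the effect parameters symmetrically in either direction; the resulting tuples would then be serialized to the MIDI wire format by the microcontroller firmware.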