The Phone with the Flow: Combining Touch + Optical Flow in Mobile Instruments
Cagan Arslan, Florent Berthaut, J. Martinet, Ioan Marius Bilasco, L. Grisoni
New Interfaces for Musical Expression, 2018-06-03. DOI: 10.5281/zenodo.1302709 (https://doi.org/10.5281/zenodo.1302709)
Citations: 1
Abstract
Mobile devices have been a promising platform for musical performance thanks to the various sensors readily available on board. In particular, mobile cameras can provide rich input as they can capture a wide variety of user gestures or environment dynamics. However, this raw camera input only provides continuous parameters and requires expensive computation. In this paper, we propose combining camera-based motion/gesture input with touch input in order to filter movement information both temporally and spatially, thus increasing expressiveness while reducing computation time. We present a design space which demonstrates the diversity of interactions that our technique enables. We also report the results of a user study in which we observe how musicians appropriate the interaction space with an example instrument.
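To make the spatial/temporal filtering idea concrete, here is a minimal sketch, not the paper's implementation: optical flow is computed only inside a window around the current touch point (spatial filtering) and only while a finger is down (temporal filtering). OpenCV's Farneback flow and the ROI size are assumptions for illustration.

```python
# Illustrative sketch (assumed approach, not the authors' code):
# restrict dense optical flow to a region around the touch point,
# and skip computation entirely when there is no touch.
import cv2

ROI_SIZE = 96  # hypothetical window size around the touch point, in pixels

def touch_filtered_flow(prev_gray, curr_gray, touch):
    """Return mean flow (dx, dy) in the window around `touch`, or None.

    `touch` is (x, y) in image coordinates while the finger is down,
    or None when no touch is active (temporal filtering).
    """
    if touch is None:
        return None  # no touch: avoid the expensive flow computation
    x, y = touch
    h, w = prev_gray.shape
    half = ROI_SIZE // 2
    # Clamp the window to the image bounds (spatial filtering).
    x0, x1 = max(0, x - half), min(w, x + half)
    y0, y1 = max(0, y - half), min(h, y + half)
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray[y0:y1, x0:x1], curr_gray[y0:y1, x0:x1],
        None, 0.5, 3, 15, 3, 5, 1.2, 0)
    # Average the flow field over the window; the resulting (dx, dy)
    # could then drive continuous sound parameters.
    return flow[..., 0].mean(), flow[..., 1].mean()
```

Because the flow is computed on a small crop rather than the full frame, per-frame cost drops roughly with the ratio of ROI area to image area, which is consistent with the abstract's claim of reduced computation time.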