New Interfaces for Musical Expression: Latest Publications

The cyclotactor: towards a tactile platform for musical interaction
New Interfaces for Musical Expression Pub Date : 2021-07-27 DOI: 10.5281/zenodo.1179571
Staas de Jong
Abstract: This paper reports on work in progress on a finger-based tactile I/O device for musical interaction. Central to the device is the ability to set up cyclical relationships between tactile input and output. A direct practical application of this to musical interaction is given, using the idea to multiplex two degrees of freedom on a single tactile loop.
Citations: 3
Making Grains Tangible: Microtouch for Microsound
New Interfaces for Musical Expression Pub Date : 2021-07-27 DOI: 10.5281/zenodo.1178055
Staas de Jong
Abstract: This paper proposes a new research direction for the large family of instrumental musical interfaces where sound is generated using digital granular synthesis, and where interaction and control involve the (fine) operation of stiff, flat contact surfaces. First, within a historical context, a general absence of, and clear need for, tangible output that is dynamically instantiated by the grain-generating process itself is identified. Second, to fill this gap, a concrete general approach is proposed based on the careful construction of non-vibratory and vibratory force pulses, in a one-to-one relationship with sonic grains. An informal pilot psychophysics experiment initiating the approach was conducted, which took into account the two main cases of applying forces to the human skin: perpendicular and lateral. Initial results indicate that the force pulse approach can enable a perceivably multidimensional, tangible display of the ongoing grain-generating process. Moreover, it was found that this can be made to happen meaningfully, in real time, on the same timescale as basic sonic grain generation. This is not a trivial property, and it provides an important foundation for further developing this type of enhanced display. It also leads to the exciting prospect of making arbitrary sonic grains actual physical manipulanda.
Citations: 1
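The one-to-one pairing of force pulses with sonic grains described in the abstract can be sketched in code. This is a minimal illustration, not the paper's implementation: the Gaussian grain window, the Hann-shaped force envelope, the sample rate, and all parameter names are assumptions made here for clarity.

```python
import numpy as np

SR = 48000  # assumed shared audio/haptic sample rate, Hz


def sonic_grain(freq=440.0, dur_ms=20.0, sr=SR):
    """A single Gaussian-windowed sinusoidal grain (classic granular synthesis)."""
    n = int(sr * dur_ms / 1000)
    t = np.arange(n) / sr
    window = np.exp(-0.5 * ((t - t[-1] / 2) / (t[-1] / 6)) ** 2)
    return window * np.sin(2 * np.pi * freq * t)


def force_pulse(peak_n=1.5, dur_ms=20.0, sr=SR, vibratory=False, vib_hz=250.0):
    """A force pulse paired one-to-one with a grain: either a smooth
    (non-vibratory) bump, or the same envelope carrying a vibratory component."""
    n = int(sr * dur_ms / 1000)
    t = np.arange(n) / sr
    envelope = peak_n * np.hanning(n)  # force magnitude envelope, newtons
    if vibratory:
        return envelope * np.sin(2 * np.pi * vib_hz * t)
    return envelope
```

Because grain and pulse share duration and sample rate, each sonic grain has a tactile counterpart on the same timescale, which is the property the abstract highlights.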
Ghostfinger: a novel platform for fully computational fingertip controllers
New Interfaces for Musical Expression Pub Date : 2021-07-27 DOI: 10.5281/zenodo.1176292
Staas de Jong
Abstract: We present Ghostfinger, a technology for highly dynamic up/down fingertip haptics and control. The overall user experience offered by the technology can be described as that of tangibly and audibly interacting with a small hologram. More specifically, Ghostfinger implements automatic visualization of the dynamic instantiation/parametrization of algorithmic primitives that together determine the current haptic conditions for fingertip action. Some aspects of this visualization are visuospatial: a floating see-through cursor provides real-time, to-scale display of the fingerpad transducer as it is being moved by the user. Simultaneously, each haptic primitive instance is represented by a floating block shape, type-colored, variably transparent, and possibly overlapping with other such block shapes. Further aspects of the visualization are symbolic: each instance is also represented by a type symbol, lighting up within a grid if the instance is providing output to the user. We discuss the system's user interface, programming interface, and potential applications. This is done from a general perspective that articulates and emphasizes the uniquely enabling role of the principle of computation in the implementation of new forms of instrumental control of musical sound. Beyond the presented technology, this also reflects more broadly on the role of Digital Musical Instruments (DMIs) in NIME.
Citations: 0
Developing the Cyclotactor
New Interfaces for Musical Expression Pub Date : 2021-07-27 DOI: 10.5281/zenodo.1177591
Staas de Jong
Abstract: This paper presents developments in the technology underlying the cyclotactor, a finger-based tactile I/O device for musical interaction. These include significant improvements both in the basic characteristics of tactile interaction and in the related (vibro)tactile sample rates, latencies, and timing precision. After presenting the new prototype's tactile output force landscape, some of the new possibilities for interaction are discussed, especially those for musical interaction with zero audio/tactile latency.
Citations: 4
Discourse is critical: Towards a collaborative NIME history
New Interfaces for Musical Expression Pub Date : 2021-04-29 DOI: 10.21428/92FBEB44.AC5D43E1
S. Bin
Citations: 1
A Laptop Ensemble Performance System using Recurrent Neural Networks
New Interfaces for Musical Expression Pub Date : 2020-12-03 DOI: 10.5281/zenodo.4813481
R. Proctor, Charles Patrick Martin
Abstract: The popularity of machine learning techniques in musical domains has made freely accessible pre-trained neural network (NN) models readily available for use in creative applications. This work outlines the implementation of one such application in the form of an assistance tool designed for live improvisational performances by laptop ensembles. The primary intention was to leverage off-the-shelf pre-trained NN models as a basis for assisting individual performers, either as musical novices looking to engage with more experienced performers or as a tool to expand musical possibilities through new forms of creative expression. The system expands upon a variety of ideas found in different research areas, including new interfaces for musical expression, generative music, and group performance, to produce a networked performance solution served via a web-browser interface. The final implementation offers performers a mixture of high- and low-level controls to influence the shape of the sequences of notes output by locally run NN models in real time, and also allows performers to define their level of engagement with the assisting generative models. Two test performances were played, and the system was shown to feasibly support four performers over a four-minute piece while producing musically cohesive and engaging music. Iterations on the design of the system exposed technical constraints on the use of a JavaScript environment for generative models in a live music context, largely derived from inescapable processing overheads.
Citations: 5
Cross-modal terrains: navigating sonic space through haptic feedback
New Interfaces for Musical Expression Pub Date : 2020-12-02 DOI: 10.5281/zenodo.1176163
Gabriella Isaac, Lauren Hayes, T. Ingalls
Abstract: This paper explores the idea of using virtual textural terrains as a means of generating haptic profiles for force-feedback controllers. This approach breaks from the paradigm established within audio-haptic research over the last few decades, where physical models within virtual environments are designed to transduce gesture into sonic output. We outline a method for generating multimodal terrains using basis functions, which are rendered into monochromatic visual representations for inspection. This visual terrain is traversed using a haptic controller, the NovInt Falcon, which in turn receives force information based on the grayscale value of its location in this virtual space. As the image is traversed by a performer, the levels of resistance vary, and the image is realized as a physical terrain. We discuss the potential of this approach to afford engaging musical experiences for both the performer and the audience, as iterated through numerous performances.
Citations: 0
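The pipeline in the abstract (sum basis functions into a terrain, render it as a grayscale image, derive resistance from the grayscale value under the controller position) can be sketched as follows. Gaussian basis functions, normalized coordinates, and the force scaling are illustrative assumptions, not details from the paper; actual NovInt Falcon I/O is omitted.

```python
import numpy as np


def make_terrain(width=256, height=256, n_basis=6, seed=1):
    """Sum a few 2-D Gaussian basis functions into a grayscale terrain in [0, 1]."""
    rng = np.random.default_rng(seed)
    ys, xs = np.mgrid[0:height, 0:width]
    terrain = np.zeros((height, width))
    for _ in range(n_basis):
        cx, cy = rng.uniform(0, width), rng.uniform(0, height)
        sigma = rng.uniform(width / 16, width / 4)
        terrain += rng.uniform(0.2, 1.0) * np.exp(
            -((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * sigma**2)
        )
    return terrain / terrain.max()  # normalize: white = maximum resistance


def resistance_at(terrain, x, y, max_force=4.0):
    """Map the grayscale value under the controller position (normalized
    coordinates in [0, 1]) to a resistance force magnitude."""
    h, w = terrain.shape
    ix = min(max(int(x * (w - 1)), 0), w - 1)
    iy = min(max(int(y * (h - 1)), 0), h - 1)
    return terrain[iy, ix] * max_force
```

In a performance loop, the controller position would be polled each haptic frame and `resistance_at` would set the opposing force, so moving across bright regions of the image feels like climbing the terrain.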
Trends at NIME - Reflections on Editing A NIME Reader
New Interfaces for Musical Expression Pub Date : 2020-10-21 DOI: 10.5281/zenodo.1176044
A. Jensenius, Michael J. Lyons
Abstract: This paper provides an overview of the process of editing the forthcoming anthology "A NIME Reader - Fifteen Years of New Interfaces for Musical Expression." The selection process is presented, and we reflect on some of the trends we have observed in re-discovering the collection of more than 1200 NIME papers published throughout the 15-year history of the conference. An anthology is necessarily selective, and ours is no exception. As we present in this paper, the aim has been to represent the wide range of artistic, scientific, and technological approaches that characterize the NIME conference. The anthology also includes critical discourse, and through acknowledgment of the strengths and weaknesses of the NIME community, we propose activities that could further diversify and strengthen the field.
Citations: 15
Click::RAND: A Minimalist Sound Sculpture
New Interfaces for Musical Expression Pub Date : 2020-09-14 DOI: 10.26686/wgtn.12955088
Paul Dunham
Abstract: Discovering outmoded or obsolete technologies and appropriating them in creative practice can uncover new relationships between those technologies. Using a media archaeological research approach, this paper presents the electromechanical relay and a book of random numbers as related forms of obsolete media. Situated within the context of electromechanical sound art, the work uses a non-deterministic approach to explore the non-linear and unpredictable agency and materiality of the objects in the work. Developed by the first author, Click::RAND is an object-based sound installation. The work has been developed as an audio-visual representation of a genealogy of connections between these two forms of media in the history of computing.
Citations: 0
Shaping the behaviour of feedback instruments with complexity-controlled gain dynamics
New Interfaces for Musical Expression Pub Date : 2020-09-01 DOI: 10.5281/zenodo.4813406
C. Kiefer, Dan Overholt, Alice C. Eldridge
Abstract: Feedback instruments offer radical new ways of engaging with instrument design and musicianship. They are defined by the recurrent circulation of signals through the instrument, which gives the instrument 'a life of its own' and a 'stimulating uncontrollability'. Arguably, the most interesting musical behaviour in these instruments happens when their dynamic complexity is maximised without falling into saturating feedback. It is often challenging to keep the instrument in this zone; this research looks at algorithmic ways to manage the behaviour of feedback loops in order to make feedback instruments more playable and musical, and to expand and maintain the 'sweet spot'. We propose a solution that manages gain dynamics based on a measurement of complexity, using a realtime implementation of the Effort to Compress algorithm. The system was evaluated with four musicians, each of whom plays a different variation of a string-based feedback instrument, following an autobiographical design approach. Qualitative feedback was gathered, showing that the system was successful in modifying the behaviour of these instruments to allow easier access to edge transition zones, sometimes at the expense of losing some of the more compelling dynamics of the instruments. The basic efficacy of the system is evidenced by descriptive audio analysis. This paper is accompanied by a dataset of sounds collected during the study and the open source software written to support the research.
Citations: 6
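The idea of complexity-controlled gain can be sketched roughly: a simplified Effort-to-Compress (ETC) measure (repeated pair substitution until the symbol sequence becomes constant) drives a proportional gain update toward a target complexity. This is an illustrative reconstruction under stated assumptions, not the authors' realtime implementation; the target value, update rate, and the overlapping pair count are simplifications made here.

```python
from collections import Counter


def effort_to_compress(seq):
    """Normalized Effort-to-Compress of an integer symbol sequence:
    the number of pair-substitution passes needed until all symbols are
    equal, divided by (len - 1). Higher values mean higher complexity."""
    seq = list(seq)
    if len(seq) < 2:
        return 0.0
    n0 = len(seq)
    steps = 0
    while len(seq) > 1 and len(set(seq)) > 1:
        # find the most frequent adjacent pair
        pairs = Counter(zip(seq, seq[1:]))
        (a, b), _ = pairs.most_common(1)[0]
        new_sym = max(seq) + 1  # fresh symbol for the substituted pair
        out, i = [], 0
        while i < len(seq):
            if i < len(seq) - 1 and seq[i] == a and seq[i + 1] == b:
                out.append(new_sym)
                i += 2
            else:
                out.append(seq[i])
                i += 1
        seq = out
        steps += 1
    return steps / (n0 - 1)


def update_gain(gain, complexity, target=0.5, rate=0.05, lo=0.0, hi=2.0):
    """Nudge the loop gain toward the 'sweet spot': raise it when the
    signal is too ordered, lower it when it is too complex."""
    gain += rate * (target - complexity)
    return min(max(gain, lo), hi)
```

In a feedback loop, each audio window would be quantized into symbols (e.g. by amplitude bins), its ETC computed, and `update_gain` applied before the signal is fed back, keeping the instrument near the edge without saturating.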