Journal of New Music Research: Latest Articles

From acceleration to rhythmicity: Smartphone-assessed movement predicts properties of music
IF 1.1 · Q4 · Computer Science
Journal of New Music Research · Pub Date: 2020-01-30 · DOI: 10.1080/09298215.2020.1715447
M. Irrgang, J. Steffens, Hauke Egermann
Abstract: Querying music is still a disembodied process in Music Information Retrieval. Thus, the goal of the presented study was to explore how free and spontaneous movement captured by smartphone accelerometer data can be related to musical properties. Motion features related to tempo, smoothness, size, and regularity were extracted and shown to predict the musical qualities ‘rhythmicity’ (R² = .45), ‘pitch level + range’ (R² = .06) and ‘complexity’ (R² = .15). We conclude that (rhythmic) music properties can be predicted from movement, and that an embodied approach to MIR is feasible.
Vol. 49, No. 1, pp. 178–191.
Citations: 1
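The pipeline this abstract describes (deriving motion features from raw acceleration, then regressing a musical property onto them) can be sketched as follows. The feature definitions and the single-predictor regression below are illustrative assumptions, not the study's actual feature set or model:

```python
import math

def motion_features(acc, dt=0.02):
    """Toy motion descriptors from a 1-D acceleration-magnitude signal.

    Plausible stand-ins for size/smoothness/regularity features,
    not the paper's actual definitions.
    """
    n = len(acc)
    mean = sum(acc) / n
    # 'size': RMS energy of the movement signal
    size = math.sqrt(sum(a * a for a in acc) / n)
    # 'smoothness': inverse of mean absolute jerk (change in acceleration)
    jerk = [abs(acc[i + 1] - acc[i]) / dt for i in range(n - 1)]
    smoothness = 1.0 / (1.0 + sum(jerk) / len(jerk))
    # 'regularity': lag-1 autocorrelation of the mean-centred signal
    centred = [a - mean for a in acc]
    denom = sum(c * c for c in centred) or 1.0
    regularity = sum(centred[i] * centred[i + 1] for i in range(n - 1)) / denom
    return {"size": size, "smoothness": smoothness, "regularity": regularity}

def ols_r2(xs, ys):
    """Slope, intercept and R² for a single-predictor least-squares fit."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - intercept - slope * x) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return slope, intercept, 1.0 - ss_res / ss_tot
```

An R² near .45, as reported for ‘rhythmicity’, would mean roughly half the variance in the rated quality is captured by a fit like this.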
Explaining harmonic inter-annotator disagreement using Hugo Riemann's theory of ‘harmonic function’
Journal of New Music Research · Pub Date: 2020-01-29 · DOI: 10.1080/09298215.2020.1716811
Anna Selway, Hendrik Vincent Koops, A. Volk, D. Bretherton, Nicholas Gibbins, R. Polfreman
Abstract: Harmonic transcriptions by ear rely heavily on subjective perceptions, which can lead to disagreement between annotators. The current computational metrics employed to measure annotator disagreement are useful for determining similarity on a pitch-class level, but are agnostic to the functional properties of chords. In contrast, music theories like Hugo Riemann's theory of ‘harmonic function’ acknowledge similarities between chords that are currently unrecognised by computational metrics. This paper utilises Riemann's theory to explain the harmonic annotator disagreements in the Chordify Annotator Subjectivity Dataset. This theory allows us to explain 82% of the dataset, compared to the 66% explained using pitch-class based methods alone. This new interdisciplinary application of Riemann's theory increases our understanding of harmonic disagreement and introduces a method for improving harmonic evaluation metrics that takes into account the function of a chord in relation to a tonal centre.
Vol. 49, No. 1, pp. 136–150.
Citations: 5
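The core idea, that chord labels which disagree at the pitch-class level can still agree functionally, can be illustrated with a toy mapping of C-major diatonic triads to Riemann's three function categories (Tonic, Subdominant, Dominant). The table below is a simplified assumption for illustration, not the paper's actual analysis:

```python
# Simplified C-major mapping of diatonic triads to Riemann's function
# categories: T (Tonic), S (Subdominant), D (Dominant). Riemann treats
# relative/parallel chords as functional substitutes (e.g. Am for C),
# which exact-match pitch-class metrics count as plain disagreement.
# NOTE: illustrative assumption; e.g. Em is ambiguous between T and D
# in Riemann's theory and is fixed to T here.
FUNCTION_OF = {
    "C": "T", "Am": "T", "Em": "T",
    "F": "S", "Dm": "S",
    "G": "D", "Bdim": "D",
}

def functional_agreement(annotation_a, annotation_b):
    """Fraction of chord positions where two annotators' labels fall in
    the same Riemann function category."""
    pairs = list(zip(annotation_a, annotation_b))
    same = sum(FUNCTION_OF[a] == FUNCTION_OF[b] for a, b in pairs)
    return same / len(pairs)
```

Two annotators hearing the same passage as C–F–G–C and Am–Dm–G–C disagree at three of four positions under exact chord matching, yet agree fully at the function level.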
A comparative study of verbal descriptions of emotions induced by music between adults with and without visual impairments
Journal of New Music Research · Pub Date: 2020-01-27 · DOI: 10.1080/09298215.2020.1717544
H. Park, S. Lee, H. Chong
Abstract: This study aimed to investigate the differences in verbal descriptions of emotions induced by music between adults who are visually impaired (VI) and adults who have normal vision (NV). Thirty participants (15 VI, 15 NV) listened to music excerpts and were interviewed. A content analysis and a syntactic analysis were performed. In the VI group, contextual verbalism was observed more often than media or educational verbalism, and a high ratio of affective words, expressions and descriptions via senses other than vision was found. The VI group more frequently employed situational descriptions, while the NV group more often described episodic memories.
Vol. 49, No. 1, pp. 151–161.
Citations: 1
Creative autonomy in a simple interactive music system
Journal of New Music Research · Pub Date: 2020-01-21 · DOI: 10.1080/09298215.2019.1709510
Fabio Paolizzo, Colin G. Johnson
Abstract: Can autonomous systems be musically creative without musical knowledge? Assumptions from interdisciplinary studies on self-reflection are evaluated using Video Interactive VST Orchestra, a system that generates music from audio and video inputs through an analysis of video motion and simultaneous sound processing. The system is able to generate material that is primary, novel and contextual. A case study provides evidence that these three simple features allow the system to identify musical salience in the material that it is generating, and for the system to act as an autonomous musical agent.
Vol. 49, No. 1, pp. 115–125.
Citations: 3
The influence of the vocal tract on the attack transients in clarinet playing
Journal of New Music Research · Pub Date: 2020-01-20 (eCollection: 2020-01-01) · DOI: 10.1080/09298215.2019.1708412
Montserrat Pàmies-Vilà, Alex Hofmann, Vasileios Chatziioannou
Abstract: When playing single-reed woodwind instruments, players can modulate the spectral content of the airflow in their vocal tract, upstream of the vibrating reed. In an empirical study with professional clarinettists (Np = 11), blowing pressure and mouthpiece pressure were measured during the performance of Clarinet Concerto excerpts. By comparing mouth pressure and mouthpiece pressure signals in the time domain, a method to detect instances of vocal tract adjustments was established. Results showed that players tuned their vocal tract in both clarion and altissimo registers. Furthermore, the analysis revealed that vocal tract adjustments support shorter attack transients and help to avoid lower bore resonances.
Vol. 49, No. 2, pp. 126–135.
Citations: 4
Dance to your own drum: Identification of musical genre and individual dancer from motion capture using machine learning
Journal of New Music Research · Pub Date: 2020-01-13 · DOI: 10.1080/09298215.2020.1711778
Emily Carlson, Pasi Saari, Birgitta Burger, P. Toiviainen
Abstract: Machine learning has been used to accurately classify musical genre using features derived from audio signals. Musical genre, as well as lower-level audio features of music, have also been shown to influence music-induced movement; however, the degree to which such movements are genre-specific has not been explored. The current paper addresses this using motion capture data from participants dancing freely to eight genres. Using a Support Vector Machine model, data were classified by genre and by individual dancer. Against expectations, individual classification was notably more accurate than genre classification. Results are discussed in terms of embodied cognition and culture.
Vol. 49, No. 1, pp. 162–177.
Citations: 23
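The classification setup (a fixed-length feature vector per dance excerpt, labelled by dancer or by genre) can be sketched with a nearest-centroid classifier standing in for the paper's Support Vector Machine; the two-dimensional features and labels below are hypothetical:

```python
def fit_centroids(features, labels):
    """Compute the mean feature vector for each class label."""
    sums, counts = {}, {}
    for vec, label in zip(features, labels):
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [v / counts[lbl] for v in s] for lbl, s in sums.items()}

def classify(centroids, vec):
    """Assign vec to the label whose centroid is nearest (Euclidean)."""
    return min(
        centroids,
        key=lambda lbl: sum((c - v) ** 2 for c, v in zip(centroids[lbl], vec)),
    )
```

The paper's finding, that dancer identity is easier to recover than genre, corresponds in this picture to per-dancer feature clusters being more compact and better separated than per-genre clusters.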
Automatic melody harmonization with triad chords: A comparative study
Journal of New Music Research · Pub Date: 2020-01-08 · DOI: 10.1080/09298215.2021.1873392
Yin-Cheng Yeh, Wen-Yi Hsiao, Satoru Fukayama, Tetsuro Kitahara, Benjamin Genchel, Hao-Min Liu, Hao-Wen Dong, Yian Chen, T. Leong, Yi-Hsuan Yang
Abstract: The task of automatic melody harmonization aims to build a model that generates a chord sequence as the harmonic accompaniment of a given multiple-bar melody sequence. In this paper, we present a comparative study evaluating the performance of canonical approaches to this task, including template matching, hidden Markov model, genetic algorithm and deep learning. The evaluation is conducted on a dataset of 9226 melody/chord pairs, considering 48 different triad chords. We report the result of an objective evaluation using six different metrics and a subjective study with 202 participants, showing that a deep learning method performs the best.
Vol. 50, No. 1, pp. 37–51.
Citations: 37
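The simplest baseline in such a comparison, template matching, picks for each bar the triad whose pitch classes best cover the bar's melody notes. A minimal sketch over 48 candidate triads follows; the choice of 12 roots × four qualities (major, minor, diminished, augmented) is an assumption about the paper's chord vocabulary:

```python
# 12 roots x 4 qualities = 48 candidate triads (the quality set is an
# assumption, not confirmed by the abstract).
QUALITIES = {"maj": (0, 4, 7), "min": (0, 3, 7), "dim": (0, 3, 6), "aug": (0, 4, 8)}
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def harmonize_bar(melody_pcs):
    """Choose the triad whose pitch classes cover the most melody notes.

    melody_pcs: list of pitch classes (0-11) sounding in one bar.
    Ties go to the first candidate in iteration order.
    """
    best_chord, best_score = None, -1
    for root in range(12):
        for quality, intervals in QUALITIES.items():
            chord_pcs = {(root + iv) % 12 for iv in intervals}
            score = sum(1 for pc in melody_pcs if pc in chord_pcs)
            if score > best_score:
                best_chord, best_score = NOTE_NAMES[root] + quality, score
    return best_chord

def harmonize(melody_bars):
    """One chord per bar, chosen independently (no inter-bar smoothing,
    unlike an HMM, which also scores chord-to-chord transitions)."""
    return [harmonize_bar(bar) for bar in melody_bars]
```

A hidden Markov model improves on this by weighing each bar's chord fit against transition probabilities between consecutive chords.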
Audio-first VR: New perspectives on musical experiences in virtual environments
Journal of New Music Research · Pub Date: 2020-01-01 · DOI: 10.1080/09298215.2019.1707234
Anil Çamci, R. Hamilton
Abstract: This special issue of the Journal of New Music Research explores VR (Virtual Reality) through the lenses of music, art and technology, each focusing on foregrounded sonic expression – an audio-first VR, wherein sound is treated not only as an integral part of immersive virtual experiences but also as a critical point of departure for creative and technological work in this domain. In this article, we identify emerging challenges and opportunities in audio-first VR, and pose questions pertaining to both theoretical and practical aspects of this concept. We then discuss how each contribution to our special issue addresses these questions through research and artistic projects, giving us a glimpse into the future of audio in VR.
Vol. 49, No. 1, pp. 1–7.
Citations: 12
Concordia: A musical XR instrument for playing the solar system
Journal of New Music Research · Pub Date: 2020-01-01 · DOI: 10.1080/09298215.2020.1714666
K. Snook, T. Barri, Monica Bolles, Petter Ericson, Carl Fravel, J. Goßmann, Susan E. Green-Mateu, Andrew Luck, M. Schedel, Robert Thomas
Abstract: Kepler Concordia, a new scientific and musical instrument enabling players to explore the solar system and other data within immersive extended-reality (XR) platforms, is being designed by a diverse team of musicians, artists, scientists and engineers using audio-first principles. The core instrument modules will be launched in 2019 for the 400th anniversary of Johannes Kepler's Harmonies of the World, in which he laid out a framework for the harmony of geometric form as well as the three laws of planetary motion. Kepler's own experimental process can be understood as audio-first because he employed his understanding of Western Classical music theory to investigate and discover the heliocentric, elliptical behaviour of planetary orbits. Indeed, principles of harmonic motion govern much of our physical world and show up at all scales in mathematics and physics. Few physical systems, however, offer such rich harmonic complexity and beauty as our own solar system. Concordia is a musical instrument that is modular, extensible and designed to allow players to generate and explore transparent sonifications of planetary movements rooted in the musical and mathematical concepts of Johannes Kepler as well as researchers who have extended Kepler's work, such as Hartmut Warm. Its primary function is to emphasise the auditory experience by encouraging musical explorations using sonification of geometric and relational information of scientifically accurate planetary ephemeris and astrodynamics. Concordia highlights harmonic relationships of the solar system through interactive sonic immersion. This article explains how we prioritise data sonification and then add visualisations and gamification to create a new type of experience and a creative distributed-ledger-powered ecosystem. Kepler Concordia facilitates the perception of music while presenting the celestial harmonies through multiple senses, with an emphasis on hearing, so that, as Kepler wrote, ‘the mind can seize upon the patterns’.
Vol. 49, No. 1, pp. 88–103.
Citations: 2
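A standard device in planetary sonification of this kind, and a plausible building block for an instrument like Concordia (assumed here, not the system's documented mapping), is octave transposition: an orbital frequency far below audibility is doubled repeatedly until it lands in an audible range.

```python
def orbit_to_pitch(period_days, lo=220.0, hi=440.0):
    """Map an orbital period (in days) to an audible frequency by
    repeated octave doubling/halving into the range [lo, hi)."""
    f = 1.0 / (period_days * 86400.0)  # orbital frequency in Hz
    while f < lo:
        f *= 2.0
    while f >= hi:
        f /= 2.0
    return f
```

Because only octave shifts are applied, frequency ratios between bodies (the 'harmonies' Kepler studied) are preserved up to octave equivalence.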
3D interaction techniques for musical expression
Journal of New Music Research · Pub Date: 2020-01-01 · DOI: 10.1080/09298215.2019.1706584
Florent Berthaut
Abstract: As Virtual Reality headsets become accessible, more and more artistic applications are developed, including immersive musical instruments. 3D interaction techniques designed in the 3D User Interfaces research community, such as navigation, selection and manipulation techniques, open numerous opportunities for musical control. For example, navigation techniques such as teleportation, free walking/flying and path-planning enable different ways of accessing musical scores, scenes of spatialised sound sources or even parameter spaces. Manipulation techniques provide novel gestures and metaphors, e.g. for drawing or sculpting sound entities. Finally, 3D selection techniques facilitate the interaction with complex visual structures which can represent hierarchical temporal structures, audio graphs, scores or parameter spaces. However, existing devices and techniques were developed mainly with a focus on efficiency, i.e. minimising error rate and task completion times. They were therefore not designed with the specifics of musical interaction in mind. In this paper, we review existing 3D interaction techniques and examine how they can be used for musical control, including the possibilities they open for instrument designers. We then propose a number of research directions to adapt and extend 3DUIs for musical expression.
Vol. 49, No. 1, pp. 60–72.
Citations: 7