{"title":"Surveying digital musical instrument use in active practice","authors":"John Sullivan, C. Guastavino, M. Wanderley","doi":"10.1080/09298215.2022.2029912","DOIUrl":"https://doi.org/10.1080/09298215.2022.2029912","url":null,"abstract":"Digital musical instruments are frequently designed in research and experimental performance contexts but few are taken up into sustained use by active and professional musicians. To identify the needs of performers who use novel technologies in their practices, a survey of musicians was conducted that identified desirable qualities for instruments to be viable in active use, along with attributes for successful uptake and continued use of instruments based on frameworks of long and short term user engagement. The findings are presented as a set of design considerations towards the development of instruments intended for use by active and professional performers.","PeriodicalId":16553,"journal":{"name":"Journal of New Music Research","volume":"50 1","pages":"469 - 486"},"PeriodicalIF":1.1,"publicationDate":"2021-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45260278","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Back to the present: Assimilation of late 19th century performance features among currently active violinists","authors":"Eitan Ornoy, Shai Cohen","doi":"10.1080/09298215.2022.2029496","DOIUrl":"https://doi.org/10.1080/09298215.2022.2029496","url":null,"abstract":"Present-day inquiries into aspects of 19th century performance style mark the growing quest to revive practices of post-1800 music repertoire. This paper aims to trace whether there be found an impact of recordings made by 19th century violinists of coeval repertoire on current performers who've recorded the same works. Early, intermediate, and present-day recordings (N = 81) of three late-romantic compositions were analyzed for the manner of execution of varied performance features. While similarities between early and current period players were traced to a certain extent, several early period distinctives are still rather absent from prevalent praxis. Results may shed light on performance style and interpretation of late C19 violin repertoire and on the influence of sonic documentation on 21st century players.","PeriodicalId":16553,"journal":{"name":"Journal of New Music Research","volume":"50 1","pages":"413 - 427"},"PeriodicalIF":1.1,"publicationDate":"2021-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"42965769","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Individualized interpretation: Exploring structural and interpretive effects on evaluations of emotional content in Bach’s Well Tempered Clavier","authors":"A. Battcock, Michael Schutz","doi":"10.1080/09298215.2021.1979050","DOIUrl":"https://doi.org/10.1080/09298215.2021.1979050","url":null,"abstract":"Audiences, juries, and critics continually evaluate performers based on their interpretations of familiar classics. Yet formally assessing the perceptual consequences of interpretive decisions is challenging – particularly with respect to how they shape emotional messages. Here, we explore the issue through comparison of emotion ratings (using scales of arousal and valence) for excerpts of all 48 pieces from Bach’s Well-Tempered Clavier. In this series of studies, participants evaluated one of seven interpretations by highly regarded pianists. This work offers the novel ability to simultaneously explore (1) how different interpretations by expert pianists shape emotional messages, (2) the degree to which structural and interpretative elements shape the clarity of emotional messages, and (3) how interpretative differences affect the strength of specific features or cues to convey musical emotion.","PeriodicalId":16553,"journal":{"name":"Journal of New Music Research","volume":"50 1","pages":"447 - 468"},"PeriodicalIF":1.1,"publicationDate":"2021-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"46452891","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Move like everyone is watching: Social context affects head motion and gaze in string quartet performance","authors":"Laura Bishop, Victor González Sánchez, B. Laeng, A. Jensenius, Simon Høffding","doi":"10.1080/09298215.2021.1977338","DOIUrl":"https://doi.org/10.1080/09298215.2021.1977338","url":null,"abstract":"Ensemble musicians engage with each other visually through glances and body motion. We conducted a case study to test how string quartet musicians would respond to playing conditions that were meant to discourage or promote visually communicative behaviour. A quartet performed in different seating configurations under rehearsal and concert conditions. Quantity of head motion was reduced when musicians' gaze was constrained. Differences in gaze and body motion between musicians reflected their musical roles in the ensemble. Overall, our findings suggest that gaze and motion dynamics vary within and between performances in response to changing musical, situational and social factors.","PeriodicalId":16553,"journal":{"name":"Journal of New Music Research","volume":"50 1","pages":"392 - 412"},"PeriodicalIF":1.1,"publicationDate":"2021-08-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"46904137","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Analysing the relationship between tone and melody in Chaozhou songs","authors":"Xi Zhang, I. Cross","doi":"10.1080/09298215.2021.1974490","DOIUrl":"https://doi.org/10.1080/09298215.2021.1974490","url":null,"abstract":"This paper uses corpus analysis to explore relationships between tone and melody in folk and contemporary songs in Chaozhou, a Chinese dialect with eight lexical tones and a wealth of tone sandhi. Results suggest that: (1) there is a high degree of correspondence between tone and melody in Chaozhou song; (2) tone sandhi influences tone-melody correspondence; (3) tones realised in context can be categorised into high-, mid-, and low-pitch groups according to the tone-pitch extreme rather than final pitch; (4) when single tones are performed melismatically across groups of notes, relationships between initial notes of successive groups shapes tone-melody matching.","PeriodicalId":16553,"journal":{"name":"Journal of New Music Research","volume":"50 1","pages":"299 - 311"},"PeriodicalIF":1.1,"publicationDate":"2021-08-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"46341613","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Breaking down the musician’s minds: How small changes in the musical instrument can impair your musical performance","authors":"Luiz Naveda, Marília Nunes-Silva","doi":"10.1080/09298215.2021.1973511","DOIUrl":"https://doi.org/10.1080/09298215.2021.1973511","url":null,"abstract":"The relationship between musicians and their musical instruments has influenced music engagement and musical structure across societies. In this work, we study how musicians react to changes in their instrument and the associations between keys and pitches using experiments that simulate the interface of the accordion. Seventeen accordionists, pianists and guitarists took part in the study. The results show accordion players are more affected by the changes in the musical interface than non-according players, for the same tasks. These observations support the extended cognition hypothesis, which proposes that coupled processes, such as the musician-instrument chain, count as an entire cognitive process.","PeriodicalId":16553,"journal":{"name":"Journal of New Music Research","volume":"50 1","pages":"373 - 391"},"PeriodicalIF":1.1,"publicationDate":"2021-08-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"48856928","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An investigation of music analysis by the application of grammar-based compressors","authors":"D. Humphreys, K. Sidorov, Andrew Jones, David Marshall","doi":"10.1080/09298215.2021.1978505","DOIUrl":"https://doi.org/10.1080/09298215.2021.1978505","url":null,"abstract":"Many studies have presented computational models of musical structure, as an important aspect of musicological analysis. However, the use of grammar-based compressors to automatically recover such information is a relatively new and promising technique. We investigate their performance extensively using a collection of nearly 8000 scores, on tasks including error detection, classification, and segmentation, and compare this with a range of more traditional compressors. Further, we detail a novel method for locating transcription errors based on grammar compression. Despite its lack of domain knowledge, we conclude that grammar-based compression offers competitive performance when solving a variety of musicological tasks.","PeriodicalId":16553,"journal":{"name":"Journal of New Music Research","volume":"50 1","pages":"312 - 341"},"PeriodicalIF":1.1,"publicationDate":"2021-08-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44966941","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A multi-genre model for music emotion recognition using linear regressors","authors":"D. Griffiths, Stuart Cunningham, Jonathan Weinel, R. Picking","doi":"10.1080/09298215.2021.1977336","DOIUrl":"https://doi.org/10.1080/09298215.2021.1977336","url":null,"abstract":"ABSTRACT Making the link between human emotion and music is challenging. Our aim was to produce an efficient system that emotionally rates songs from multiple genres. To achieve this, we employed a series of online self-report studies, utilising Russell's circumplex model. The first study (n = 44) identified audio features that map to arousal and valence for 20 songs. From this, we constructed a set of linear regressors. The second study (n = 158) measured the efficacy of our system, utilising 40 new songs to create a ground truth. Results show our approach may be effective at emotionally rating music, particularly in the prediction of valence.","PeriodicalId":16553,"journal":{"name":"Journal of New Music Research","volume":"50 1","pages":"355 - 372"},"PeriodicalIF":1.1,"publicationDate":"2021-08-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"47452290","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Recognition of emotions in music through the Adaptive-Network-Based Fuzzy (ANFIS)","authors":"Paulo Sergio da Conceição Moreira, D. Tsunoda","doi":"10.1080/09298215.2021.1977339","DOIUrl":"https://doi.org/10.1080/09298215.2021.1977339","url":null,"abstract":"This study aims to recognise emotions in music through the Adaptive-Network-Based Fuzzy (ANFIS). For this, we applied such structure in 877 MP3 files with thirty seconds duration each, collected directly on the YouTube platform, which represent the emotions anger, fear, happiness, sadness, and surprise. We developed four classification strategies, consisting of sets of five, four, three, and two emotions. The results were considered promising, especially for three and two emotions, whose highest hit rates were 65.83% for anger, happiness and sadness, and 88.75% for anger and sadness. A reduction in the hit rate was observed when the emotions fear and happiness were in the same set, raising the hypothesis that only the audio content is not enough to distinguish between these emotions. Based on the results, we identified potential in the application of the ANFIS framework for problems with uncertainty and subjectivity.","PeriodicalId":16553,"journal":{"name":"Journal of New Music Research","volume":"50 1","pages":"342 - 354"},"PeriodicalIF":1.1,"publicationDate":"2021-08-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44382446","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Space, sonic trajectories and the perception of cadence in electroacoustic music","authors":"Luca Danieli, Maria A. G. Witek, Christopher Haworth","doi":"10.1080/09298215.2021.1927116","DOIUrl":"https://doi.org/10.1080/09298215.2021.1927116","url":null,"abstract":"This paper reports on an exploratory study in the field of electroacoustic music aimed at understanding whether a sensation similar to that associated with the concept of “cadence” in relation to tonal music can be identified when listening to sounds diffused in space. Using a variety of patterned stimuli in a perceptual experiment, we asked listeners to evaluate the completeness of multiple trajectories on the horizontal plane. The results show differences across multiple categories of listeners, and suggest that listeners acquainted with spatial music consider trajectories more complete when presenting the last two impulses at opposite directions from the centre.","PeriodicalId":16553,"journal":{"name":"Journal of New Music Research","volume":"50 1","pages":"266 - 278"},"PeriodicalIF":1.1,"publicationDate":"2021-05-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1080/09298215.2021.1927116","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45274813","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}