Music Perception | Pub Date: 2022-09-01 | DOI: 10.1525/mp.2022.40.1.55
Title: Emotions, Mechanisms, and Individual Differences in Music Listening
Authors: Patrik N. Juslin, Laura S. Sakka, G. Barradas, O. Lartillot
Abstract: Emotions have been found to play a paramount role in both everyday music experiences and health applications of music, but the applicability of musical emotions depends on: 1) which emotions music can induce, 2) how it induces them, and 3) how individual differences may be explained. These questions were addressed in a listening test, where 44 participants (aged 19–66 years) reported both felt emotions and subjective impressions of emotion mechanisms (Mec Scale), while listening to 72 pieces of music from 12 genres, selected using a stratified random sampling procedure. The results showed that: 1) positive emotions (e.g., happiness) were more prevalent than negative emotions (e.g., anger); 2) Rhythmic entrainment was the most and Brain stem reflex the least frequent of the mechanisms featured in the BRECVEMA theory; 3) felt emotions could be accurately predicted based on self-reported mechanisms in multiple regression analyses; 4) self-reported mechanisms predicted felt emotions better than did acoustic features; and 5) individual listeners showed partly different emotion-mechanism links across stimuli, which may help to explain individual differences in emotional responses. Implications for future research and applications of musical emotions are discussed.

Music Perception | Pub Date: 2022-09-01 | DOI: 10.1525/mp.2022.40.1.12
Title: Children’s Sensitivity to Performance Expression and its Relationship to Children’s Empathy
Authors: Cecilia Taher
Abstract: Emotional communication is central to music performance expression and empathy. Research has shown that music activities can enhance empathy in children and that more empathic adults can more accurately recognize and feel performers’ expressive intentions. Nevertheless, little is known about performance expression during childhood and the specific music-related factors affecting empathy development. This paper explores children’s sensitivity to a performer’s expressive or mechanical intentions and its relationship to children’s everyday empathy. Twenty-seven children listened to expressive and mechanical versions of Romantic flute excerpts with and without accompanying video, rating their perceived level of the performer’s expression and their enjoyment of the performance. The results indicate that children recognize performers’ intended expression or lack thereof and enjoy expressive performances more than mechanical ones. Children aged 10–12 recognized performance expression better than those aged 8–9, especially in audiovisual conditions. Children with higher cognitive empathy rated performance expression more in line with their enjoyment of the performance, which was also more concordant with the performer’s expressive intention. The findings support a relationship between music and socio-emotional skills and emphasize the importance of the visual component of music performance for children, an aspect that has received little attention among researchers and educators.

Music Perception | Pub Date: 2022-09-01 | DOI: 10.1525/mp.2022.40.1.27
Title: Musicians Can Reliably Discriminate Between String Register Locations on the Violoncello
Authors: C. Trevor, J. Devaney, David Huron
Abstract: Vocal range location is an important vocal affective signal. Humans use different areas of their vocal range to communicate emotional intensity. Consequently, humans are good at identifying where someone is speaking within their vocal range. Research on music and emotion has demonstrated that musical expressive behaviors often reflect or take inspiration from vocal expressive behaviors. Is it possible for musicians to utilize range-related signals on their instrument similarly to how humans use vocal range-related signals? Might musicians therefore be similarly sensitive to instrumental range location? We present two experiments that investigate musicians’ ability to hear instrumental range location, specifically string register location on the violoncello. Experiment 1 is a behavioral study that tests whether musicians can reliably distinguish between higher and lower string register locations. In Experiment 2, we analyze acoustic features that could be impacted by string register location. Our results support the conjecture that musicians can reliably discriminate between string register locations, although perhaps only when vibrato is utilized. Our results also suggest that higher string register locations have a darker timbre and possibly a wider and faster vibrato. Further research on whether musicians can effectively imitate vocal range location signals with their instruments is warranted.

Music Perception | Pub Date: 2022-09-01 | DOI: 10.1525/mp.2022.40.1.3
Title: Music Empathizing and Music Systemizing are Associated with Music Listening Reward
Authors: G. Kreutz, Anja-Xiaoxing Cui
Abstract: Music empathizing (ME) and music systemizing (MS) are constructs representing cognitive styles that address different facets of interest in music listening. Here we investigate whether ME and MS are positively associated with feelings of reward in response to music listening (MR). We conducted an online survey in which n = 202 participants (127 identifying as female; mean age = 26.06 years, SD = 8.66 years) filled out the Music-Empathizing-Music-Systemizing (MEMS) Inventory, the Barcelona Music Reward Questionnaire (BMRQ), further music-related inventories, and ad hoc items representing general interest and investment in music listening. Results from a conditional inference tree analysis confirm our hypothesis by showing that ME, followed by MS, were the most important predictors of MR. In addition, subscription to music streaming services and investing free time into music listening were also associated with higher MR. These results suggest that perceiving reward through music listening is a function of both music empathizing and music systemizing. The nonsignificant contributions of music sophistication and music style preferences speak against a larger role of these factors in MR. Further research is needed to investigate the interrelationships of musical cognitive styles and MR to refine our understanding of the affective value of music listening.

Music Perception | Pub Date: 2022-09-01 | DOI: 10.1525/mp.2022.40.1.39
Title: You Can Tell a Prodigy From a Professional Musician
Authors: Viola Pausch, Nina Düvel, R. Kopiez
Abstract: According to Feldman (1993), musical prodigies are expected to perform at the same high level as professional adult musicians and, therefore, are indistinguishable from adults. This widespread definition was the basis for the study by Comeau et al. (2017), which investigated if participants could determine whether an audio sample was played by a professional pianist or a child prodigy. Our paper is a replication of this previous study under more controlled conditions. Our main findings partly confirmed the previous findings: Comparable to Comeau et al.’s (2017) study (N = 51), the participants in our study (N = 278) were able to discriminate between prodigies and adult professionals by listening to music recordings of the same pieces. The overall discrimination performance was slightly above chance (correct responses: 53.7%; sensitivity d’ = 0.20), which was similar to Comeau et al.’s (2017) results of the identification task with prodigies aged between 11 and 14 years (approximately 54.6% correct responses; sensitivity approximately d’ = 0.13). Contrary to the original study, musicians and pianists in our study did not perform significantly better than other participants. Nevertheless, it is generally possible for listeners to differentiate prodigies from adult performers—although this is a demanding task.

Music Perception | Pub Date: 2022-06-01 | DOI: 10.1525/mp.2022.39.5.423
Title: Perceived Motor Synchrony With the Beat is More Strongly Related to Groove Than Measured Synchrony
Authors: T. Matthews, Maria A. G. Witek, J. Thibodeau, P. Vuust, V. Penhune
Abstract: The sensation of groove can be defined as the pleasurable urge to move to rhythmic music. When moving to the beat of a rhythm, both how well movements are synchronized to the beat, and the perceived difficulty in doing so, are associated with groove. Interestingly, when tapping to a rhythm, participants tend to overestimate their synchrony, suggesting a potential discrepancy between perceived and measured synchrony, which may impact their relative relation with groove. However, these relations, and the influence of syncopation and musicianship on these relations, have yet to be tested. Therefore, we asked participants to listen to 50 drum patterns with varying rhythmic complexity and rate their sensation of groove. They then tapped to the beat of the same drum patterns and rated how well they thought their taps synchronized with the beat. Perceived synchrony showed a stronger relation with groove ratings than measured synchrony and syncopation, and this effect was strongest for medium complexity rhythms. We interpret these results in the context of meter-based temporal predictions. We propose that the certainty of these predictions determines the weight and number of movements that are perceived as synchronous and thus reflect rewarding prediction confirmations.

Music Perception | Pub Date: 2022-06-01 | DOI: 10.1525/mp.2022.39.5.468
Title: Investigating the Shared Meaning of Metaphorical Sound Attributes
Authors: Victor Rosi, O. Houix, N. Misdariis, P. Susini
Abstract: Music or sound professionals use specific terminology to communicate about timbre. Some key terms do not come from the sound domain and do not have a clear definition due to their metaphorical nature. This work aims to reveal shared meanings of four well-used timbre attributes: bright, warm, round, and rough. We conducted two complementary studies with French sound and music experts (e.g., composers, sound engineers, sound designers, musicians). First, we conducted interviews to gather definitions and instrumental sound examples for the four attributes (N = 32). Second, using an online survey, we tested the relevance and consensus on multiple descriptions most frequently evoked during the interviews (N = 51). The analysis of the rich corpus of verbalizations from the interviews yielded the main description strategies used by the experts, namely acoustic, metaphorical, and source-related. We also derived definitions for the attributes based on significantly relevant and consensual descriptions according to the survey results. Importantly, the definitions rely heavily on metaphorical descriptions. In sum, this study presents an overview of the shared meaning and perception of four metaphorical timbre attributes in the French language.

Music Perception | Pub Date: 2022-06-01 | DOI: 10.1525/mp.2022.39.5.484
Title: Real-Time Modulation Perception in Western Classical Music
Authors: Brendon Mizener, W. Dowling
Abstract: The task of music listening involves an auditory scene analysis in which the listener makes judgments related to melody, harmony, and consonance or dissonance, all of which are made within the context of key or tonic region. Here we examine whether the process of tracking key region is independent of the process of tracking surface cues, and what surface cues may influence that process. To this end, highly trained, moderately trained, and untrained listeners listened to excerpts from string quartets, quintets, and sextets from the Classical and Romantic eras and responded when they heard a modulation. Each excerpt featured either a pivot chord modulation, a direct modulation, a common tone modulation, or no modulation. Listeners performed above chance across modulation conditions, and an interaction effect was observed for modulation type and participant training level. We also present an exploratory PCA that suggests that harmonic language and phrasing are both significant factors in guiding modulation perception, both of which merit further investigation.

Music Perception | Pub Date: 2022-06-01 | DOI: 10.1525/mp.2022.39.5.503
Title: Syncopation and Groove in Polyphonic Music
Authors: G. Sioros, G. Madison, Diogo Cocharro, A. Danielsen, F. Gouyon
Abstract: Music often evokes a regular beat and a pleasurable sensation of wanting to move to that beat called groove. Recent studies show that a rhythmic pattern’s ability to evoke groove increases at moderate levels of syncopation, essentially, when some notes occur earlier than expected. We present two studies that investigate this effect of syncopation in more realistic polyphonic music examples. First, listeners rated their urge to move to music excerpts transcribed from funk and rock songs, and to algorithmically transformed versions of these excerpts: 1) with the original syncopation removed, and 2) with various levels of pseudorandom syncopation introduced. While the original excerpts were rated higher than the de-syncopated versions, the algorithmic syncopation was not as successful in evoking groove. Consequently, a moderate level of syncopation increases groove, but only for certain syncopation patterns. The second study provides detailed comparisons of the original and transformed rhythmic structures that revealed key differences between them in: 1) the distribution of syncopation across instruments and metrical positions, 2) the counter-meter figures formed by the syncopating notes, and 3) the number of pickup notes. On this basis, we form four concrete hypotheses about the function of syncopation in groove, to be tested in future experiments.

Music Perception | Pub Date: 2022-04-01 | DOI: 10.1525/mp.2022.39.4.386
Title: Swinging the Score? Swing Phrasing Cannot Be Communicated via Explicit Notation Instructions Alone
Authors: C. Corcoran, Jan Stupacher, P. Vuust
Abstract: Jazz musicians usually learn to play with “swing” phrasing by playing by ear. Classical musicians—who play more from musical scores than by ear—are reported to struggle with producing swing. We explored whether classical musicians play with more swing when performing from more detailed swing notation. Thereby we investigated whether a culturally specific improvisational social procedure can be scripted in detailed music notation for musicians from a different performance background. Twenty classical musicians sight-read jazz tunes from three styles of notation, each with a different level of notational complexity. Experienced jazz listeners evaluated the performances. Results showed that more score-independent classical musicians with strong aural abilities played with equally strong swing regardless of notation; more score-dependent musicians swung most with the medium-complexity classical notation. The data suggest that some higher-level swing features, such as appropriate articulation, event durations, and deviations from a beat sequence can be communicated to a limited extent using written instructions. However, their successful implementation in performance depends on matching instructional complexity to a musician’s skill at decoding and interpreting unfamiliar information. This link between decoding skills and cross-cultural performance makes our findings relevant to ethnological and musicological studies of musical communication processes and perception-action coupling.
