Music Perception | Pub Date: 2019-12-01 | DOI: 10.1525/mp.2019.37.2.95
Rosalie Ollivier, L. Goupil, M. Liuni, J. Aucouturier
Title: Enjoy The Violence
Abstract: Traditional neurobiological theories of musical emotions explain well why extreme music such as punk, hardcore, or metal, whose vocal and instrumental characteristics share much similarity with acoustic threat signals, should evoke unpleasant feelings in a large proportion of listeners. Why it does not for metal fans remains a theoretical challenge: metal fans may differ from non-fans in how they process acoustic threat signals at the sub-cortical level, showing deactivated or reconditioned responses that differ from controls. Alternatively, appreciation for metal may depend on the inhibition by cortical circuits of a normal low-order response to auditory threat. In a series of three experiments, we show here that, at a sensory level, metal fans react to cues of auditory threat in vocal and instrumental contexts as negatively and as quickly as non-fans, and even more accurately. Conversely, cognitive load appears to reduce fans' appreciation of metal to the level reported by non-fans. Taken together, these results are not compatible with the idea that fans of extreme music appreciate it because of a different low-level response to threat; rather, they highlight a critical contribution of higher-order cognition to the aesthetic experience. These results are discussed in light of recent higher-order theories of emotional consciousness, which we argue should be generalized to the emotional experience of music across musical genres.
Music Perception | Pub Date: 2019-12-01 | DOI: 10.1525/mp.2019.37.2.147
Aviel Sulem, E. Bodner, N. Amir
Title: Perception-Based Classification of Expressive Musical Terms
Abstract: Expressive Musical Terms (EMTs) are commonly used by composers as verbal descriptions of the musical expressiveness and character that performers are requested to convey. We suggest a classification of 55 of these terms, based on the perception of professional music performers who were asked to: 1) organize the EMTs in a two-dimensional plane such that proximity reflects similarity; and 2) rate the EMTs on valence, arousal, extraversion, and neuroticism, using 7-point Likert scales. Using a minimization procedure, we found that a satisfactory partition organizes the EMTs into four clusters (whose centroids are associated with tenderness, happiness, anger, and sadness), located in the four quadrants of the valence-arousal plane of the circumplex model of affect developed by Russell (1980). In terms of the related positive-negative activation parameters introduced by Watson and Tellegen (1985), we obtained significant correlations between positive activation and extraversion and between negative activation and neuroticism. This demonstrates that these relations, previously observed in personality studies by Watson and Clark (1992a), extend to the musical field.
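The four-quadrant partition described above can be illustrated with a minimal k-means sketch on synthetic (valence, arousal) ratings. This is only an illustration of the clustering idea: the paper used its own minimization procedure, and the cluster centers and point counts below are invented stand-ins, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for mean (valence, arousal) ratings of expressive terms,
# one cloud per quadrant (happiness, tenderness, sadness, anger).
centers = np.array([[0.6, 0.6], [0.6, -0.6], [-0.6, -0.6], [-0.6, 0.6]])
points = np.vstack([c + 0.15 * rng.normal(size=(14, 2)) for c in centers])

def kmeans(X, k, iters=50):
    # Initialize centroids from k distinct data points.
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest centroid (squared Euclidean distance).
        dists = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        # Recompute centroids, keeping the old one if a cluster goes empty.
        centroids = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
    return centroids, labels

centroids, labels = kmeans(points, 4)
```

With well-separated clouds, the recovered centroids approximate the four quadrant centers, mirroring the paper's four-cluster solution.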
Music Perception | Pub Date: 2019-09-01 | DOI: 10.1525/mp.2019.37.1.42
Estela Ribeiro, C. Thomaz
Title: A Whole Brain EEG Analysis of Musicianship
Abstract: The neural activation patterns evoked by music listening can reveal whether a subject did or did not receive music training. In the current exploratory study, we approached this two-group (musicians and nonmusicians) classification problem through a computational framework composed of the following steps: acoustic feature extraction; acoustic feature selection; trigger selection; EEG signal processing; and multivariate statistical analysis. We are particularly interested in analyzing the brain data at a global level, considering the activity registered in electroencephalogram (EEG) signals at a given time instant. The results of our experiment, with 26 volunteers (13 musicians and 13 nonmusicians) who listened to Johannes Brahms's Hungarian Dance No. 5, show that it is possible to linearly differentiate musicians and nonmusicians, with classification accuracies ranging from 69.2% (test set) to 93.8% (training set), despite the limited sample sizes available. Additionally, given the whole-brain vector navigation method described and implemented here, our results suggest that it is possible to highlight the most expressive and discriminant changes in the participants' brain activity patterns depending on the acoustic feature extracted from the audio.
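The final step of such a pipeline, linearly separating two groups of subjects from feature vectors, can be sketched with a least-squares linear discriminant. Everything below is a synthetic stand-in (random features, invented group means), not the study's EEG data or its actual method; it only shows what "linearly differentiating" two groups of 13 subjects looks like in code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic per-subject feature vectors: 13 "musicians" vs. 13 "nonmusicians",
# matching the study's sample sizes. Group means are invented for illustration.
n_per_group, n_features = 13, 8
musicians = rng.normal(loc=0.5, scale=1.0, size=(n_per_group, n_features))
nonmusicians = rng.normal(loc=-0.5, scale=1.0, size=(n_per_group, n_features))

X = np.vstack([musicians, nonmusicians])
y = np.concatenate([np.ones(n_per_group), -np.ones(n_per_group)])

# Least-squares linear discriminant: fit w so that sign(X @ w + b) predicts the group.
Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append a bias column
w, *_ = np.linalg.lstsq(Xb, y, rcond=None)

# Training-set accuracy of the fitted linear rule.
train_acc = np.mean(np.sign(Xb @ w) == y)
```

Evaluating the same rule on held-out subjects would give the test-set figure; with samples this small, the gap between training and test accuracy (93.8% vs. 69.2% in the study) is expected.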
Music Perception | Pub Date: 2019-09-01 | DOI: 10.1525/mp.2019.37.1.66
A. Battcock, Michael Schutz
Title: Acoustically Expressing Affect
Abstract: Composers convey emotion through music by co-varying structural cues. Although this complex interplay provides a rich listening experience, it creates challenges for understanding the contributions of individual cues. Here we investigate how three specific cues (attack rate, mode, and pitch height) work together to convey emotion in Bach's Well-Tempered Clavier (WTC). In three experiments, we explore responses to (1) eight-measure excerpts and (2) musically "resolved" excerpts, and (3) investigate the role of different standard dimensional scales of emotion. In each experiment, thirty nonmusician participants rated perceived emotion along scales of valence and intensity (Experiments 1 and 2) or valence and arousal (Experiment 3) for 48 pieces in the WTC. Responses indicate that listeners used attack rate, mode, and pitch height to make judgments of valence, but only attack rate for intensity/arousal. Commonality analyses revealed that mode predicted the most variance in valence ratings, followed by attack rate, with pitch height contributing minimally. In Experiment 2, mode increased in predictive power compared to Experiment 1. In Experiment 3, using "arousal" instead of "intensity" yielded results similar to Experiment 1. We discuss how these results complement and extend previous findings from studies with tightly controlled stimuli, providing additional perspective on complex issues of interpersonal communication.
Music Perception | Pub Date: 2019-09-01 | DOI: 10.1525/mp.2019.37.1.26
Justin M. London, Birgitta Burger, Marc R. Thompson, Molly Hildreth, J. Wilson, Nick Schally, P. Toiviainen
Title: Motown, Disco, and Drumming
Abstract: In a study of tempo perception, London, Burger, Thompson, and Toiviainen (2016) presented participants with digitally "tempo-shifted" R&B songs (i.e., sped up or slowed down without otherwise altering their pitch or timbre). They found that while participants' relative tempo judgments of original versus altered versions were correct, these judgments no longer corresponded to the beat rate of each stimulus. Here we report three experiments that further probe the relations between beat rate, tempo-shifting, beat salience, melodic structure, and perceived tempo. Experiment 1 is a replication of London et al. (2016) using the original stimuli. Experiment 2 replaces the Motown stimuli with disco music, which has higher beat salience. Experiment 3 uses looped drum patterns, eliminating pitch and other cues from the stimuli and maximizing beat salience. The effect found by London et al. (2016) was replicated in Experiment 1, present to a lesser degree in Experiment 2, and absent in Experiment 3. Experiments 2 and 3 also found that participants were able to make tempo judgments in accordance with BPM rates for stimuli that were not tempo-shifted. The roles of beat salience, melodic structure, and memory for tempo are discussed, and the tempo anchoring effect (TAE) is considered as an example of perceptual sharpening.
Music Perception | Pub Date: 2019-06-01 | DOI: 10.1525/MP.2019.36.5.468
A. Pereira, Helena Rodrigues
Title: The Relationship Between Portuguese Children's Use of Singing Voice and Singing Accuracy when Singing with Text and a Neutral Syllable
Abstract: The purpose of this study was to investigate the relationship between Portuguese children's use of singing voice and their singing accuracy on the pitches belonging to the Singing Voice Development Measure (SVDM) criterion patterns (Rutkowski, 2015), as well as the influence of singing with a neutral syllable or with text on both variables. Children aged 4 to 9 (n = 137) were administered the SVDM individually, and three raters evaluated recordings of the children's singing for both use of singing voice (i.e., effective use of pitch range and register) and singing accuracy. Prior to data analysis, the validity and reliability of the measure were examined and assured. A significant relationship was found between the two variables. Significant differences favoring the neutral syllable were found in response mode for singing accuracy, but not for use of singing voice, suggesting that using a neutral syllable in classroom singing activities might help improve accuracy. Older children and girls obtained higher scores for both use of singing voice and accuracy. Within a common pitch range, children with higher SVDM scores sang a higher number of pitches accurately, suggesting that expanding children's use of singing voice might also improve singing accuracy.
Music Perception | Pub Date: 2019-06-01 | DOI: 10.1525/MP.2019.36.5.457
J. D. Zhang, Emery Schubert
Title: A Single Item Measure for Identifying Musician and Nonmusician Categories Based on Measures of Musical Sophistication
Abstract: Musicians are typically identified in research papers by some single item measure (SIM) that focuses on just one component of musicality, such as expertise. Recently, musical sophistication has emerged as a more comprehensive approach, incorporating various components through multiple question items. However, the practice of using SIMs continues. The aim of this paper was to investigate which SIM from the musical sophistication indexes best estimates musical sophistication. The Ollen Musical Sophistication Index (OMSI) and the Goldsmiths Musical Sophistication Index (Gold-MSI) were analyzed. The OMSI musician rank item ("Which title best describes you?") was observed to be the best SIM for predicting OMSI and Gold-MSI scores. Analysis of the OMSI item indicated three parsimonious musical identity categories (MIC): no musical identity (NMI), musical identity (MI), and strong musical identity (SMI). Further analyses of MIC against common SIMs used in the literature showed characteristic profiles. For example, MIC membership according to years of private lessons is: NMI, fewer than 6 years; MI, 6 to 10 years; and SMI, more than 10 years. The finding of the study is that the musician rank SIM should be used because of its face validity, its correlation with musical sophistication, and its plausible demarcation into the three MIC levels.
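The years-of-lessons profile reported in the abstract amounts to a simple threshold rule, which can be written out directly. Note this mapping is illustrative only: the paper derives MIC membership from the OMSI musician rank item, and years of lessons is just one of the characteristic profiles it reports.

```python
def mic_from_private_lessons(years: float) -> str:
    """Map years of private lessons to a musical identity category (MIC),
    using the thresholds stated in the abstract (NMI < 6, MI 6-10, SMI > 10).
    Illustrative sketch; the study itself categorizes via the OMSI rank item."""
    if years < 6:
        return "NMI"   # no musical identity
    elif years <= 10:
        return "MI"    # musical identity
    else:
        return "SMI"   # strong musical identity
```

For example, `mic_from_private_lessons(3)` returns `"NMI"`, `mic_from_private_lessons(8)` returns `"MI"`, and `mic_from_private_lessons(12)` returns `"SMI"`.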
Music Perception | Pub Date: 2019-06-01 | DOI: 10.1525/MP.2019.36.5.480
Paolo Ammirante, Fran Copelli
Title: Vowel Formant Structure Predicts Metric Position in Hip-hop Lyrics
Abstract: In order to be heard over the low-frequency energy of a loud orchestra, opera singers adjust their vocal tracts to increase high-frequency energy around 3,000 Hz (known as a "singer's formant"). In rap music, rhymes often coincide with the beat and thus may be masked by loud, low-frequency percussion events. How do emcees (i.e., rappers) avoid masking of on-beat rhymes? If emcees exploit formant structure, this may be reflected in the distribution of on- and off-beat vowels. To test this prediction, we used a sample of words from the MCFlow rap lyric corpus (Condit-Schultz, 2016) and compared the frequency of occurrence of on- and off-beat words. Each word contained one of eight vowel nuclei; population estimates of each vowel's first and second formant (F1 and F2) frequencies were obtained from an existing source. A bias was observed: vowels with higher F2, which are less likely to be masked by percussion, were favored for on-beat words, whereas words with lower-F2 vowels, which may be masked, were more likely to deviate from the beat. The bias was most evident among rhyming words but persisted for nonrhyming words. These findings imply that emcees use formant structure to implicitly or explicitly target the intelligibility of salient lyric events.
Music Perception | Pub Date: 2019-06-01 | DOI: 10.1525/MP.2019.36.5.435
Kathleen A. Corrigall, L. Trainor
Title: Electrophysiological Correlates of Key and Harmony Processing in 3-year-old Children
Abstract: Infants and children are able to track statistical regularities in perceptual input, which allows them to acquire structural aspects of language and music, such as syntax. However, much more is known about the development of linguistic than of musical syntax. In the present study, we examined 3.5-year-olds' implicit knowledge of Western musical pitch structure using electroencephalography (EEG). Event-related potentials (ERPs) were measured while children listened to chord sequences that either 1) followed Western harmony rules, 2) ended on a chord outside the key, or 3) ended on an in-key but harmonically less expected chord. Whereas adults tend to show an early right anterior negativity (ERAN) in response to unexpected chords (Koelsch, 2009), the 3.5-year-olds in our study showed an immature response that was positive rather than negative in polarity. Our results suggest that very young children exhibit implicit knowledge of the pitch structure of Western music years before they have been shown to demonstrate that knowledge in behavioral tasks.
Music Perception | Pub Date: 2019-06-01 | DOI: 10.1525/MP.2019.36.5.448
Douglas A. Kowalewski, R. Friedman, Stan Zavoyskiy, W. Neill
Title: A Reinvestigation of the Source Dilemma Hypothesis
Abstract: In a recent article, Bonin, Trainor, Belyk, and Andrews (2016) proposed a novel way in which basic processes of auditory perception may influence affective responses to music. According to their source dilemma hypothesis (SDH), the relative fluency of one aspect of musical processing, the parsing of music into distinct auditory streams, is hedonically marked: efficient stream segregation elicits pleasant affective experience, whereas inefficient segregation results in unpleasant affective experience, thereby contributing to (dis)preference for a musical stimulus. Bonin et al. (2016) conducted two experiments whose results were ostensibly consistent with the SDH. However, their research designs introduced major confounds that undermined the ability of these initial studies to offer unequivocal evidence for the hypothesis. To address this, we conducted a large-scale (N = 311) constructive replication of Experiment 2 of Bonin et al. (2016), significantly modifying the design to rectify these methodological shortfalls and thereby better assess the validity of the SDH. The results replicated those of Bonin et al. (2016), although they indicate that source dilemma effects on music preference may be more modest than the original findings suggest. Unresolved issues and directions for future investigation of the SDH are discussed.