{"title":"A Whole Brain EEG Analysis of Musicianship","authors":"Estela Ribeiro, C. Thomaz","doi":"10.1525/mp.2019.37.1.42","DOIUrl":null,"url":null,"abstract":"The neural activation patterns provoked in response to music listening can reveal whether a subject did or did not receive music training. In the current exploratory study, we have approached this two-group (musicians and nonmusicians) classification problem through a computational framework composed of the following steps: Acoustic features extraction; Acoustic features selection; Trigger selection; EEG signal processing; and Multivariate statistical analysis. We are particularly interested in analyzing the brain data on a global level, considering its activity registered in electroencephalogram (EEG) signals on a given time instant. Our experiment's results—with 26 volunteers (13 musicians and 13 nonmusicians) who listened the classical music Hungarian Dance No. 5 from Johannes Brahms—have shown that is possible to linearly differentiate musicians and nonmusicians with classification accuracies that range from 69.2% (test set) to 93.8% (training set), despite the limited sample sizes available. Additionally, given the whole brain vector navigation method described and implemented here, our results suggest that it is possible to highlight the most expressive and discriminant changes in the participants brain activity patterns depending on the acoustic feature extracted from the audio.","PeriodicalId":47786,"journal":{"name":"Music Perception","volume":null,"pages":null},"PeriodicalIF":1.3000,"publicationDate":"2019-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1525/mp.2019.37.1.42","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Music Perception","FirstCategoryId":"102","ListUrlMain":"https://doi.org/10.1525/mp.2019.37.1.42","RegionNum":2,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"0","JCRName":"MUSIC","Score":null,"Total":0}
Cited by: 3
Abstract
The neural activation patterns evoked by music listening can reveal whether a subject has received musical training. In the current exploratory study, we approached this two-group (musicians and nonmusicians) classification problem through a computational framework composed of the following steps: acoustic feature extraction; acoustic feature selection; trigger selection; EEG signal processing; and multivariate statistical analysis. We are particularly interested in analyzing brain data at a global level, considering the activity registered in electroencephalogram (EEG) signals at a given time instant. Our experimental results, obtained with 26 volunteers (13 musicians and 13 nonmusicians) who listened to Hungarian Dance No. 5 by Johannes Brahms, show that it is possible to linearly differentiate musicians from nonmusicians, with classification accuracies ranging from 69.2% (test set) to 93.8% (training set), despite the limited sample size available. Additionally, given the whole-brain vector navigation method described and implemented here, our results suggest that it is possible to highlight the most expressive and discriminant changes in the participants' brain activity patterns, depending on the acoustic feature extracted from the audio.
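The record does not spell out the paper's implementation, but the abstract's references to linear differentiation and to the "most expressive and discriminant" patterns are consistent with a PCA-plus-LDA style of multivariate analysis. The sketch below is a minimal, hypothetical illustration of that final classification step on whole-brain EEG vectors; the channel count (64), the number of principal components, the train/test split, and the synthetic data are all assumptions, not details from the paper.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical data: one whole-brain feature vector per participant,
# formed by concatenating all EEG channel values at a trigger instant.
# 26 participants (13 musicians, 13 nonmusicians); 64 channels is assumed.
n_per_group, n_channels = 13, 64
X = rng.normal(size=(2 * n_per_group, n_channels))
X[n_per_group:] += 0.5  # synthetic group offset so the classes are separable
y = np.array([0] * n_per_group + [1] * n_per_group)  # 0 = nonmusician, 1 = musician

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

# "Most expressive" directions: PCA keeps the components of largest variance,
# reducing dimensionality well below the number of training samples.
pca = PCA(n_components=10).fit(X_train)
Z_train, Z_test = pca.transform(X_train), pca.transform(X_test)

# "Most discriminant" direction: LDA finds the linear axis that best
# separates the two groups within the PCA subspace.
lda = LinearDiscriminantAnalysis().fit(Z_train, y_train)

print(f"training accuracy: {lda.score(Z_train, y_train):.3f}")
print(f"test accuracy:     {lda.score(Z_test, y_test):.3f}")
```

Running PCA before LDA is the standard remedy when there are far fewer samples than features, as here (26 participants vs. thousands of channel-by-time values): it regularizes the problem so the within-class scatter matrix that LDA relies on is not singular. It also makes the gap between training and test accuracy reported in the abstract (93.8% vs. 69.2%) unsurprising for such a small sample.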
Journal description:
Music Perception charts the ongoing scholarly discussion and study of musical phenomena. Publishing original empirical and theoretical papers, methodological articles, and critical reviews from renowned scientists and musicians, Music Perception is a repository of insightful research. The broad range of disciplines covered in the journal includes:
• Psychology
• Psychophysics
• Linguistics
• Neurology
• Neurophysiology
• Artificial intelligence
• Computer technology
• Physical and architectural acoustics
• Music theory