{"title":"Towards Music Information Retrieval driven by EEG signals: Architecture and preliminary experiments","authors":"Yuuko Morita, Hung-Hsuan Huang, K. Kawagoe","doi":"10.1109/ICIS.2013.6607843","DOIUrl":null,"url":null,"abstract":"Although much research on Music Information Retrieval (MIR) has been done in the last decade, the input of the current MIR to specify a user query for finding a similar piece of music is still either by the existing old-fashioned keywords or by music contents. We aim to realize a new type of MIR equipped with brain-computer interfaces using electroencephalogram (EEG) signals. Toward the new MIR, we propose an architecture of MIR driven by EEG signals in this paper. While the architecture contains many issues to be solved, the point of the architecture is to construct user's music query in multi-layered aggregation of EEG signals. We describe in this paper the preliminary experiments conducted for selecting some appropriate low-level features for our multi-layered query construction and matching. It is obtained that the mental states of users while listening to music can be classified with high accuracy by using EEG signal aggregated features. 
We are starting development of detailed design of the architecture using the results described in the paper.","PeriodicalId":345020,"journal":{"name":"2013 IEEE/ACIS 12th International Conference on Computer and Information Science (ICIS)","volume":"175 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2013-06-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2013 IEEE/ACIS 12th International Conference on Computer and Information Science (ICIS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICIS.2013.6607843","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 6
Abstract
Although much research on Music Information Retrieval (MIR) has been done in the last decade, current MIR systems still specify a user query for finding a similar piece of music either through conventional keywords or through music content. We aim to realize a new type of MIR equipped with a brain-computer interface based on electroencephalogram (EEG) signals. Toward this goal, we propose in this paper an architecture for MIR driven by EEG signals. While the architecture leaves many issues to be solved, its key idea is to construct the user's music query through multi-layered aggregation of EEG signals. We describe preliminary experiments conducted to select appropriate low-level features for our multi-layered query construction and matching. The results show that the mental states of users while listening to music can be classified with high accuracy using aggregated EEG signal features. Based on these results, we are starting development of the detailed design of the architecture.
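The pipeline the abstract outlines, aggregating low-level EEG features and classifying the listener's mental state from them, can be sketched roughly as follows. Every specific here is an illustrative assumption rather than a detail taken from the paper: the sampling rate, the theta/alpha/beta frequency bands used as low-level features, window-averaging as the aggregation step, and a nearest-centroid classifier standing in for whatever classifier the authors evaluated.

```python
import numpy as np

FS = 128  # assumed sampling rate in Hz (illustrative, not from the paper)


def band_power(signal, fs, lo, hi):
    """Mean spectral power of `signal` within the [lo, hi) Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].mean()


def aggregate_features(eeg, fs=FS, window=2 * FS):
    """Low-level features per window (theta/alpha/beta band powers),
    aggregated by averaging over all windows of the recording."""
    bands = [(4, 8), (8, 13), (13, 30)]  # theta, alpha, beta (Hz)
    feats = []
    for start in range(0, len(eeg) - window + 1, window):
        seg = eeg[start:start + window]
        feats.append([band_power(seg, fs, lo, hi) for lo, hi in bands])
    return np.mean(feats, axis=0)


def classify(feat, centroids):
    """Nearest-centroid classifier over aggregated feature vectors.

    `centroids` maps a mental-state label to its mean feature vector."""
    labels = list(centroids)
    dists = [np.linalg.norm(feat - centroids[k]) for k in labels]
    return labels[int(np.argmin(dists))]
```

As a usage sketch, one could build centroids from recordings labeled "relaxed" (alpha-dominant activity) and "focused" (beta-dominant activity), then classify a new recording by its aggregated feature vector. The multi-layered aspect of the proposed architecture would stack further aggregation on top of these window-level features; this sketch shows only the lowest layer.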