Perceptual rhythm determination of music signal for emotion-based classification

Bee Yong Chua, Guojun Lu
DOI: 10.1109/MMMC.2006.1651295
Published in: 2006 12th International Multi-Media Modelling Conference
Publication date: 2006-07-24
Citations: 5

Abstract

Music information retrieval (MIR) systems that can classify and retrieve music pieces by emotional expression are still in their infancy. The challenge lies in automatically extracting perceptual features from the music signal. Three rhythmic features that influence perceived emotional expression in music are tempo (fast/slow), articulation (staccato/legato), and motion (firm/flowing, where a firm character is evoked mainly by variation in loudness among events or by durational variation between events). So far, only some of these rhythmic features have been extracted and used for emotion classification; as a result, either the classification results were unsatisfactory or only broad emotion categories could be distinguished. In this paper, we propose efficient and effective algorithms to determine these three rhythmic features, based on findings from music psychology and psychoacoustics research. Experimental results on polyphonic music extracts, taken mainly from CD recordings, show that the proposed algorithms are effective in determining these three rhythmic features.
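The paper itself details the feature-extraction algorithms; as a rough, generic illustration of the first feature only (not the authors' method), tempo can be estimated from an onset-strength envelope by autocorrelation: the lag of the strongest periodicity within a plausible beat range maps directly to beats per minute. The sketch below uses a synthetic envelope with impulses every 0.5 s, so the expected tempo is 120 BPM; the function name and the 40-200 BPM search range are assumptions for illustration.

```python
import numpy as np

def estimate_tempo_bpm(onset_env, frame_rate):
    """Estimate tempo (BPM) from an onset-strength envelope via
    autocorrelation -- a generic sketch, not the paper's algorithm."""
    env = onset_env - onset_env.mean()          # remove DC offset
    ac = np.correlate(env, env, mode="full")[len(env) - 1:]
    # Restrict the lag search to tempos between 40 and 200 BPM
    min_lag = int(frame_rate * 60 / 200)
    max_lag = int(frame_rate * 60 / 40)
    lag = min_lag + int(np.argmax(ac[min_lag:max_lag + 1]))
    return 60.0 * frame_rate / lag              # lag (frames) -> BPM

# Synthetic envelope: an impulse every 0.5 s at 100 frames/s (= 120 BPM)
frame_rate = 100
env = np.zeros(1000)
env[::50] = 1.0
print(estimate_tempo_bpm(env, frame_rate))  # 120.0
```

On real polyphonic audio the envelope would come from an onset-detection front end, and articulation and motion would require additional event-level duration and loudness analysis, which this toy sketch does not attempt.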