IMMA-Emo: A Multimodal Interface for Visualising Score- and Audio-synchronised Emotion Annotations

Dorien Herremans, Simin Yang, C. Chuan, M. Barthet, E. Chew
DOI: 10.1145/3123514.3123545
Published in: Proceedings of the 12th International Audio Mostly Conference on Augmented and Participatory Sound and Music Experiences
Publication date: 2017-08-23
Citations: 6

Abstract

Emotional response to music is often represented on a two-dimensional arousal-valence space without reference to score information that may provide critical cues to explain the observed data. To bridge this gap, we present IMMA-Emo, an integrated software system for visualising emotion data aligned with music audio and score, so as to provide an intuitive way to interactively visualise and analyse music emotion data. The visual interface also allows for the comparison of multiple emotion time series. The IMMA-Emo system builds on the online interactive Multi-modal Music Analysis (IMMA) system. Two examples demonstrating the capabilities of the IMMA-Emo system are drawn from an experiment set up to collect arousal-valence ratings based on participants' perceived emotions during a live performance. Direct observation of corresponding score parts and aural input from the recording allow explanatory factors to be identified for the ratings and changes in the ratings.
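The comparison of multiple emotion time series implies that ratings collected from different participants, possibly sampled at different instants, must be placed on a common timeline before they can be compared. A minimal sketch of that alignment step is shown below; the function and rater data are hypothetical illustrations, not the IMMA-Emo implementation.

```python
import numpy as np

def align_series(times, values, common_times):
    """Linearly resample one rater's arousal (or valence) track
    onto a shared timeline so tracks can be compared point-wise."""
    return np.interp(common_times, times, values)

# Two hypothetical raters, sampled at different instants (seconds)
t_a = np.array([0.0, 2.0, 4.0, 6.0])
arousal_a = np.array([0.1, 0.5, 0.9, 0.4])
t_b = np.array([0.0, 3.0, 6.0])
arousal_b = np.array([0.2, 0.8, 0.3])

# Shared 1 Hz timeline covering the excerpt
common = np.arange(0.0, 6.1, 1.0)
a = align_series(t_a, arousal_a, common)
b = align_series(t_b, arousal_b, common)

# Pearson correlation as one simple agreement measure between raters
r = np.corrcoef(a, b)[0, 1]
```

Once tracks share a timeline, point-wise differences or correlations can be inspected alongside the score and audio to locate where raters diverge.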