A study of danmu: Detecting emotional coherence in music videos through synchronized EEG analysis

Impact Factor 8.9, CAS Tier 1 (Psychology), Q1 in Psychology, Experimental
Yuqing Liu, Bu Zhong, Jiaxuan Wang, Yao Song
{"title":"A study of danmu: Detecting emotional coherence in music videos through synchronized EEG analysis","authors":"Yuqing Liu ,&nbsp;Bu Zhong ,&nbsp;Jiaxuan Wang ,&nbsp;Yao Song","doi":"10.1016/j.chb.2025.108803","DOIUrl":null,"url":null,"abstract":"<div><div>A novel approach is essential to assess viewers' emotional responses to online music videos, as the emotional coherence between perceived and induced reactions has not been thoroughly explored. This research investigates the relationship between perceived and induced emotional responses to music videos through a unique multimodal framework that integrates electroencephalography (EEG) analysis with natural language processing to examine danmu—user-generated scrolling marquee comments synchronized to specific playback times. Employing a time-synchronized methodology, our deep learning model predicted continuous emotional scores from EEG signals based on danmu sentiment. The findings revealed an over 80 % similarity between the two forms of induced emotional data: EEG-derived emotion curves and danmu sentiment curves across five music videos. We explored periods of divergence by contrasting peak emotional responses during the climaxes of the music, highlighting the significant influence of the multimodal sentiment tone on the alignment between neurophysiological and behavioral emotional trajectories. This study uncovers the coherence between emotion curves derived from EEG and danmu data—a methodology that notably diverges from traditional reliance on self-reports or surveys. The partial consistency observed between perceived and induced emotions, along with the effects of emotional valence and arousal on brain-behavior synchronization, underscores the shared nature of emotions elicited by music videos. Contributing factors include the diversity of emotional experiences and expressions among individuals, as well as the intrinsic rhythmicity within music videos, both of which enhance emotional elicitation.</div></div>","PeriodicalId":48471,"journal":{"name":"Computers in Human Behavior","volume":"174 ","pages":"Article 108803"},"PeriodicalIF":8.9000,"publicationDate":"2025-09-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computers in Human Behavior","FirstCategoryId":"102","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S074756322500250X","RegionNum":1,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PSYCHOLOGY, EXPERIMENTAL","Score":null,"Total":0}
Citations: 0

Abstract

A novel approach is essential to assess viewers' emotional responses to online music videos, as the emotional coherence between perceived and induced reactions has not been thoroughly explored. This research investigates the relationship between perceived and induced emotional responses to music videos through a unique multimodal framework that integrates electroencephalography (EEG) analysis with natural language processing to examine danmu—user-generated scrolling marquee comments synchronized to specific playback times. Employing a time-synchronized methodology, our deep learning model predicted continuous emotional scores from EEG signals based on danmu sentiment. The findings revealed an over 80 % similarity between the two forms of induced emotional data: EEG-derived emotion curves and danmu sentiment curves across five music videos. We explored periods of divergence by contrasting peak emotional responses during the climaxes of the music, highlighting the significant influence of the multimodal sentiment tone on the alignment between neurophysiological and behavioral emotional trajectories. This study uncovers the coherence between emotion curves derived from EEG and danmu data—a methodology that notably diverges from traditional reliance on self-reports or surveys. The partial consistency observed between perceived and induced emotions, along with the effects of emotional valence and arousal on brain-behavior synchronization, underscores the shared nature of emotions elicited by music videos. Contributing factors include the diversity of emotional experiences and expressions among individuals, as well as the intrinsic rhythmicity within music videos, both of which enhance emotional elicitation.
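To make the time-synchronized comparison concrete, the sketch below bins danmu sentiment scores into per-second segments and compares the resulting curve to an EEG-predicted valence curve using a correlation-based similarity score. This is a minimal illustration under stated assumptions, not the authors' pipeline: the function names, one-second bin size, synthetic data, and the choice of rescaled Pearson correlation as the similarity measure are all assumptions, since the abstract does not specify how the reported similarity is computed.

```python
# Hypothetical sketch of a time-synchronized EEG-danmu comparison.
# All names, parameters, and the similarity metric are illustrative assumptions.
import numpy as np

def danmu_sentiment_curve(timestamps, sentiments, duration, bin_size=1.0):
    """Average per-comment sentiment scores into fixed-width time bins."""
    n_bins = int(np.ceil(duration / bin_size))
    sums = np.zeros(n_bins)
    counts = np.zeros(n_bins)
    for t, s in zip(timestamps, sentiments):
        idx = min(int(t // bin_size), n_bins - 1)  # clamp comments at the video's end
        sums[idx] += s
        counts[idx] += 1
    mean_s = float(np.mean(sentiments)) if len(sentiments) else 0.0
    # Bins with no comments fall back to the overall mean sentiment.
    return np.where(counts > 0, sums / np.maximum(counts, 1), mean_s)

def curve_similarity(eeg_curve, danmu_curve):
    """Rescale Pearson correlation of two time-aligned curves to [0, 1]."""
    n = min(len(eeg_curve), len(danmu_curve))  # truncate to the common length
    r = np.corrcoef(eeg_curve[:n], danmu_curve[:n])[0, 1]
    return (r + 1.0) / 2.0

# Synthetic demo (not data from the study): a 3-minute video sampled at 1 Hz.
rng = np.random.default_rng(0)
t = np.arange(0, 180, 1.0)
eeg_valence = np.sin(t / 20.0) + 0.1 * rng.standard_normal(t.size)

comment_times = rng.uniform(0, 180, size=500)
comment_sent = np.sin(comment_times / 20.0) + 0.2 * rng.standard_normal(500)

dm_curve = danmu_sentiment_curve(comment_times, comment_sent, duration=180)
print(f"curve similarity: {curve_similarity(eeg_valence, dm_curve):.2f}")
```

A different alignment or metric, such as windowed cross-correlation or dynamic time warping distance, could stand in for curve_similarity; the abstract leaves that choice unspecified.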
Source journal metrics
CiteScore: 19.10
Self-citation rate: 4.00%
Articles published per year: 381
Review time: 40 days
Journal description: Computers in Human Behavior is a scholarly journal that explores the psychological aspects of computer use. It covers original theoretical works, research reports, literature reviews, and software and book reviews. The journal examines both the use of computers in psychology, psychiatry, and related fields, and the psychological impact of computer use on individuals, groups, and society. Articles discuss topics such as professional practice, training, research, human development, learning, cognition, personality, and social interactions. It focuses on human interactions with computers, considering the computer as a medium through which human behaviors are shaped and expressed. Professionals interested in the psychological aspects of computer use will find this journal valuable, even with limited knowledge of computers.