{"title":"A study of danmu: Detecting emotional coherence in music videos through synchronized EEG analysis","authors":"Yuqing Liu , Bu Zhong , Jiaxuan Wang , Yao Song","doi":"10.1016/j.chb.2025.108803","DOIUrl":null,"url":null,"abstract":"<div><div>A novel approach is essential to assess viewers' emotional responses to online music videos, as the emotional coherence between perceived and induced reactions has not been thoroughly explored. This research investigates the relationship between perceived and induced emotional responses to music videos through a unique multimodal framework that integrates electroencephalography (EEG) analysis with natural language processing to examine danmu—user-generated scrolling marquee comments synchronized to specific playback times. Employing a time-synchronized methodology, our deep learning model predicted continuous emotional scores from EEG signals based on danmu sentiment. The findings revealed an over 80 % similarity between the two forms of induced emotional data: EEG-derived emotion curves and danmu sentiment curves across five music videos. We explored periods of divergence by contrasting peak emotional responses during the climaxes of the music, highlighting the significant influence of the multimodal sentiment tone on the alignment between neurophysiological and behavioral emotional trajectories. This study uncovers the coherence between emotion curves derived from EEG and danmu data—a methodology that notably diverges from traditional reliance on self-reports or surveys. The partial consistency observed between perceived and induced emotions, along with the effects of emotional valence and arousal on brain-behavior synchronization, underscores the shared nature of emotions elicited by music videos. Contributing factors include the diversity of emotional experiences and expressions among individuals, as well as the intrinsic rhythmicity within music videos, both of which enhance emotional elicitation.</div></div>","PeriodicalId":48471,"journal":{"name":"Computers in Human Behavior","volume":"174 ","pages":"Article 108803"},"PeriodicalIF":8.9000,"publicationDate":"2025-09-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computers in Human Behavior","FirstCategoryId":"102","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S074756322500250X","RegionNum":1,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PSYCHOLOGY, EXPERIMENTAL","Score":null,"Total":0}
Citations: 0
Abstract
A novel approach is needed to assess viewers' emotional responses to online music videos, as the emotional coherence between perceived and induced reactions has not been thoroughly explored. This research investigates the relationship between perceived and induced emotional responses to music videos through a multimodal framework that integrates electroencephalography (EEG) analysis with natural language processing to examine danmu, the user-generated scrolling comments synchronized to specific playback times. Using a time-synchronized methodology, our deep learning model predicted continuous emotional scores from EEG signals based on danmu sentiment. The findings revealed over 80% similarity between the two forms of induced emotional data, EEG-derived emotion curves and danmu sentiment curves, across five music videos. We explored periods of divergence by contrasting peak emotional responses during the climaxes of the music, highlighting the significant influence of multimodal sentiment tone on the alignment between neurophysiological and behavioral emotional trajectories. This study uncovers the coherence between emotion curves derived from EEG and danmu data, a methodology that diverges notably from the traditional reliance on self-reports or surveys. The partial consistency observed between perceived and induced emotions, together with the effects of emotional valence and arousal on brain-behavior synchronization, underscores the shared nature of emotions elicited by music videos. Contributing factors include the diversity of emotional experiences and expressions among individuals and the intrinsic rhythmicity of music videos, both of which enhance emotional elicitation.
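The abstract does not specify how the "over 80% similarity" between the EEG-derived emotion curves and the danmu sentiment curves was computed. As an illustration only, the minimal sketch below resamples two time-stamped emotion curves onto a shared grid and scores their agreement with Pearson correlation; the function names (`curve_similarity`, `resample_curve`), the interpolation scheme, and the choice of metric are assumptions for illustration, not the authors' published method.

```python
import numpy as np
from scipy.interpolate import interp1d
from scipy.stats import pearsonr


def resample_curve(times, values, grid):
    """Linearly interpolate an emotion curve onto a common time grid.

    Outside the curve's own time range, the endpoint values are held
    constant rather than extrapolated.
    """
    f = interp1d(times, values, bounds_error=False,
                 fill_value=(values[0], values[-1]))
    return f(grid)


def curve_similarity(eeg_times, eeg_scores, danmu_times, danmu_scores,
                     n_points=200):
    """Agreement between an EEG-derived emotion curve and a danmu
    sentiment curve over their overlapping playback window.

    Returns the Pearson correlation of the two curves after both are
    resampled to the same n_points time grid (a stand-in metric; the
    paper's actual similarity measure is not given in the abstract).
    """
    t0 = max(eeg_times[0], danmu_times[0])    # start of overlap
    t1 = min(eeg_times[-1], danmu_times[-1])  # end of overlap
    grid = np.linspace(t0, t1, n_points)
    eeg = resample_curve(eeg_times, eeg_scores, grid)
    dm = resample_curve(danmu_times, danmu_scores, grid)
    r, _ = pearsonr(eeg, dm)
    return r
```

Resampling to a shared grid is the step that matters here: EEG scores arrive at a fixed sampling rate while danmu comments are posted at irregular playback times, so the two curves must be aligned on a common timeline before any pointwise similarity score is meaningful.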
Journal Introduction
Computers in Human Behavior is a scholarly journal that explores the psychological aspects of computer use. It covers original theoretical works, research reports, literature reviews, and software and book reviews. The journal examines both the use of computers in psychology, psychiatry, and related fields, and the psychological impact of computer use on individuals, groups, and society. Articles discuss topics such as professional practice, training, research, human development, learning, cognition, personality, and social interactions. It focuses on human interactions with computers, considering the computer as a medium through which human behaviors are shaped and expressed. Professionals interested in the psychological aspects of computer use will find this journal valuable, even with limited knowledge of computers.