Beyond algorithms: Utilizing multi-modal emotional and behavioral cues as novel predictors of short-video consumption

Minglan Li, Yipeng Yu, Xu Liu, Junqing Wu, Qiandong Wang, Yueqin Hu

Computers in Human Behavior Reports, Volume 20, Article 100805. DOI: 10.1016/j.chbr.2025.100805. Published 2025-09-10.
The short-video industry has experienced rapid growth in recent years, largely driven by advanced content recommendation systems. While much of the existing research has concentrated on algorithmic improvements, the psychological factors influencing viewing behaviors remain underexplored. This study aims to address this gap by incorporating users' emotional and behavioral indicators into the prediction of short-video viewing behavior. Study 1 was conducted in a controlled laboratory setting, where participants viewed videos on a computer screen while their physiological activity (including electrocardiography and electrodermal activity) was recorded as an objective measure of emotional responses. After viewing each video, participants self-reported their emotions and viewing preferences. Employing a variety of machine learning techniques, we found that both self-reported and physiologically measured emotions were strong predictors of viewing behaviors, with a predictive accuracy exceeding 72%. Study 2 aimed to enhance ecological validity by having participants view videos on mobile phones, enabling them to swipe between videos as they would in a typical short-video app. Using physiological signals and mobile edge data (including swipe gestures and gyroscope signals), the predictive accuracy for actual viewing behavior reached 82%. Additionally, a substantial portion of the variance in edge signals could be explained by physiological signals. These findings provide valuable insights into the psychological drivers of short-video viewing behavior and present a novel, non-intrusive approach to incorporating users' real-time experiences into content recommendation systems.
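To make the prediction setup concrete, the following is a minimal illustrative sketch, not the authors' actual pipeline: it assumes hypothetical per-clip summary features derived from ECG (e.g., mean heart rate, heart-rate variability) and EDA (e.g., skin-conductance response counts), a binary viewing-preference label, and an off-the-shelf classifier evaluated with cross-validation. All feature names, data, and model choices here are placeholders.

```python
# Illustrative sketch only: classifying viewing preference from hypothetical
# physiological summary features. Data are random placeholders; the real study
# used recorded ECG/EDA signals and (in Study 2) mobile edge data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_clips = 200  # hypothetical number of video-viewing trials

# Placeholder feature matrix, one row per clip:
# [mean_heart_rate, hrv_rmssd, scr_count, mean_skin_conductance_level]
X = rng.normal(size=(n_clips, 4))

# Placeholder binary label: 1 = participant preferred / kept watching the clip
y = rng.integers(0, 2, size=n_clips)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"Cross-validated accuracy: {scores.mean():.2f}")
```

With real features carrying signal (rather than random noise), accuracy in this kind of setup is what the reported 72% (Study 1) and 82% (Study 2) figures refer to; the specific model families and feature engineering used in the paper are described in the full text.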