Guimin Hu, Yi Xin, Weimin Lyu, Haojian Huang, Chang Sun, Zhihong Zhu, Lin Gui, Ruichu Cai
arXiv:2409.07388 · arXiv - CS - Computation and Language · 2024-09-11
Recent Trends of Multimodal Affective Computing: A Survey from NLP Perspective
Multimodal affective computing (MAC) has garnered increasing attention due to its broad applications in analyzing human behaviors and intentions, especially in the text-dominated multimodal affective computing field. This survey presents recent trends in multimodal affective computing from an NLP perspective through four prominent tasks: multimodal sentiment analysis, multimodal emotion recognition in conversation, multimodal aspect-based sentiment analysis, and multimodal multi-label emotion recognition. Its goal is to explore the current landscape of multimodal affective research, identify development trends, and highlight the similarities and differences across these tasks, offering a comprehensive report on recent progress in the field. The survey covers the formalization of each task, provides an overview of relevant works, describes benchmark datasets, and details the corresponding evaluation metrics. It also briefly discusses research in multimodal affective computing involving facial expressions, acoustic signals, physiological signals, and emotion causes, along with the technical approaches, challenges, and future directions of the field. To support further research, we have released a repository that compiles related works in multimodal affective computing, providing detailed resources and references for the community.