Enhancing multi-modal aspect-based sentiment classification via emotional semantic-aware cross-modal relation inference

Impact Factor 6.9 · JCR Q1, Computer Science, Information Systems · CAS Zone 1 (Management Science)
Zhaoyu Li, Chen Gong, Guohong Fu
DOI: 10.1016/j.ipm.2025.104427
Information Processing & Management, Volume 63, Issue 2, Article 104427. Published 2025-10-03.
Citations: 0

Abstract

Multi-modal Aspect-based Sentiment Classification (MASC) determines the sentiment polarity of specific aspects in text–image pairs. Recent research has explored leveraging image–text relevance to improve MASC performance. However, existing approaches primarily focus on explicit alignments between textual aspects and visual objects or on the global relevance between entire texts and images, often overlooking the implicit emotional connections specific to aspects. In this work, we propose an aspect-level emotional cross-modal relation scheme that captures both explicit alignments and implicit emotional connections between text and image. Based on this scheme, we construct a new dataset, the Aspect-level Emotional Cross-modal Relevance dataset (AECR-Twitter), which contains 3,562 image–text pairs. We also introduce several methods for integrating cross-modal relevance into MASC. Experimental results across eight different model architectures consistently demonstrate the effectiveness of our aspect-level emotional cross-modal relation scheme in enhancing MASC performance, with F1 scores increasing by an average of 1.26% on Twitter-15 and 1.28% on Twitter-17. We release our data and code at https://github.com/li9527yu/AECR-Twitter.
Source journal

Information Processing & Management (Engineering & Technology — Computer Science: Information Systems)

CiteScore: 17.00
Self-citation rate: 11.60%
Articles per year: 276
Review time: 39 days
Journal description: Information Processing and Management is dedicated to publishing cutting-edge original research at the convergence of computing and information science. Its scope encompasses theory, methods, and applications across various domains, including advertising, business, health, information science, information technology marketing, and social computing. The journal aims to serve both primary researchers and practitioners by offering an effective platform for the timely dissemination of advanced and topical issues in this interdisciplinary field, with particular emphasis on original research articles, research survey articles, research method articles, and articles addressing critical applications of research.