CGT: A Clause Graph Transformer Structure for aspect-based sentiment analysis

IF 2.7 · CAS Tier 3 (Computer Science) · JCR Q3 (Computer Science, Artificial Intelligence)
Zelong Su, Bin Gao, Xiaoou Pan, Zhengjun Liu, Yu Ji, Shutian Liu
{"title":"CGT:用于基于方面的情感分析的条款图转换器结构","authors":"Zelong Su ,&nbsp;Bin Gao ,&nbsp;Xiaoou Pan ,&nbsp;Zhengjun Liu ,&nbsp;Yu Ji ,&nbsp;Shutian Liu","doi":"10.1016/j.datak.2024.102332","DOIUrl":null,"url":null,"abstract":"<div><p>In the realm of natural language processing (NLP), aspect-based sentiment analysis plays a pivotal role. Recently, there has been a growing emphasis on techniques leveraging Graph Convolutional Neural Network (GCN). However, there are several challenges associated with current approaches: (1) Due to the inherent transitivity of CGN, training inevitably entails the acquisition of irrelevant semantic information. (2) Existing methodologies heavily depend on the dependency tree, neglecting to consider the contextual structure of the sentence. (3) Another limitation of the majority of methods is their failure to account for the interactions occurring between different aspects. In this study, we propose a Clause Graph Transformer Structure (CGT) to alleviate these limitations. Specifically, CGT comprises three modules. The preprocessing module extracts aspect clauses from each sentence by bi-directionally traversing the constituent tree, reducing reliance on syntax trees and extracting semantic information from the perspective of clauses. Additionally, we assert that a word’s vector direction signifies its underlying attitude in the semantic space, a feature often overlooked in recent research. Without the necessity for additional parameters, we introduce the Clause Attention encoder (CA-encoder) to the clause module to effectively capture the directed cross-correlation coefficient between the clause and the target aspect. To enhance the representation of the target component, we propose capturing the connections between various aspects. In the inter-aspect module, we intricately design a Balanced Attention encoder (BA-encoder) that forms an aspect sequence by navigating the extracted phrase tree. To effectively capture the emotion of implicit components, we introduce a Top-K Attention Graph Convolutional Network (KA-GCN). Our proposed method has showcased state-of-the-art (SOTA) performance through experiments conducted on four widely used datasets. Furthermore, our model demonstrates a significant improvement in the robustness of datasets subjected to disturbances.</p></div>","PeriodicalId":55184,"journal":{"name":"Data & Knowledge Engineering","volume":null,"pages":null},"PeriodicalIF":2.7000,"publicationDate":"2024-06-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"CGT: A Clause Graph Transformer Structure for aspect-based sentiment analysis\",\"authors\":\"Zelong Su ,&nbsp;Bin Gao ,&nbsp;Xiaoou Pan ,&nbsp;Zhengjun Liu ,&nbsp;Yu Ji ,&nbsp;Shutian Liu\",\"doi\":\"10.1016/j.datak.2024.102332\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>In the realm of natural language processing (NLP), aspect-based sentiment analysis plays a pivotal role. Recently, there has been a growing emphasis on techniques leveraging Graph Convolutional Neural Network (GCN). However, there are several challenges associated with current approaches: (1) Due to the inherent transitivity of CGN, training inevitably entails the acquisition of irrelevant semantic information. (2) Existing methodologies heavily depend on the dependency tree, neglecting to consider the contextual structure of the sentence. 
(3) Another limitation of the majority of methods is their failure to account for the interactions occurring between different aspects. In this study, we propose a Clause Graph Transformer Structure (CGT) to alleviate these limitations. Specifically, CGT comprises three modules. The preprocessing module extracts aspect clauses from each sentence by bi-directionally traversing the constituent tree, reducing reliance on syntax trees and extracting semantic information from the perspective of clauses. Additionally, we assert that a word’s vector direction signifies its underlying attitude in the semantic space, a feature often overlooked in recent research. Without the necessity for additional parameters, we introduce the Clause Attention encoder (CA-encoder) to the clause module to effectively capture the directed cross-correlation coefficient between the clause and the target aspect. To enhance the representation of the target component, we propose capturing the connections between various aspects. In the inter-aspect module, we intricately design a Balanced Attention encoder (BA-encoder) that forms an aspect sequence by navigating the extracted phrase tree. To effectively capture the emotion of implicit components, we introduce a Top-K Attention Graph Convolutional Network (KA-GCN). Our proposed method has showcased state-of-the-art (SOTA) performance through experiments conducted on four widely used datasets. Furthermore, our model demonstrates a significant improvement in the robustness of datasets subjected to disturbances.</p></div>\",\"PeriodicalId\":55184,\"journal\":{\"name\":\"Data & Knowledge Engineering\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":2.7000,\"publicationDate\":\"2024-06-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Data & Knowledge Engineering\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0169023X24000569\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Data & Knowledge Engineering","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0169023X24000569","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

In the realm of natural language processing (NLP), aspect-based sentiment analysis plays a pivotal role. Recently, there has been a growing emphasis on techniques leveraging Graph Convolutional Networks (GCN). However, current approaches face several challenges: (1) Due to the inherent transitivity of GCN, training inevitably entails the acquisition of irrelevant semantic information. (2) Existing methodologies depend heavily on the dependency tree, neglecting the contextual structure of the sentence. (3) Most methods also fail to account for the interactions occurring between different aspects. In this study, we propose a Clause Graph Transformer Structure (CGT) to alleviate these limitations. Specifically, CGT comprises three modules. The preprocessing module extracts aspect clauses from each sentence by bi-directionally traversing the constituent tree, reducing reliance on syntax trees and extracting semantic information from the perspective of clauses. Additionally, we assert that a word's vector direction signifies its underlying attitude in the semantic space, a feature often overlooked in recent research. Without requiring additional parameters, we introduce a Clause Attention encoder (CA-encoder) in the clause module to effectively capture the directed cross-correlation coefficient between the clause and the target aspect. To enhance the representation of the target component, we propose capturing the connections between various aspects. In the inter-aspect module, we design a Balanced Attention encoder (BA-encoder) that forms an aspect sequence by navigating the extracted phrase tree. To effectively capture the sentiment of implicit components, we introduce a Top-K Attention Graph Convolutional Network (KA-GCN). Our proposed method achieves state-of-the-art (SOTA) performance in experiments on four widely used datasets. Furthermore, our model shows significantly improved robustness on datasets subjected to perturbations.
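
The abstract above is the only technical description available on this page. The following PyTorch listing is a minimal, self-contained sketch, under stated assumptions, of two of the ideas it names: a parameter-free, direction-based (cosine) attention between clause token vectors and the aspect vector, in the spirit of the CA-encoder, and a graph-convolution layer whose attention adjacency keeps only each node's top-K edges, in the spirit of the KA-GCN. The class names (DirectionAttention, TopKAttentionGCNLayer), tensor shapes, and pooling choices are illustrative assumptions, not the authors' implementation.

    # Illustrative sketch only; not the paper's code. Shapes and module names are assumed.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F


    class DirectionAttention(nn.Module):
        """Parameter-free attention: cosine similarity between each clause token
        vector and the mean-pooled aspect vector, treating vector direction as
        the carrier of sentiment attitude (as the abstract suggests)."""

        def forward(self, clause: torch.Tensor, aspect: torch.Tensor) -> torch.Tensor:
            # clause: (batch, clause_len, dim); aspect: (batch, aspect_len, dim)
            aspect_vec = aspect.mean(dim=1, keepdim=True)              # (batch, 1, dim)
            scores = F.cosine_similarity(clause, aspect_vec, dim=-1)   # (batch, clause_len)
            weights = F.softmax(scores, dim=-1).unsqueeze(-1)          # (batch, clause_len, 1)
            return (weights * clause).sum(dim=1)                       # (batch, dim)


    class TopKAttentionGCNLayer(nn.Module):
        """One GCN-style layer over an attention graph that is sparsified by
        keeping, for every node, only its K strongest attention edges."""

        def __init__(self, dim: int, k: int = 3):
            super().__init__()
            self.k = k
            self.query = nn.Linear(dim, dim)
            self.key = nn.Linear(dim, dim)
            self.weight = nn.Linear(dim, dim)

        def forward(self, nodes: torch.Tensor) -> torch.Tensor:
            # nodes: (batch, num_nodes, dim)
            q, k_mat = self.query(nodes), self.key(nodes)
            scores = torch.matmul(q, k_mat.transpose(-1, -2)) / nodes.size(-1) ** 0.5
            # Keep only the top-K scores per row; mask the rest before softmax.
            topk_vals, _ = scores.topk(self.k, dim=-1)
            threshold = topk_vals[..., -1:].expand_as(scores)
            masked = scores.masked_fill(scores < threshold, float("-inf"))
            adj = F.softmax(masked, dim=-1)                            # sparse attention adjacency
            return F.relu(self.weight(torch.matmul(adj, nodes)))       # GCN-style update


    if __name__ == "__main__":
        clause = torch.randn(2, 10, 64)   # toy clause token embeddings
        aspect = torch.randn(2, 2, 64)    # toy aspect token embeddings
        pooled = DirectionAttention()(clause, aspect)
        updated = TopKAttentionGCNLayer(dim=64, k=3)(clause)
        print(pooled.shape, updated.shape)  # (2, 64) and (2, 10, 64)

Sparsifying the attention graph to each node's top-K edges is one common way to limit the propagation of irrelevant semantic information that the abstract attributes to GCN transitivity; the paper's actual formulation may differ.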

Source journal
Data & Knowledge Engineering (Engineering & Technology / Computer Science: Artificial Intelligence)
CiteScore: 5.00
Self-citation rate: 0.00%
Articles per year: 66
Review time: 6 months
Journal description: Data & Knowledge Engineering (DKE) stimulates the exchange of ideas and interaction between these two related fields of interest. DKE reaches a world-wide audience of researchers, designers, managers and users. The major aim of the journal is to identify, investigate and analyze the underlying principles in the design and effective use of these systems.