{"title":"WRGAT-PTBERT:基于方面情感分析的后训练 BERT 加权关系图注意网络","authors":"Sharad Verma, Ashish Kumar, Aditi Sharan","doi":"10.1007/s10489-024-06011-x","DOIUrl":null,"url":null,"abstract":"<div><p>Aspect-based sentiment analysis (ABSA) focused on forecasting the sentiment orientation of a given aspect target within the input. Existing methods employ neural networks and attention mechanisms to encode input and discern aspect-context relationships. Bidirectional Encoder Representation from Transformer(BERT) has become the standard contextual encoding method in the textual domain. Researchers have ventured into utilizing graph attention networks(GAT) to incorporate syntactic information into the task, yielding cutting-edge results. However, current approaches overlook the potential advantages of considering word dependency relations. This work proposes a hybrid model combining contextual information obtained from a post-trained BERT with syntactic information from a relational GAT (RGAT) for the ABSA task. Our approach leverages dependency relation information effectively to improve ABSA performance in terms of accuracy and F1-score, as demonstrated through experiments on SemEval-14 Restaurant and Laptop, MAMS, and ACL-14 Twitter datasets.</p></div>","PeriodicalId":8041,"journal":{"name":"Applied Intelligence","volume":"55 2","pages":""},"PeriodicalIF":3.4000,"publicationDate":"2024-12-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"WRGAT-PTBERT: weighted relational graph attention network over post-trained BERT for aspect based sentiment analysis\",\"authors\":\"Sharad Verma, Ashish Kumar, Aditi Sharan\",\"doi\":\"10.1007/s10489-024-06011-x\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>Aspect-based sentiment analysis (ABSA) focused on forecasting the sentiment orientation of a given aspect target within the input. 
Existing methods employ neural networks and attention mechanisms to encode input and discern aspect-context relationships. Bidirectional Encoder Representation from Transformer(BERT) has become the standard contextual encoding method in the textual domain. Researchers have ventured into utilizing graph attention networks(GAT) to incorporate syntactic information into the task, yielding cutting-edge results. However, current approaches overlook the potential advantages of considering word dependency relations. This work proposes a hybrid model combining contextual information obtained from a post-trained BERT with syntactic information from a relational GAT (RGAT) for the ABSA task. Our approach leverages dependency relation information effectively to improve ABSA performance in terms of accuracy and F1-score, as demonstrated through experiments on SemEval-14 Restaurant and Laptop, MAMS, and ACL-14 Twitter datasets.</p></div>\",\"PeriodicalId\":8041,\"journal\":{\"name\":\"Applied Intelligence\",\"volume\":\"55 2\",\"pages\":\"\"},\"PeriodicalIF\":3.4000,\"publicationDate\":\"2024-12-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Applied Intelligence\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://link.springer.com/article/10.1007/s10489-024-06011-x\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Applied 
Intelligence","FirstCategoryId":"94","ListUrlMain":"https://link.springer.com/article/10.1007/s10489-024-06011-x","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
WRGAT-PTBERT: weighted relational graph attention network over post-trained BERT for aspect based sentiment analysis
Aspect-based sentiment analysis (ABSA) focuses on predicting the sentiment polarity of a given aspect target within the input text. Existing methods employ neural networks and attention mechanisms to encode the input and capture aspect-context relationships. Bidirectional Encoder Representations from Transformers (BERT) has become the standard contextual encoding method in the textual domain. Researchers have applied graph attention networks (GAT) to incorporate syntactic information into the task, yielding state-of-the-art results. However, current approaches overlook the potential advantages of modeling word dependency relations. This work proposes a hybrid model that combines contextual information from a post-trained BERT with syntactic information from a relational GAT (RGAT) for the ABSA task. Our approach leverages dependency relation information effectively to improve ABSA accuracy and F1-score, as demonstrated through experiments on the SemEval-14 Restaurant and Laptop, MAMS, and ACL-14 Twitter datasets.
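The abstract describes attention over a sentence's dependency graph in which each dependency relation type contributes its own weight. As a rough illustration only (not the authors' implementation, whose details are not given here), the following NumPy sketch shows one simplified weighted relational graph attention step over token embeddings; the function names, the scalar per-relation weighting, and the attention form are all assumptions.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def weighted_rgat_layer(h, edges, rel_w, W, a):
    """One simplified weighted relational graph attention step.

    h:      (n, d) node features (e.g. contextual token embeddings from BERT)
    edges:  list of (src, dst, rel) dependency arcs, rel a relation label
    rel_w:  dict mapping relation label -> scalar weight (assumed form)
    W:      (d, d) shared projection matrix
    a:      (2d,) attention vector
    """
    n, d = h.shape
    z = h @ W                      # project all node features
    out = np.zeros_like(z)
    for i in range(n):
        # incoming dependency arcs for node i
        nbrs = [(src, rel) for (src, dst, rel) in edges if dst == i]
        if not nbrs:
            out[i] = z[i]          # no arcs: keep projected features
            continue
        # relation-weighted attention score per neighbor
        scores = np.array([
            rel_w.get(rel, 1.0) * np.tanh(a @ np.concatenate([z[i], z[j]]))
            for j, rel in nbrs
        ])
        alpha = softmax(scores)    # normalize over neighbors
        out[i] = sum(al * z[j] for al, (j, _) in zip(alpha, nbrs))
    return out
```

For example, with arcs (1, 0, "nsubj") and (2, 0, "dobj"), node 0 aggregates its two dependents, with the "nsubj" arc's score scaled by its relation weight, while nodes 1 and 2 (no incoming arcs) keep their projected features.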
Journal Introduction:
With a focus on research in artificial intelligence and neural networks, this journal addresses real-life manufacturing, defense, management, government, and industrial problems that are too complex to be solved through conventional approaches and that require the simulation of intelligent thought processes, heuristics, applications of knowledge, and distributed and parallel processing. The integration of these multiple approaches in solving complex problems is of particular importance.
The journal presents new and original research and technological developments, addressing real and complex issues applicable to difficult problems. It provides a medium for exchanging scientific research and technological achievements accomplished by the international community.