Junsen Fu , Xianyong Li , Yihong Zhu , Yajun Du , Yongquan Fan , Xiaoliang Chen , Dong Huang , Shumin Wang
{"title":"使用监督对比学习和知识嵌入的基于隐性方面的情感分析方法","authors":"Junsen Fu , Xianyong Li , Yihong Zhu , Yajun Du , Yongquan Fan , Xiaoliang Chen , Dong Huang , Shumin Wang","doi":"10.1016/j.asoc.2024.112233","DOIUrl":null,"url":null,"abstract":"<div><p>Aspect-based sentiment analysis aims to analyze and understand people’s opinions from different aspects. Some comments do not contain explicit opinion words but still convey a clear human-perceived emotional orientation, which is known as implicit sentiment. Most previous research relies on contextual information from a text for implicit aspect-based sentiment analysis. However, little work has integrated external knowledge with contextual information. This paper proposes an implicit aspect-based sentiment analysis model combining supervised contrastive learning with knowledge-enhanced fine-tuning on BERT (BERT-SCL+KEFT). In the pre-training phase, the model utilizes supervised contrastive learning (SCL) on large-scale sentiment-annotated corpora to acquire sentiment knowledge. In the fine-tuning phase, the model uses a knowledge-enhanced fine-tuning (KEFT) method to capture explicit and implicit aspect-based sentiments. Specifically, the model utilizes knowledge embedding to embed external general knowledge information into textual entities by using knowledge graphs, enriching textual information. Finally, the model combines external knowledge and contextual features to predict the implicit sentiment in a text. The experimental results demonstrate that the proposed BERT-SCL+KEFT model outperforms other baselines on the general implicit sentiment analysis and implicit aspect-based sentiment analysis tasks. In addition, ablation experimental results show that the proposed BERT-SCL+KEFT model without the knowledge embedding module or supervised contrastive learning module significantly decreases performance, indicating the importance of these modules. All experiments validate that the proposed BERT-SCL+KEFT model effectively achieves implicit aspect-based sentiment classification.</p></div>","PeriodicalId":50737,"journal":{"name":"Applied Soft Computing","volume":null,"pages":null},"PeriodicalIF":7.2000,"publicationDate":"2024-09-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"An implicit aspect-based sentiment analysis method using supervised contrastive learning and knowledge embedding\",\"authors\":\"Junsen Fu , Xianyong Li , Yihong Zhu , Yajun Du , Yongquan Fan , Xiaoliang Chen , Dong Huang , Shumin Wang\",\"doi\":\"10.1016/j.asoc.2024.112233\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>Aspect-based sentiment analysis aims to analyze and understand people’s opinions from different aspects. Some comments do not contain explicit opinion words but still convey a clear human-perceived emotional orientation, which is known as implicit sentiment. Most previous research relies on contextual information from a text for implicit aspect-based sentiment analysis. However, little work has integrated external knowledge with contextual information. This paper proposes an implicit aspect-based sentiment analysis model combining supervised contrastive learning with knowledge-enhanced fine-tuning on BERT (BERT-SCL+KEFT). In the pre-training phase, the model utilizes supervised contrastive learning (SCL) on large-scale sentiment-annotated corpora to acquire sentiment knowledge. 
In the fine-tuning phase, the model uses a knowledge-enhanced fine-tuning (KEFT) method to capture explicit and implicit aspect-based sentiments. Specifically, the model utilizes knowledge embedding to embed external general knowledge information into textual entities by using knowledge graphs, enriching textual information. Finally, the model combines external knowledge and contextual features to predict the implicit sentiment in a text. The experimental results demonstrate that the proposed BERT-SCL+KEFT model outperforms other baselines on the general implicit sentiment analysis and implicit aspect-based sentiment analysis tasks. In addition, ablation experimental results show that the proposed BERT-SCL+KEFT model without the knowledge embedding module or supervised contrastive learning module significantly decreases performance, indicating the importance of these modules. All experiments validate that the proposed BERT-SCL+KEFT model effectively achieves implicit aspect-based sentiment classification.</p></div>\",\"PeriodicalId\":50737,\"journal\":{\"name\":\"Applied Soft Computing\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":7.2000,\"publicationDate\":\"2024-09-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Applied Soft Computing\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S156849462401007X\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Applied Soft Computing","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S156849462401007X","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
An implicit aspect-based sentiment analysis method using supervised contrastive learning and knowledge embedding
Aspect-based sentiment analysis aims to analyze and understand people’s opinions from different aspects. Some comments do not contain explicit opinion words but still convey a clear human-perceived emotional orientation, which is known as implicit sentiment. Most previous research relies on contextual information from a text for implicit aspect-based sentiment analysis. However, little work has integrated external knowledge with contextual information. This paper proposes an implicit aspect-based sentiment analysis model combining supervised contrastive learning with knowledge-enhanced fine-tuning on BERT (BERT-SCL+KEFT). In the pre-training phase, the model applies supervised contrastive learning (SCL) to large-scale sentiment-annotated corpora to acquire sentiment knowledge. In the fine-tuning phase, the model uses a knowledge-enhanced fine-tuning (KEFT) method to capture explicit and implicit aspect-based sentiments. Specifically, the model uses knowledge graphs to embed external general-knowledge information into textual entities, enriching the textual representation. Finally, the model combines external knowledge and contextual features to predict the implicit sentiment in a text. The experimental results demonstrate that the proposed BERT-SCL+KEFT model outperforms other baselines on both general implicit sentiment analysis and implicit aspect-based sentiment analysis tasks. In addition, ablation results show that removing either the knowledge embedding module or the supervised contrastive learning module from BERT-SCL+KEFT significantly decreases performance, indicating the importance of both modules. All experiments validate that the proposed BERT-SCL+KEFT model effectively achieves implicit aspect-based sentiment classification.
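To make the pre-training phase more concrete, the following is a minimal PyTorch sketch of a supervised contrastive (SCL) loss over sentiment-labelled sentence embeddings, such as BERT [CLS] vectors. The function name, temperature value, and toy batch are illustrative assumptions, not the paper's exact implementation.

```python
# Minimal sketch of a supervised contrastive (SCL) loss, assuming sentence
# embeddings (e.g. BERT [CLS] vectors) paired with sentiment labels.
import torch
import torch.nn.functional as F


def supervised_contrastive_loss(embeddings: torch.Tensor,
                                labels: torch.Tensor,
                                temperature: float = 0.1) -> torch.Tensor:
    """Pull together embeddings that share a sentiment label, push apart the rest."""
    z = F.normalize(embeddings, dim=1)                 # unit-norm embeddings
    sim = torch.matmul(z, z.T) / temperature           # pairwise scaled similarities

    # Exclude self-similarity on the diagonal.
    batch = z.size(0)
    self_mask = torch.eye(batch, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, float("-inf"))

    # Positive pairs share the same sentiment label (excluding the anchor itself).
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask

    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)  # per-anchor log-softmax
    log_prob = log_prob.masked_fill(self_mask, 0.0)             # drop -inf on the diagonal
    pos_counts = pos_mask.sum(dim=1).clamp(min=1)               # avoid division by zero
    loss = -(log_prob * pos_mask).sum(dim=1) / pos_counts
    return loss.mean()


if __name__ == "__main__":
    # Toy batch: 8 sentence embeddings with sentiment labels (0/1/2).
    embeddings = torch.randn(8, 768)
    labels = torch.tensor([0, 1, 0, 2, 1, 0, 2, 1])
    print(supervised_contrastive_loss(embeddings, labels).item())
```

In this formulation, each anchor sentence is drawn toward other sentences with the same sentiment label and pushed away from the rest, which is how the pre-training stage is described as instilling sentiment knowledge before the knowledge-enhanced fine-tuning step.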
Journal introduction:
Applied Soft Computing is an international journal promoting an integrated view of soft computing to solve real-life problems. Its focus is to publish the highest-quality research on the application and convergence of Fuzzy Logic, Neural Networks, Evolutionary Computing, Rough Sets, and other similar techniques to address real-world complexities.
Applied Soft Computing is a rolling publication: articles are published as soon as the editor-in-chief has accepted them. The website is therefore continuously updated with new articles, and publication times are short.