Authors: Hao Liang, Xiaopeng Cao, Kaili Wang
DOI: 10.1145/3573942.3573967
Published in: Proceedings of the 2022 5th International Conference on Artificial Intelligence and Pattern Recognition, September 23, 2022
LGLFF: A Lightweight Aspect-Level Sentiment Analysis Model Based on Local-Global Features
Aspect-level sentiment analysis depends heavily on local context, yet most models focus on global context and external semantic knowledge, which increases training time. We propose LGLFF (Lightweight Global and Local Feature Fusion). First, LGLFF uses a DistilRoBERTa pretrained model to encode the global context. Second, an SRU++ (Simple Recurrent Unit) network extracts global features. Next, we tune the SRD (Semantic-Relative Distance) threshold per dataset and use SRD to mask the global context, yielding the local context. Finally, a multi-head attention mechanism learns the global and local context features. Experiments on three datasets (Twitter, Laptop, and Restaurant) show that our model outperforms other benchmark models.
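The SRD masking step described in the abstract can be sketched as follows. This is a hypothetical illustration (function name and details are assumptions, following the common LCF-style definition of Semantic-Relative Distance as token distance to the aspect span), not the authors' code: tokens whose SRD to the aspect term exceeds the dataset-specific threshold are zeroed out, and the surviving tokens form the local context.

```python
def srd_mask(seq_len, aspect_start, aspect_len, threshold):
    """Return a 0/1 mask keeping tokens whose SRD to the aspect <= threshold.

    SRD of token i is taken as its distance to the aspect span:
    |i - center| - aspect_len // 2, where center is the middle of the span.
    (Assumed definition, in the style of LCF-based models.)
    """
    center = aspect_start + aspect_len // 2
    mask = []
    for i in range(seq_len):
        srd = abs(i - center) - aspect_len // 2  # distance to aspect span
        mask.append(1 if srd <= threshold else 0)
    return mask

# Example: 10 tokens, aspect occupying positions 4-5, threshold 2.
# Only tokens within SRD 2 of the aspect remain unmasked.
print(srd_mask(10, 4, 2, 2))  # -> [0, 0, 1, 1, 1, 1, 1, 1, 1, 0]
```

Multiplying the encoder's global-context features by this mask (broadcast over the hidden dimension) would produce the local-context features that are then fused with the global ones via multi-head attention.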