LGLFF: A Lightweight Aspect-Level Sentiment Analysis Model Based on Local-Global Features

Hao Liang, Xiaopeng Cao, Kaili Wang
{"title":"LGLFF:基于局部-全局特征的轻量级方面级情感分析模型","authors":"Hao Liang, Xiaopeng Cao, Kaili Wang","doi":"10.1145/3573942.3573967","DOIUrl":null,"url":null,"abstract":"Aspect-level sentiment analysis is highly dependent on local context. However, most models are overly concerned with global context and external semantic knowledge. This approach increases the training time of the models. We propose the LGLFF (Lightweight Global and Local Feature Fusion) model. Firstly, we introduce a Distilroberta pretrained model in the LGLFF to encode the global context. Secondly, we use the SRU++ (Simple Recurrent Unit) network to extract global features. Then we adjust the SRD (Semantic-Relative Distance) threshold size by different datasets, and use SRD to mask the global context to get the local context. Finally, we use the multi-head attention mechanism to learn the global and local context features. We do some experiments on three datasets: Twitter, Laptop, and Restaurant. The results show that our model performs better than other benchmark models.","PeriodicalId":103293,"journal":{"name":"Proceedings of the 2022 5th International Conference on Artificial Intelligence and Pattern Recognition","volume":"183 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"LGLFF: A Lightweight Aspect-Level Sentiment Analysis Model Based on Local-Global Features\",\"authors\":\"Hao Liang, Xiaopeng Cao, Kaili Wang\",\"doi\":\"10.1145/3573942.3573967\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Aspect-level sentiment analysis is highly dependent on local context. However, most models are overly concerned with global context and external semantic knowledge. This approach increases the training time of the models. We propose the LGLFF (Lightweight Global and Local Feature Fusion) model. Firstly, we introduce a Distilroberta pretrained model in the LGLFF to encode the global context. Secondly, we use the SRU++ (Simple Recurrent Unit) network to extract global features. Then we adjust the SRD (Semantic-Relative Distance) threshold size by different datasets, and use SRD to mask the global context to get the local context. Finally, we use the multi-head attention mechanism to learn the global and local context features. We do some experiments on three datasets: Twitter, Laptop, and Restaurant. 
The results show that our model performs better than other benchmark models.\",\"PeriodicalId\":103293,\"journal\":{\"name\":\"Proceedings of the 2022 5th International Conference on Artificial Intelligence and Pattern Recognition\",\"volume\":\"183 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-09-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 2022 5th International Conference on Artificial Intelligence and Pattern Recognition\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3573942.3573967\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2022 5th International Conference on Artificial Intelligence and Pattern Recognition","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3573942.3573967","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Aspect-level sentiment analysis depends heavily on local context. However, most models are overly concerned with the global context and external semantic knowledge, which increases training time. We propose the LGLFF (Lightweight Global and Local Feature Fusion) model. First, we introduce a DistilRoBERTa pretrained model in LGLFF to encode the global context. Second, we use the SRU++ (Simple Recurrent Unit) network to extract global features. Then we adjust the SRD (Semantic-Relative Distance) threshold for each dataset and use SRD to mask the global context and obtain the local context. Finally, we use a multi-head attention mechanism to learn the global and local context features. We conduct experiments on three datasets: Twitter, Laptop, and Restaurant. The results show that our model performs better than other benchmark models.
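
The SRD masking step described in the abstract can be illustrated with a short sketch. This is a minimal, hypothetical example of how a semantic-relative-distance mask over encoder outputs might look; the function name srd_local_mask, the PyTorch framing, and the choice to measure distance to the nearest aspect-term token are illustrative assumptions, not the authors' implementation.

```python
import torch

def srd_local_mask(hidden_states, aspect_positions, srd_threshold):
    """Zero out tokens whose semantic-relative distance (SRD) from the
    aspect term exceeds the threshold, keeping only the local context.

    hidden_states:    (seq_len, hidden_dim) encoder outputs for one sentence
    aspect_positions: token indices of the aspect term in the sequence
    srd_threshold:    dataset-dependent cutoff (tuned per dataset in the paper)
    """
    seq_len = hidden_states.size(0)
    positions = torch.arange(seq_len)
    aspect = torch.tensor(aspect_positions)
    # distance from each token to the nearest token of the aspect term
    srd = (positions.unsqueeze(1) - aspect.unsqueeze(0)).abs().min(dim=1).values
    # keep tokens inside the local window, zero out the rest
    keep = (srd <= srd_threshold).to(hidden_states.dtype).unsqueeze(1)
    return hidden_states * keep

# Toy usage: 10 tokens, aspect term at positions 4-5, SRD threshold of 3
hidden = torch.randn(10, 768)
local = srd_local_mask(hidden, aspect_positions=[4, 5], srd_threshold=3)
```

In this reading, tokens whose SRD exceeds the dataset-tuned threshold are zeroed out, so the subsequent multi-head attention attends only to the local window around the aspect term when fusing local and global features.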