Graph Convolutional Networks with Dependency Parser towards Multiview Representation Learning for Sentiment Analysis

Minqiang Yang, Xinqi Liu, Chengsheng Mao, Bin Hu
{"title":"Graph Convolutional Networks with Dependency Parser towards Multiview Representation Learning for Sentiment Analysis","authors":"Minqiang Yang, Xinqi Liu, Chengsheng Mao, Bin Hu","doi":"10.1109/ICDMW58026.2022.00070","DOIUrl":null,"url":null,"abstract":"Sentiment analysis has become increasingly important in natural language processing (NLP). Recent efforts have been devoted to the graph convolutional network (GCN) due to its advantages in handling the complex information. However, the improvement of GCN in NLP is hindered because the pretrained word vectors do not fit well in various contexts and the traditional edge building methods are not suited well for the long and complex context. To address these problems, we propose the LSTM-GCN model to contextualize the pretrained word vectors and extract the sentiment representations from the complex texts. Particularly, LSTM-GCN captures the sentiment feature representations from multiple different perspectives including context and syntax. In addition to extracting contextual representation from pretrained word vectors, we utilize the dependency parser to analyse the dependency correlation between each word to extract the syntax representation. For each text, we build a graph with each word in the text as a node. Besides the edges between the neighboring words, we also connect the nodes with dependency correlation to capture syntax representations. Moreover, we introduce the message passing mechanism (MPM) which allows the nodes to update their representation by extract information from its neighbors. Also, to improve the message passing performance, we set the edges to be trainable and initialize the edge weights with the pointwise mutual information (PMI) method. The results of the experiments show that our LSTM-GCN model outperforms several state-of-the-art models. And extensive experiments validate the rationality and effectiveness of our model.","PeriodicalId":146687,"journal":{"name":"2022 IEEE International Conference on Data Mining Workshops (ICDMW)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2022-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE International Conference on Data Mining Workshops (ICDMW)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICDMW58026.2022.00070","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Sentiment analysis has become increasingly important in natural language processing (NLP). Recent efforts have been devoted to the graph convolutional network (GCN) due to its advantages in handling complex information. However, the improvement of GCNs in NLP is hindered because pretrained word vectors do not adapt well to varying contexts and traditional edge-building methods are poorly suited to long and complex texts. To address these problems, we propose the LSTM-GCN model to contextualize the pretrained word vectors and extract sentiment representations from complex texts. In particular, LSTM-GCN captures sentiment feature representations from multiple perspectives, including context and syntax. In addition to extracting contextual representations from pretrained word vectors, we use a dependency parser to analyse the dependency relations between words and extract syntactic representations. For each text, we build a graph with each word in the text as a node. Besides the edges between neighboring words, we also connect nodes that share a dependency relation to capture syntactic structure. Moreover, we introduce a message passing mechanism (MPM) that allows each node to update its representation by extracting information from its neighbors. To further improve message passing, we make the edges trainable and initialize the edge weights with the pointwise mutual information (PMI) method. Experimental results show that our LSTM-GCN model outperforms several state-of-the-art models, and extensive experiments validate the rationality and effectiveness of our model.
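To make the graph construction described above concrete, the following is a minimal sketch, not the authors' implementation, of how a word-level graph with sequential and dependency edges might be built and passed through an LSTM followed by a single GCN-style message-passing layer. It assumes spaCy for dependency parsing and PyTorch for the model; the class name `LstmGcnSketch`, the hyperparameters, and the graph-level mean pooling are illustrative, and the PMI-based edge-weight initialization is replaced by a placeholder constant.

```python
# Hypothetical sketch (not the paper's code): sequential + dependency edges,
# LSTM-contextualized embeddings, one normalized message-passing step.
import torch
import torch.nn as nn
import spacy

nlp = spacy.load("en_core_web_sm")  # dependency parser


def build_adjacency(sentence: str) -> tuple[list[str], torch.Tensor]:
    """Return tokens and a symmetric adjacency matrix with self-loops.

    Edges: (i) neighboring words, (ii) dependency head <-> child pairs.
    In the paper the weights are trainable and initialized from PMI
    statistics over the corpus; here 1.0 is used as a placeholder.
    """
    doc = nlp(sentence)
    n = len(doc)
    adj = torch.eye(n)                      # self-loops
    for tok in doc:
        if tok.i + 1 < n:                   # sequential edge to next word
            adj[tok.i, tok.i + 1] = adj[tok.i + 1, tok.i] = 1.0
        if tok.head.i != tok.i:             # dependency edge
            adj[tok.i, tok.head.i] = adj[tok.head.i, tok.i] = 1.0
    return [tok.text for tok in doc], adj


class LstmGcnSketch(nn.Module):
    """Minimal LSTM + single GCN layer; dimensions are illustrative."""

    def __init__(self, vocab_size=10000, emb_dim=100, hid_dim=64, n_classes=2):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)   # would load pretrained vectors
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        self.gcn_w = nn.Linear(2 * hid_dim, hid_dim)
        self.out = nn.Linear(hid_dim, n_classes)

    def forward(self, token_ids: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # token_ids: (1, n); adj: (n, n) -- treated as fixed here, trainable in the paper
        h, _ = self.lstm(self.emb(token_ids))          # contextualize word vectors
        h = h.squeeze(0)                               # (n, 2*hid_dim)
        deg = adj.sum(dim=1, keepdim=True)
        h = torch.relu(self.gcn_w(adj @ h / deg))      # degree-normalized message passing
        return self.out(h.mean(dim=0))                 # graph-level sentiment logits


tokens, adj = build_adjacency("The movie was surprisingly good.")
model = LstmGcnSketch()
ids = torch.randint(0, 10000, (1, len(tokens)))        # stand-in for a vocabulary lookup
logits = model(ids, adj)
```

The sketch keeps only the structural ideas of the abstract: dependency edges complement the sequential word order, and message passing aggregates neighbor information into each word's contextualized representation before pooling for classification.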