Effect of word embedding vector dimensionality on sentiment analysis through short and long texts

Q2 Decision Sciences
Mohamed Chiny, Marouane Chihab, Abdelkarim Ait Lahcen, Omar Bencharef, Younes Chihab
{"title":"Effect of word embedding vector dimensionality on sentiment analysis through short and long texts","authors":"Mohamed Chiny, Marouane Chihab, Abdelkarim Ait Lahcen, Omar Bencharef, Younes Chihab","doi":"10.11591/ijai.v12.i2.pp823-830","DOIUrl":null,"url":null,"abstract":"<span lang=\"EN-US\">Word embedding has become the most popular method of lexical description in a given context in the natural language processing domain, especially through the word to vector (Word2Vec) and global vectors (GloVe) implementations. Since GloVe is a pre-trained model that provides access to word mapping vectors on many dimensionalities, a large number of applications rely on its prowess, especially in the field of sentiment analysis. However, in the literature, we found that in many cases, GloVe is implemented with arbitrary dimensionalities (often 300d) regardless of the length of the text to be analyzed. In this work, we conducted a study that identifies the effect of the dimensionality of word embedding mapping vectors on short and long texts in a sentiment analysis context. The results suggest that as the dimensionality of the vectors increases, the performance metrics of the model also increase for long texts. 
In contrast, for short texts, we recorded a threshold at which dimensionality does not matter.</span>","PeriodicalId":52221,"journal":{"name":"IAES International Journal of Artificial Intelligence","volume":"12 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IAES International Journal of Artificial Intelligence","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.11591/ijai.v12.i2.pp823-830","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"Decision Sciences","Score":null,"Total":0}
Citations: 0

Abstract

Word embedding has become the most popular method of lexical description in a given context in the natural language processing domain, especially through the word to vector (Word2Vec) and global vectors (GloVe) implementations. Since GloVe is a pre-trained model that provides word-mapping vectors at several dimensionalities, a large number of applications rely on it, especially in the field of sentiment analysis. However, in the literature we found that GloVe is often implemented with an arbitrary dimensionality (typically 300d), regardless of the length of the text to be analyzed. In this work, we conducted a study that identifies the effect of the dimensionality of word-embedding mapping vectors on short and long texts in a sentiment analysis context. The results suggest that, for long texts, the model's performance metrics increase as vector dimensionality increases. In contrast, for short texts we recorded a threshold beyond which additional dimensionality no longer matters.
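To illustrate the setup the abstract describes, here is a minimal sketch of how a text is turned into a fixed-size feature vector by averaging word embeddings of a chosen dimensionality. Note this uses a toy randomly-initialized lookup table as a stand-in for the real pre-trained GloVe files (e.g. glove.6B.50d.txt through glove.6B.300d.txt); the file names, vocabulary, and averaging strategy are illustrative assumptions, not the paper's exact pipeline.

```python
import numpy as np

def load_toy_embeddings(dim, vocab):
    """Toy stand-in for a GloVe lookup table: maps each word in the
    vocabulary to a dense vector of the chosen dimensionality."""
    rng = np.random.default_rng(0)
    return {word: rng.standard_normal(dim) for word in vocab}

def text_to_vector(text, embeddings, dim):
    """Average the embeddings of known words -- a common way to turn a
    variable-length text into a fixed-size feature vector for a classifier."""
    vectors = [embeddings[w] for w in text.lower().split() if w in embeddings]
    if not vectors:
        return np.zeros(dim)  # no known words: fall back to the zero vector
    return np.mean(vectors, axis=0)

vocab = {"the", "movie", "was", "great"}
# 50d, 100d, and 300d mirror the dimensionalities offered by pre-trained GloVe.
for dim in (50, 100, 300):
    emb = load_toy_embeddings(dim, vocab)
    vec = text_to_vector("The movie was great", emb, dim)
    print(dim, vec.shape)  # the feature size grows with embedding dimensionality
```

The study's question then amounts to whether growing `dim` keeps improving a downstream sentiment classifier, which the abstract reports holds for long texts but plateaus for short ones.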
Source journal: IAES International Journal of Artificial Intelligence (Decision Sciences - Information Systems and Management)
CiteScore: 3.90
Self-citation rate: 0.00%
Articles per year: 170