{"title":"SCoEmbeddings","authors":"Hui Huang, Yueyuan Jin, Ruonan Rao","doi":"10.1145/3387902.3394948","DOIUrl":null,"url":null,"abstract":"Contextualized word representations such as ELMo embeddings, can capture rich semantic information and achieve impressive performance in a wide variety of NLP tasks. However, as problems found in Word2Vec and GloVe, we found that ELMo word embeddings also lack enough sentiment information, which may affect sentiment classification performance. Inspired by previous embedding refinement method with sentiment lexicon, we propose an approach that combines contextualized embeddings (ELMo) of the pre-trained model with sentiment information of lexicon to generate sentiment-contextualized embeddings, called SCoEmbeddings. Experimental results show that our SCoEmbeddings achieve higher accuracy than ELMo embeddings, Word2Vec embeddings, and refined Word2Vec embeddings on the SST-5 dataset. Meanwhile, we also visualize embeddings and weights of SCoEmbeddings, demonstrating the effectiveness of our SCoEmbeddings.","PeriodicalId":155089,"journal":{"name":"Proceedings of the 17th ACM International Conference on Computing Frontiers","volume":"23 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-05-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 17th ACM International Conference on Computing Frontiers","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3387902.3394948","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Contextualized word representations, such as ELMo embeddings, capture rich semantic information and achieve impressive performance on a wide variety of NLP tasks. However, as with Word2Vec and GloVe, we find that ELMo word embeddings lack sufficient sentiment information, which may hurt sentiment classification performance. Inspired by previous embedding refinement methods based on sentiment lexicons, we propose an approach that combines contextualized embeddings from a pre-trained ELMo model with sentiment information from a lexicon to generate sentiment-contextualized embeddings, called SCoEmbeddings. Experimental results show that SCoEmbeddings achieve higher accuracy than ELMo embeddings, Word2Vec embeddings, and refined Word2Vec embeddings on the SST-5 dataset. We also visualize the embeddings and weights of SCoEmbeddings, further demonstrating their effectiveness.
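The abstract does not specify how the contextual vectors and the lexicon signal are fused, so the following is only a minimal sketch of the general idea, assuming a simple per-token concatenation of an ELMo-style vector with a lexicon polarity score. The names `sentiment_contextual_embeddings` and `sentiment_lexicon` are hypothetical placeholders, not the paper's implementation.

```python
# Illustrative sketch only: fuses contextual embeddings with lexicon sentiment scores
# by concatenation. The actual SCoEmbeddings fusion is not described in the abstract.
import numpy as np

# Hypothetical sentiment lexicon: token -> polarity score in [-1, 1]
sentiment_lexicon = {"great": 0.9, "terrible": -0.8, "movie": 0.0}

def sentiment_contextual_embeddings(tokens, contextual_vectors, lexicon, default=0.0):
    """Append a lexicon-derived sentiment score to each token's contextual vector.

    tokens:             list of str, length T
    contextual_vectors: np.ndarray of shape (T, D), e.g. ELMo outputs
    returns:            np.ndarray of shape (T, D + 1)
    """
    scores = np.array([[lexicon.get(t.lower(), default)] for t in tokens],
                      dtype=contextual_vectors.dtype)
    return np.concatenate([contextual_vectors, scores], axis=1)

# Usage with dummy vectors standing in for ELMo outputs (ELMo's default output size is 1024).
tokens = ["great", "movie"]
dummy_elmo = np.random.randn(len(tokens), 1024).astype(np.float32)
enriched = sentiment_contextual_embeddings(tokens, dummy_elmo, sentiment_lexicon)
print(enriched.shape)  # (2, 1025)
```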