Improving Word Embedding Using Variational Dropout

Zainab Albujasim, D. Inkpen, Xuejun Han, Yuhong Guo
{"title":"Improving Word Embedding Using Variational Dropout","authors":"Zainab Albujasim, D. Inkpen, Xuejun Han, Yuhong Guo","doi":"10.32473/flairs.36.133326","DOIUrl":null,"url":null,"abstract":"Pre-trained word embeddings are essential in natural language processing (NLP). In recent years, many post-processing algorithms have been proposed to improve the pre-trained word embeddings. We present a novel method - Orthogonal Auto Encoder with Variational Dropout (OAEVD) for improving word embeddings based on orthogonal autoencoders and variational dropout.  Specifically, the orthogonality constraint encourages more diversity in the latent space and increases semantic similarities between similar words, and variational dropout makes it more robust to overfitting.   Empirical evaluation on a range of downstream NLP tasks, including semantic similarity, text classification, and concept categorization shows that our proposed method effectively improves the quality of pre-trained word embeddings. Moreover, the proposed method successfully reduces the dimensionality of pre-trained word embeddings while maintaining high performance.","PeriodicalId":302103,"journal":{"name":"The International FLAIRS Conference Proceedings","volume":"63 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-05-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"The International FLAIRS Conference Proceedings","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.32473/flairs.36.133326","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Pre-trained word embeddings are essential in natural language processing (NLP). In recent years, many post-processing algorithms have been proposed to improve pre-trained word embeddings. We present a novel method, Orthogonal Auto Encoder with Variational Dropout (OAEVD), for improving word embeddings, based on orthogonal autoencoders and variational dropout. Specifically, the orthogonality constraint encourages more diversity in the latent space and increases the semantic similarity between similar words, while variational dropout makes the model more robust to overfitting. Empirical evaluation on a range of downstream NLP tasks, including semantic similarity, text classification, and concept categorization, shows that the proposed method effectively improves the quality of pre-trained word embeddings. Moreover, it successfully reduces the dimensionality of pre-trained word embeddings while maintaining high performance.
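
To make the described architecture concrete, here is a minimal sketch in PyTorch of an autoencoder combining the two ingredients the abstract names: a soft orthogonality penalty on the encoder weights and Gaussian variational dropout on the latent code. This is not the authors' implementation; the layer sizes, loss weights, and the use of the Molchanov et al. (2017) KL approximation for variational dropout are all assumptions made for illustration.

```python
# Illustrative sketch (not the paper's code): orthogonal autoencoder with
# Gaussian variational dropout for post-processing pre-trained embeddings.
import torch
import torch.nn as nn
import torch.nn.functional as F

class VariationalDropout(nn.Module):
    """Multiplicative Gaussian noise with a learnable dropout rate alpha."""
    def __init__(self, size):
        super().__init__()
        # log_alpha initialized low, i.e. little noise at the start (assumption).
        self.log_alpha = nn.Parameter(torch.full((size,), -3.0))

    def forward(self, x):
        if not self.training:
            return x
        alpha = self.log_alpha.exp()
        noise = 1.0 + alpha.sqrt() * torch.randn_like(x)  # eps ~ N(1, alpha)
        return x * noise

    def kl(self):
        # Approximate KL(q || prior) from Molchanov et al. (2017) (assumption:
        # the paper may use a different variational dropout formulation).
        k1, k2, k3 = 0.63576, 1.8732, 1.48695
        neg_kl = (k1 * torch.sigmoid(k2 + k3 * self.log_alpha)
                  - 0.5 * F.softplus(-self.log_alpha) - k1)
        return -neg_kl.sum()

class OrthogonalAE(nn.Module):
    def __init__(self, dim_in=300, dim_latent=150):  # hypothetical sizes
        super().__init__()
        self.encoder = nn.Linear(dim_in, dim_latent, bias=False)
        self.vdrop = VariationalDropout(dim_latent)
        self.decoder = nn.Linear(dim_latent, dim_in, bias=False)

    def forward(self, x):
        z = self.vdrop(self.encoder(x))
        return self.decoder(z), z

    def orth_penalty(self):
        # Soft orthogonality constraint: push W W^T toward the identity,
        # encouraging decorrelated (more diverse) latent dimensions.
        W = self.encoder.weight                       # (dim_latent, dim_in)
        gram = W @ W.t()
        eye = torch.eye(gram.size(0), device=W.device)
        return ((gram - eye) ** 2).sum()

def train_step(model, optimizer, emb, lambda_orth=0.1, lambda_kl=1e-3):
    """One step on a batch of pre-trained embeddings `emb` of shape
    (batch, dim_in); lambda_orth and lambda_kl are hypothetical weights."""
    optimizer.zero_grad()
    recon, _ = model(emb)
    loss = (F.mse_loss(recon, emb)
            + lambda_orth * model.orth_penalty()
            + lambda_kl * model.vdrop.kl())
    loss.backward()
    optimizer.step()
    return loss.item()
```

After training, the latent code `z` (here 150-dimensional) would serve as the reduced-dimensionality embedding, which matches the abstract's claim that the method compresses pre-trained embeddings while maintaining downstream performance.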