Improving Word Embedding Using Variational Dropout
Zainab Albujasim, D. Inkpen, Xuejun Han, Yuhong Guo
The International FLAIRS Conference Proceedings, published 2023-05-08
DOI: 10.32473/flairs.36.133326
Citations: 0
Abstract
Pre-trained word embeddings are essential in natural language processing (NLP). In recent years, many post-processing algorithms have been proposed to improve pre-trained word embeddings. We present a novel method, the Orthogonal Auto Encoder with Variational Dropout (OAEVD), which improves word embeddings by combining orthogonal autoencoders with variational dropout. Specifically, the orthogonality constraint encourages greater diversity in the latent space and increases semantic similarity between related words, while variational dropout makes the model more robust to overfitting. Empirical evaluation on a range of downstream NLP tasks, including semantic similarity, text classification, and concept categorization, shows that the proposed method effectively improves the quality of pre-trained word embeddings. Moreover, it reduces the dimensionality of pre-trained word embeddings while maintaining high performance.
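The two ingredients named in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the dimensions (300-d input, 150-d latent), the Frobenius-norm form of the orthogonality penalty, and the Gaussian multiplicative-noise form of variational dropout are all assumptions chosen to show one common way each idea is realized.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (not from the paper): 300-d pre-trained
# embeddings compressed to a 150-d latent space.
d_in, d_lat = 300, 150
W = rng.normal(0.0, 0.05, size=(d_in, d_lat))  # encoder weight matrix


def orthogonality_penalty(W):
    """Frobenius-norm penalty ||W^T W - I||_F^2.

    Driving this toward zero pushes the encoder columns toward an
    orthonormal set, one common way to impose the orthogonality
    constraint that encourages diversity in the latent space.
    """
    gram = W.T @ W
    return np.sum((gram - np.eye(W.shape[1])) ** 2)


def variational_dropout(h, log_alpha):
    """Gaussian multiplicative noise: h * (1 + sqrt(alpha) * eps).

    eps ~ N(0, 1); alpha = exp(log_alpha) is a per-unit noise variance
    that would be learned during training in variational dropout.
    """
    alpha = np.exp(log_alpha)
    eps = rng.standard_normal(h.shape)
    return h * (1.0 + np.sqrt(alpha) * eps)


# Encode a batch of stand-in "pre-trained embeddings" (random here).
X = rng.normal(size=(8, d_in))
H = variational_dropout(X @ W, log_alpha=np.full(d_lat, -3.0))

print(H.shape)  # latent batch: (8, 150)
# A matrix with exactly orthonormal columns incurs (near-)zero penalty.
print(orthogonality_penalty(np.eye(d_in)[:, :d_lat]))
```

In a full autoencoder, the penalty would be added to the reconstruction loss, and `log_alpha` would be optimized jointly with the weights rather than fixed as it is in this sketch.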