{"title":"基于流的变分序列自动编码器","authors":"Jen-Tzung Chien, Tien-Ching Luo","doi":"10.23919/APSIPAASC55919.2022.9979970","DOIUrl":null,"url":null,"abstract":"Posterior collapse, also known as the Kullback-Leibler (KL) vanishing, is a long-standing problem in variational recurrent autoencoder (VRAE) which is essentially developed for sequence generation. To alleviate the vanishing problem, a complicated latent variable is required instead of assuming it as standard Gaussian. Normalizing flow was proposed to build the bijective neural network which converts a simple distribution into a complex distribution. The resulting approximate posterior is closer to real posterior for better sequence generation. The KL divergence in learning objective is accordingly preserved to enrich the capability of generating the diverse sequences. This paper presents the flow-based VRAE to build the disentangled latent representation for sequence generation. KL preserving flows are exploited for conditional VRAE and evaluated for text representation as well as dialogue generation. In the im-plementation, the schemes of amortized regularization and skip connection are further imposed to strengthen the embedding and prediction. Experiments on different tasks show the merit of this latent variable representation for language modeling, sentiment classification and dialogue generation.","PeriodicalId":382967,"journal":{"name":"2022 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC)","volume":"11 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-11-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Flow-Based Variational Sequence Autoencoder\",\"authors\":\"Jen-Tzung Chien, Tien-Ching Luo\",\"doi\":\"10.23919/APSIPAASC55919.2022.9979970\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Posterior collapse, also known as the Kullback-Leibler (KL) vanishing, is a long-standing problem in variational recurrent autoencoder (VRAE) which is essentially developed for sequence generation. To alleviate the vanishing problem, a complicated latent variable is required instead of assuming it as standard Gaussian. Normalizing flow was proposed to build the bijective neural network which converts a simple distribution into a complex distribution. The resulting approximate posterior is closer to real posterior for better sequence generation. The KL divergence in learning objective is accordingly preserved to enrich the capability of generating the diverse sequences. This paper presents the flow-based VRAE to build the disentangled latent representation for sequence generation. KL preserving flows are exploited for conditional VRAE and evaluated for text representation as well as dialogue generation. In the im-plementation, the schemes of amortized regularization and skip connection are further imposed to strengthen the embedding and prediction. 
Experiments on different tasks show the merit of this latent variable representation for language modeling, sentiment classification and dialogue generation.\",\"PeriodicalId\":382967,\"journal\":{\"name\":\"2022 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC)\",\"volume\":\"11 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-11-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.23919/APSIPAASC55919.2022.9979970\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.23919/APSIPAASC55919.2022.9979970","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Posterior collapse, also known as Kullback-Leibler (KL) vanishing, is a long-standing problem in the variational recurrent autoencoder (VRAE), which is essentially developed for sequence generation. To alleviate this vanishing problem, a more expressive latent variable is required rather than assuming a standard Gaussian. Normalizing flows build a bijective neural network that converts a simple distribution into a complex one, so that the resulting approximate posterior moves closer to the true posterior for better sequence generation. The KL divergence in the learning objective is accordingly preserved, enriching the capability to generate diverse sequences. This paper presents the flow-based VRAE to build a disentangled latent representation for sequence generation. KL-preserving flows are exploited for the conditional VRAE and evaluated on text representation as well as dialogue generation. In the implementation, the schemes of amortized regularization and skip connection are further imposed to strengthen the embedding and prediction. Experiments on different tasks show the merit of this latent variable representation for language modeling, sentiment classification and dialogue generation.
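For context, the mechanism the abstract alludes to is the change-of-variables identity behind normalizing flows. The following is the standard formulation (Rezende and Mohamed, 2015), not a derivation taken from this paper; the symbols z_0, f_k, and K are generic. An encoder sample z_0 ~ q_0(z_0 | x) is passed through K bijections z_K = f_K ∘ ... ∘ f_1(z_0), giving

    \log q_K(\mathbf{z}_K \mid \mathbf{x})
      = \log q_0(\mathbf{z}_0 \mid \mathbf{x})
      - \sum_{k=1}^{K} \log \left| \det \frac{\partial f_k}{\partial \mathbf{z}_{k-1}} \right|

so the evidence lower bound becomes

    \mathcal{L} = \mathbb{E}_{q_0(\mathbf{z}_0 \mid \mathbf{x})}
      \Big[ \log p(\mathbf{x} \mid \mathbf{z}_K) + \log p(\mathbf{z}_K)
      - \log q_0(\mathbf{z}_0 \mid \mathbf{x})
      + \sum_{k=1}^{K} \log \Big| \det \frac{\partial f_k}{\partial \mathbf{z}_{k-1}} \Big| \Big].

Because q_K is richer than a standard Gaussian, the KL term between the transformed posterior and the prior need not collapse to zero, which is the sense in which the divergence is "preserved" in the objective.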
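As a concrete illustration, below is a minimal PyTorch sketch of one planar flow step, a common bijective building block for this kind of posterior. It is an assumption for illustration only: the paper's KL-preserving flows may use a different parameterization, and the class name, dimensions, and usage here are hypothetical.

    import torch
    import torch.nn as nn

    class PlanarFlow(nn.Module):
        """One planar flow step f(z) = z + u * tanh(w^T z + b).

        Standard construction from Rezende and Mohamed (2015); shown
        as generic context, not as the flow used in this paper. The
        invertibility constraint w^T u >= -1 (usually enforced by
        reparameterizing u) is omitted here for brevity.
        """
        def __init__(self, dim):
            super().__init__()
            self.u = nn.Parameter(torch.randn(dim) * 0.01)
            self.w = nn.Parameter(torch.randn(dim) * 0.01)
            self.b = nn.Parameter(torch.zeros(1))

        def forward(self, z):
            # z: (batch, dim)
            lin = z @ self.w + self.b                      # (batch,)
            f_z = z + self.u * torch.tanh(lin).unsqueeze(-1)
            # log |det df/dz| = log |1 + u^T psi(z)|,
            # with psi(z) = (1 - tanh^2(w^T z + b)) * w
            psi = (1 - torch.tanh(lin) ** 2).unsqueeze(-1) * self.w
            log_det = torch.log(torch.abs(1 + psi @ self.u) + 1e-8)
            return f_z, log_det

    # Usage sketch: transform the encoder's Gaussian sample through K flows,
    # accumulating the log-determinants that enter the ELBO above.
    flows = nn.ModuleList([PlanarFlow(16) for _ in range(4)])
    z = torch.randn(8, 16)            # stand-in for the reparameterized z_0
    sum_log_det = torch.zeros(8)
    for flow in flows:
        z, log_det = flow(z)
        sum_log_det += log_det
    # log q_K(z_K | x) = log q_0(z_0 | x) - sum_log_det

In a VRAE, q_0 would come from the recurrent encoder's predicted mean and variance; the accumulated sum_log_det is subtracted from log q_0 when computing the KL term of the objective.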