Seq2Seq-Attention based text topic representation
Linjie Xia, Qiaoling Shen, Zicheng Wang, Yi Wang, Dewei Shu, Haipeng Li
2023 5th International Conference on Communications, Information System and Computer Engineering (CISCE), published 2023-04-14
DOI: 10.1109/CISCE58541.2023.10142391 (https://doi.org/10.1109/CISCE58541.2023.10142391)
Abstract
To address the problem that keywords alone fail to represent the theme effectively in topic discovery tasks, this paper proposes a text topic representation method based on Seq2Seq-Attention, aiming to produce a concise and refined topic representation. The Encoder module uses a two-layer bidirectional recurrent neural network to extract features from the input text, while the Decoder module uses a two-layer unidirectional recurrent neural network combined with an attention mechanism to perform decoding and output the desired sequence of topic characters. Finally, the effectiveness of the method is verified through comparative experiments.
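The abstract gives only the high-level architecture, not implementation details. Below is a minimal PyTorch sketch of that architecture as described: a two-layer bidirectional RNN encoder and a two-layer unidirectional RNN decoder that attends over the encoder outputs at each step. The choice of GRU cells, additive (Bahdanau-style) attention, and all dimensions and names are assumptions for illustration; the paper does not specify the cell type or attention variant.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class Encoder(nn.Module):
    """Two-layer bidirectional GRU over embedded input tokens (assumed cell type)."""
    def __init__(self, vocab_size, emb_dim, hid_dim):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, num_layers=2,
                          bidirectional=True, batch_first=True)
        # Project concatenated forward/backward states down to the decoder's size.
        self.proj = nn.Linear(2 * hid_dim, hid_dim)

    def forward(self, src):                       # src: (batch, src_len)
        outputs, hidden = self.rnn(self.embedding(src))
        # hidden: (4, batch, hid_dim) = 2 layers x 2 directions, ordered
        # [layer0_fwd, layer0_bwd, layer1_fwd, layer1_bwd]; fuse per layer.
        hidden = torch.tanh(self.proj(
            torch.cat([hidden[0::2], hidden[1::2]], dim=2)))  # (2, batch, hid_dim)
        return outputs, hidden                    # outputs: (batch, src_len, 2*hid_dim)


class Attention(nn.Module):
    """Additive (Bahdanau-style) attention over encoder outputs (assumed variant)."""
    def __init__(self, hid_dim):
        super().__init__()
        self.attn = nn.Linear(hid_dim + 2 * hid_dim, hid_dim)
        self.v = nn.Linear(hid_dim, 1, bias=False)

    def forward(self, dec_hidden, enc_outputs):   # dec_hidden: (batch, hid_dim)
        src_len = enc_outputs.size(1)
        dec_hidden = dec_hidden.unsqueeze(1).expand(-1, src_len, -1)
        energy = torch.tanh(self.attn(torch.cat([dec_hidden, enc_outputs], dim=2)))
        return F.softmax(self.v(energy).squeeze(2), dim=1)  # (batch, src_len)


class Decoder(nn.Module):
    """Two-layer unidirectional GRU; each step attends over the encoder outputs."""
    def __init__(self, vocab_size, emb_dim, hid_dim):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.attention = Attention(hid_dim)
        self.rnn = nn.GRU(emb_dim + 2 * hid_dim, hid_dim,
                          num_layers=2, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, token, hidden, enc_outputs):  # token: (batch, 1)
        embedded = self.embedding(token)            # (batch, 1, emb_dim)
        weights = self.attention(hidden[-1], enc_outputs)       # (batch, src_len)
        context = torch.bmm(weights.unsqueeze(1), enc_outputs)  # (batch, 1, 2*hid_dim)
        output, hidden = self.rnn(torch.cat([embedded, context], dim=2), hidden)
        return self.out(output.squeeze(1)), hidden  # logits: (batch, vocab_size)


# Smoke test with toy dimensions (all values illustrative).
enc = Encoder(vocab_size=1000, emb_dim=64, hid_dim=128)
dec = Decoder(vocab_size=1000, emb_dim=64, hid_dim=128)
src = torch.randint(0, 1000, (8, 20))
enc_out, hidden = enc(src)
logits, hidden = dec(torch.zeros(8, 1, dtype=torch.long), hidden, enc_out)
print(logits.shape)  # torch.Size([8, 1000])
```

At training time the decoder step above would typically be run in a loop with teacher forcing; at inference, the previous step's argmax (or a beam search) feeds the next `token`. Both are standard Seq2Seq practice rather than details taken from the paper.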