Pre-trained Contextualized Representation for Chinese Conversation Topic Classification
Yujun Zhou, Changliang Li, Saike He, Xiaoqi Wang, Yiming Qiu
2019 IEEE International Conference on Intelligence and Security Informatics (ISI), July 2019
DOI: 10.1109/ISI.2019.8823172
Citations: 5
Abstract
Topic classification plays an important role in security-related applications, helping people narrow the data scope and acquire key information quickly. Conversation is one of the most important forms of communication between people, and the utterances in a conversation may contain vital clues such as opinions, emotions, and political slants. To explore more effective approaches for Chinese conversational topic classification, we propose in this paper a neural network architecture built on pre-trained contextualized representations. We first fine-tune a pre-trained BERT model to generate conversational embeddings, which serve as the inputs to our neural network models. We then design several neural network models to extract task-oriented high-level features for topic classification. Experimental results indicate that all models based on our architecture outperform the baseline that is only fine-tuned with the pre-trained BERT model. This demonstrates that pre-trained representations are effective for Chinese conversational topic classification, and that the proposed architecture can further capture salient features from those representations. We release the code and dataset of this paper at https://github.com/njoe9/pretrained_representation.
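The pipeline the abstract describes, a pre-trained contextual encoder producing a conversation embedding, followed by a task-specific network that extracts features and predicts a topic, can be sketched as follows. This is a minimal illustrative sketch, not the authors' released implementation (see their repository for that): the random vector standing in for a fine-tuned BERT encoding, the 768-dimensional hidden size, the layer widths, and the number of topic classes are all assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

HIDDEN = 768      # hidden size of BERT-base, assumed here
NUM_TOPICS = 4    # hypothetical number of topic labels

# Stand-in for the fine-tuned BERT encoder: in the paper, each conversation
# is encoded into a contextualized embedding; here we just use a random vector.
conversation_embedding = rng.normal(size=(1, HIDDEN))

# A simple task-specific head on top of the embedding: one hidden layer
# extracting task-oriented features, then a softmax over topic classes.
W1 = rng.normal(scale=0.02, size=(HIDDEN, 256))
b1 = np.zeros(256)
W2 = rng.normal(scale=0.02, size=(256, NUM_TOPICS))
b2 = np.zeros(NUM_TOPICS)

def classify(x):
    h = np.tanh(x @ W1 + b1)  # task-oriented feature extraction
    logits = h @ W2 + b2
    # numerically stable softmax over topics
    exp = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return exp / exp.sum(axis=-1, keepdims=True)

probs = classify(conversation_embedding)
print(probs.shape)  # one probability distribution over NUM_TOPICS classes
```

In the paper, the feature-extraction layer would be replaced by the proposed neural models, and both the encoder and the head would be trained end to end on the labeled conversation data.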