{"title":"A Relation Enhanced Model For Abstractive Dialogue Summarization","authors":"Pengyao Yi, Ruifang Liu","doi":"10.1109/CyberC55534.2022.00047","DOIUrl":null,"url":null,"abstract":"Traditional document summarization models perform less satisfactorily on dialogues due to the complex personal pronouns referential relationships and insufficient modeling of conversation. To address this problem, we propose a novel end-to-end Transformer-based model for abstractive dialogue summarization with Relation Enhanced method based on BART named RE-BART. Our model leverages local relation and global relation in a conversation to model dialogue and to generate better summaries. In detail, we consider that the verb and related arguments in a single utterance contribute to the local event for encoding the dialogue. And coreference information in a whole conversation represents the global relation which helps to trace the topic and information flow of the speakers. Then we design a dialogue relation enhanced model for modeling both information. 
Experiments on the SAMsum dataset show that our model outperforms various dialogue summarization approaches and achieves new state-of- the-art ROUGE results.","PeriodicalId":234632,"journal":{"name":"2022 International Conference on Cyber-Enabled Distributed Computing and Knowledge Discovery (CyberC)","volume":"21 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 International Conference on Cyber-Enabled Distributed Computing and Knowledge Discovery (CyberC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CyberC55534.2022.00047","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Traditional document summarization models perform less satisfactorily on dialogues because of the complex referential relationships among personal pronouns and the insufficient modeling of conversation structure. To address this problem, we propose RE-BART, a novel end-to-end Transformer-based model for abstractive dialogue summarization that enhances BART with relation information. Our model leverages both local and global relations in a conversation to model the dialogue and generate better summaries. Specifically, the verb and its related arguments in a single utterance form a local event used to encode the dialogue, while coreference information across the whole conversation represents the global relation, which helps trace the topic and the information flow among the speakers. We then design a dialogue relation enhanced model that captures both kinds of information. Experiments on the SAMSum dataset show that our model outperforms various dialogue summarization approaches and achieves new state-of-the-art ROUGE results.
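As a minimal illustrative sketch (not the authors' code; all names here are hypothetical), the two kinds of relations the abstract describes can be pictured as simple data structures: each utterance contributes a predicate-argument tuple (the local event), while coreference chains spanning the whole conversation form the global relation.

```python
from dataclasses import dataclass, field

@dataclass
class LocalEvent:
    # Local relation: the verb and its related arguments
    # within a single utterance, plus who said it.
    speaker: str
    predicate: str
    arguments: list

@dataclass
class DialogueRelations:
    # One local event per utterance (possibly more).
    events: list = field(default_factory=list)
    # Global relation: coreference chains over the whole
    # conversation, mapping an entity to its mentions.
    coref_chains: dict = field(default_factory=dict)

# Toy dialogue:
#   Amanda: "I baked cookies."
#   Jerry:  "She baked them yesterday."
rel = DialogueRelations()
rel.events.append(LocalEvent("Amanda", "baked", ["I", "cookies"]))
rel.events.append(LocalEvent("Jerry", "baked", ["She", "them"]))
rel.coref_chains["Amanda"] = ["I", "She"]
rel.coref_chains["cookies"] = ["them"]

print(len(rel.events))             # 2
print(rel.coref_chains["Amanda"])  # ['I', 'She']
```

In the paper's setting these structures would be fed to the relation-enhanced encoder alongside the raw utterances; the sketch only shows the shape of the information, not how RE-BART consumes it.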