{"title":"ASMAM: An Answer Summarization Mechanism Based on Multi-layer Attention Model","authors":"","doi":"10.25236/ajcis.2023.060901","DOIUrl":null,"url":null,"abstract":"At present, deep learning technologies have been widely used in the field of natural language process, such as text summarization. In Community Question Answering (CQA), the answer summary could help users get a complete answer quickly. There are still some problems with the current answer summary mechanism, such as semantic inconsistency, repetition of words, etc. In order to solve this, we propose a novel mechanism called ASMAM, which stands for Answer Summarization based on Multi-layer Attention Mechanism. Based on the traditional sequence to sequence (seq2seq), we introduce self-attention and multi-head attention mechanism respectively during sentence and text encoding, which could improve text representation ability of the model. In order to solve “long distance dependence” of Recurrent Neural Network (RNN) and too many parameters of Long Short-Term Memory (LSTM), we all use gated recurrent unit (GRU) as the neuron at the encoder and decoder sides. Experiments over the Yahoo! Answers dataset demonstrate that the coherence and fluency of the generated summary are all superior to the benchmark model in ROUGE evaluation system.","PeriodicalId":387664,"journal":{"name":"Academic Journal of Computing & Information Science","volume":"37 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Academic Journal of Computing & Information Science","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.25236/ajcis.2023.060901","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
At present, deep learning technologies are widely used in natural language processing tasks such as text summarization. In Community Question Answering (CQA), answer summarization can help users obtain a complete answer quickly. However, current answer summarization mechanisms still suffer from problems such as semantic inconsistency and word repetition. To address these issues, we propose a novel mechanism called ASMAM, which stands for Answer Summarization based on Multi-layer Attention Mechanism. Building on the traditional sequence-to-sequence (seq2seq) framework, we introduce a self-attention mechanism during sentence encoding and a multi-head attention mechanism during text encoding, which improves the text representation ability of the model. To mitigate the long-distance dependency problem of Recurrent Neural Networks (RNNs) and the large parameter count of Long Short-Term Memory (LSTM) networks, we use the gated recurrent unit (GRU) as the basic cell on both the encoder and decoder sides. Experiments on the Yahoo! Answers dataset demonstrate that the coherence and fluency of the generated summaries are superior to those of the benchmark models under the ROUGE evaluation metrics.
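The abstract outlines the overall architecture but not its implementation. The sketch below is a minimal, hypothetical PyTorch rendering of the described components, not the authors' code: a GRU encoder whose states pass through a self-attention layer (sentence level) and then a multi-head attention layer (text level), feeding a GRU decoder. The class name, layer sizes, the flattening of the sentence/text hierarchy into one token sequence, and the mean-pooled context used by the decoder are all illustrative assumptions.

```python
# Hypothetical sketch of the ASMAM-style architecture described in the abstract.
# Not the authors' implementation; hyperparameters and hierarchy are simplified.
import torch
import torch.nn as nn


class ASMAMSketch(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256, num_heads=4):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        # Sentence-level GRU encoder followed by self-attention over its states.
        self.sent_gru = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.self_attn = nn.MultiheadAttention(hid_dim, num_heads=1, batch_first=True)
        # Text-level multi-head attention over the attended encoder states.
        self.text_attn = nn.MultiheadAttention(hid_dim, num_heads=num_heads, batch_first=True)
        # GRU decoder that generates the summary token by token.
        self.dec_gru = nn.GRU(emb_dim + hid_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, src_tokens, tgt_tokens):
        # src_tokens: (batch, src_len); tgt_tokens: (batch, tgt_len)
        src_emb = self.embedding(src_tokens)
        enc_states, enc_last = self.sent_gru(src_emb)                # (batch, src_len, hid)
        enc_states, _ = self.self_attn(enc_states, enc_states, enc_states)
        context, _ = self.text_attn(enc_states, enc_states, enc_states)

        # For brevity the decoder conditions on a mean-pooled context vector;
        # a step-wise decoder attention would replace this in a full model.
        pooled = context.mean(dim=1, keepdim=True)                   # (batch, 1, hid)
        tgt_emb = self.embedding(tgt_tokens)
        dec_in = torch.cat([tgt_emb, pooled.expand(-1, tgt_emb.size(1), -1)], dim=-1)
        dec_states, _ = self.dec_gru(dec_in, enc_last)
        return self.out(dec_states)                                  # (batch, tgt_len, vocab)


# Smoke test with random token ids.
model = ASMAMSketch(vocab_size=1000)
logits = model(torch.randint(0, 1000, (2, 20)), torch.randint(0, 1000, (2, 8)))
print(logits.shape)  # torch.Size([2, 8, 1000])
```

Using GRU cells instead of LSTM, as the abstract argues, keeps the recurrent parameter count lower while the attention layers carry the long-range dependencies.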