ASMAM: An Answer Summarization Mechanism Based on Multi-layer Attention Model

{"title":"基于多层注意模型的答案汇总机制","authors":"","doi":"10.25236/ajcis.2023.060901","DOIUrl":null,"url":null,"abstract":"At present, deep learning technologies have been widely used in the field of natural language process, such as text summarization. In Community Question Answering (CQA), the answer summary could help users get a complete answer quickly. There are still some problems with the current answer summary mechanism, such as semantic inconsistency, repetition of words, etc. In order to solve this, we propose a novel mechanism called ASMAM, which stands for Answer Summarization based on Multi-layer Attention Mechanism. Based on the traditional sequence to sequence (seq2seq), we introduce self-attention and multi-head attention mechanism respectively during sentence and text encoding, which could improve text representation ability of the model. In order to solve “long distance dependence” of Recurrent Neural Network (RNN) and too many parameters of Long Short-Term Memory (LSTM), we all use gated recurrent unit (GRU) as the neuron at the encoder and decoder sides. Experiments over the Yahoo! Answers dataset demonstrate that the coherence and fluency of the generated summary are all superior to the benchmark model in ROUGE evaluation system.","PeriodicalId":387664,"journal":{"name":"Academic Journal of Computing & Information Science","volume":"37 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"ASMAM: An Answer Summarization Mechanism Based on Multi-layer Attention Model\",\"authors\":\"\",\"doi\":\"10.25236/ajcis.2023.060901\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"At present, deep learning technologies have been widely used in the field of natural language process, such as text summarization. In Community Question Answering (CQA), the answer summary could help users get a complete answer quickly. There are still some problems with the current answer summary mechanism, such as semantic inconsistency, repetition of words, etc. In order to solve this, we propose a novel mechanism called ASMAM, which stands for Answer Summarization based on Multi-layer Attention Mechanism. Based on the traditional sequence to sequence (seq2seq), we introduce self-attention and multi-head attention mechanism respectively during sentence and text encoding, which could improve text representation ability of the model. In order to solve “long distance dependence” of Recurrent Neural Network (RNN) and too many parameters of Long Short-Term Memory (LSTM), we all use gated recurrent unit (GRU) as the neuron at the encoder and decoder sides. Experiments over the Yahoo! 
Answers dataset demonstrate that the coherence and fluency of the generated summary are all superior to the benchmark model in ROUGE evaluation system.\",\"PeriodicalId\":387664,\"journal\":{\"name\":\"Academic Journal of Computing & Information Science\",\"volume\":\"37 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Academic Journal of Computing & Information Science\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.25236/ajcis.2023.060901\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Academic Journal of Computing & Information Science","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.25236/ajcis.2023.060901","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

At present, deep learning technologies are widely used in the field of natural language processing, including text summarization. In Community Question Answering (CQA), an answer summary helps users obtain a complete answer quickly. Current answer summarization mechanisms still suffer from problems such as semantic inconsistency and word repetition. To address these issues, we propose a novel mechanism called ASMAM, which stands for Answer Summarization based on a Multi-layer Attention Mechanism. Building on the traditional sequence-to-sequence (seq2seq) framework, we introduce self-attention during sentence encoding and multi-head attention during text encoding, which improves the text representation ability of the model. To address the long-distance dependence problem of the Recurrent Neural Network (RNN) and the large number of parameters in the Long Short-Term Memory (LSTM) network, we use the gated recurrent unit (GRU) as the recurrent cell on both the encoder and decoder sides. Experiments on the Yahoo! Answers dataset demonstrate that the coherence and fluency of the generated summaries are superior to those of the benchmark model under the ROUGE evaluation metrics.
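To make the described architecture concrete, below is a minimal sketch in PyTorch of a GRU-based seq2seq summarizer with attention, in the spirit of the abstract: a bidirectional GRU encoder whose outputs are refined by multi-head self-attention, and a GRU decoder that attends over the encoder memory. This is not the authors' implementation; the single-level encoder, layer sizes, and class names (`GRUEncoderWithSelfAttention`, `GRUDecoderWithAttention`) are illustrative assumptions, and the paper's separate sentence- and text-level encoding stages are collapsed into one encoder here.

```python
# A minimal sketch of an ASMAM-style summarizer (assumed PyTorch implementation).
# Hyperparameters and the single-level encoder are illustrative, not the paper's.
import torch
import torch.nn as nn


class GRUEncoderWithSelfAttention(nn.Module):
    """Bidirectional GRU encoder followed by multi-head self-attention."""

    def __init__(self, vocab_size, emb_dim=128, hidden_dim=256, num_heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.gru = nn.GRU(emb_dim, hidden_dim, batch_first=True, bidirectional=True)
        # Self-attention over GRU outputs (2 * hidden_dim because the GRU is bidirectional).
        self.self_attn = nn.MultiheadAttention(2 * hidden_dim, num_heads, batch_first=True)

    def forward(self, token_ids):
        x = self.embed(token_ids)                  # (batch, src_len, emb_dim)
        outputs, _ = self.gru(x)                   # (batch, src_len, 2 * hidden_dim)
        attended, _ = self.self_attn(outputs, outputs, outputs)
        return attended                            # contextualized encoder memory


class GRUDecoderWithAttention(nn.Module):
    """GRU decoder that attends over the encoder memory at each step."""

    def __init__(self, vocab_size, emb_dim=128, hidden_dim=512, num_heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.gru = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.cross_attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, target_ids, memory):
        y = self.embed(target_ids)                 # teacher-forced summary tokens
        dec_states, _ = self.gru(y)                # (batch, tgt_len, hidden_dim)
        context, _ = self.cross_attn(dec_states, memory, memory)
        return self.out(dec_states + context)      # vocabulary logits per step


if __name__ == "__main__":
    vocab = 8000
    encoder = GRUEncoderWithSelfAttention(vocab)
    decoder = GRUDecoderWithAttention(vocab)
    src = torch.randint(0, vocab, (2, 40))          # a toy batch of answer texts
    tgt = torch.randint(0, vocab, (2, 12))          # toy summary prefixes
    logits = decoder(tgt, encoder(src))
    print(logits.shape)                             # torch.Size([2, 12, 8000])
```

In this sketch the encoder's bidirectional output width (512) matches the decoder's hidden size, so the multi-head cross-attention can consume the memory directly; a pointer or coverage mechanism for reducing word repetition would be layered on top of these logits.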