Answer-Centric Local and Global Information Fusion for Conversational Question Generation

Panpan Lei, Xiao Sun
DOI: 10.1109/ICKG52313.2021.00067
Published in: 2021 IEEE International Conference on Big Knowledge (ICBK)
Publication date: 2021-12-01
Citations: 0

Abstract

Conversational Question Generation (CQG) is an emerging concern in Question Generation (QG) research. Seq2Seq neural network models have recently been widely used in QG, and CQG models are likewise built on the Seq2Seq architecture. We note a problem: the input to a CQG model is not a single sentence but a long passage together with the conversation history. Because a Seq2Seq model cannot process long inputs effectively, it tends to generate questions unrelated to the answer. To address this, we propose an answer-centric local and global information fusion model. We extract the evidence sentence containing the answer from the passage and encode the evidence sentence and the passage separately. On the one hand, we add answer-centered position tags to the passage to reinforce attention on information related to the answer. On the other hand, we feed the key sentence into a question-type prediction model, combining it with the answer position embedding to predict the question type, and then place the predicted question type in the key sentence to guide question generation. Finally, we use a gate mechanism to merge the key-sentence information with the passage information. Experimental results show that our model achieves improved performance.
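The abstract's final step, merging key-sentence and passage representations through a gate mechanism, typically means a learned sigmoid gate that interpolates between the two encodings. The paper does not give the exact formulation, so the following is a minimal illustrative sketch under the common assumption that the gate is computed from the concatenation of the two vectors (the dimensions and random weights here are purely for demonstration):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

d = 8  # hidden size (illustrative)
h_key = rng.standard_normal(d)      # encoding of the evidence/key sentence
h_passage = rng.standard_normal(d)  # encoding of the full passage

# Learned fusion parameters (randomly initialized here for the sketch)
W = rng.standard_normal((d, 2 * d))
b = np.zeros(d)

# Gate in (0, 1), computed from both encodings
g = sigmoid(W @ np.concatenate([h_key, h_passage]) + b)

# Fused representation: elementwise interpolation between the two sources
h_fused = g * h_key + (1.0 - g) * h_passage
```

Because the gate is a convex combination per dimension, each element of `h_fused` lies between the corresponding elements of the key-sentence and passage encodings, letting the model softly favor local (evidence) or global (passage) information.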