{"title":"会话问题生成中以答案为中心的局部和全局信息融合","authors":"Panpan Lei, Xiao Sun","doi":"10.1109/ICKG52313.2021.00067","DOIUrl":null,"url":null,"abstract":"Conversational Question Generation (CQG) is a new concern in Question Generation (QG) study. Recently Seq2Seq neural network model has been widely used in the QG area. CQG model is also based on the Seq2Seq neural network model. We note a problem: the CQG model's input is not a single sentence, but a long text and conversation history. Seq2Seq model can't effectively process long input, the model will generate questions not related to the answer. To solve this problem, we propose an answer-centric local and global information fusion model. We extract the evidence sentence containing the answer in the passage and encode the evidence sentence and the passage information separately. On the one hand, we add answer-centered position tags in the passage to reinforce the attention of information related to the answer. On the other hand, we put the key sentence into the question type prediction model. By combining the answer position embedding to predict the question type, and then put the predicted question types in the key sentence to guide the generation of the question. Finally, we use a gate mechanism to merge key sentence information and passage information. The experimental results show that we have achieved better results.","PeriodicalId":174126,"journal":{"name":"2021 IEEE International Conference on Big Knowledge (ICBK)","volume":"15 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Answer-Centric Local and Global Information Fusion for Conversational Question Generation\",\"authors\":\"Panpan Lei, Xiao Sun\",\"doi\":\"10.1109/ICKG52313.2021.00067\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Conversational Question Generation (CQG) is a new concern in Question Generation (QG) study. Recently Seq2Seq neural network model has been widely used in the QG area. CQG model is also based on the Seq2Seq neural network model. We note a problem: the CQG model's input is not a single sentence, but a long text and conversation history. Seq2Seq model can't effectively process long input, the model will generate questions not related to the answer. To solve this problem, we propose an answer-centric local and global information fusion model. We extract the evidence sentence containing the answer in the passage and encode the evidence sentence and the passage information separately. On the one hand, we add answer-centered position tags in the passage to reinforce the attention of information related to the answer. On the other hand, we put the key sentence into the question type prediction model. By combining the answer position embedding to predict the question type, and then put the predicted question types in the key sentence to guide the generation of the question. Finally, we use a gate mechanism to merge key sentence information and passage information. 
The experimental results show that we have achieved better results.\",\"PeriodicalId\":174126,\"journal\":{\"name\":\"2021 IEEE International Conference on Big Knowledge (ICBK)\",\"volume\":\"15 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 IEEE International Conference on Big Knowledge (ICBK)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICKG52313.2021.00067\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE International Conference on Big Knowledge (ICBK)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICKG52313.2021.00067","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Answer-Centric Local and Global Information Fusion for Conversational Question Generation
Conversational Question Generation (CQG) is an emerging topic within Question Generation (QG) research. Sequence-to-sequence (Seq2Seq) neural models have recently been widely adopted for QG, and CQG models likewise build on the Seq2Seq architecture. We note a problem: the input to a CQG model is not a single sentence but a long passage together with the conversation history. Because Seq2Seq models cannot process such long inputs effectively, they tend to generate questions unrelated to the answer. To address this problem, we propose an answer-centric local and global information fusion model. We extract the evidence sentence containing the answer from the passage and encode the evidence sentence and the passage separately. On the one hand, we add answer-centered position tags to the passage to strengthen attention to answer-related information. On the other hand, we feed the key sentence, combined with the answer position embedding, into a question type prediction model, and then place the predicted question type in the key sentence to guide question generation. Finally, we use a gate mechanism to merge the key sentence information with the passage information. Experimental results show that our model achieves better performance.
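The gate mechanism in the final step corresponds to a standard gated fusion of two encoder views. Below is a minimal PyTorch sketch, assuming per-dimension sigmoid gating over the concatenated local (evidence-sentence) and global (passage) states; the class name, dimensions, and layer layout are illustrative assumptions, not taken from the paper.

    import torch
    import torch.nn as nn

    class GatedFusion(nn.Module):
        # Hypothetical sketch of the gate mechanism that merges a local
        # (evidence-sentence) representation with a global (passage)
        # representation. Names and sizes are illustrative, not from the paper.
        def __init__(self, hidden_size: int):
            super().__init__()
            # The gate is computed from both views; a sigmoid yields
            # per-dimension mixing weights in (0, 1).
            self.gate = nn.Linear(2 * hidden_size, hidden_size)

        def forward(self, local_rep: torch.Tensor, global_rep: torch.Tensor) -> torch.Tensor:
            # g decides, per dimension, how much local vs. global
            # information flows into the fused representation.
            g = torch.sigmoid(self.gate(torch.cat([local_rep, global_rep], dim=-1)))
            return g * local_rep + (1 - g) * global_rep

    # Minimal usage: fuse encoder states for a batch of 2 sequences of length 10.
    fusion = GatedFusion(hidden_size=256)
    evidence_states = torch.randn(2, 10, 256)   # evidence-sentence encoder output
    passage_states = torch.randn(2, 10, 256)    # passage encoder output
    fused = fusion(evidence_states, passage_states)  # shape: (2, 10, 256)

The sigmoid gate lets the model interpolate between the two views dimension by dimension, rather than committing to a hard choice between local and global information.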