{"title":"会话式问答的显式历史选择","authors":"Zhiyuan Zhang, Qiaoqiao Feng, Yujie Wang","doi":"10.1109/ICTAI56018.2022.00212","DOIUrl":null,"url":null,"abstract":"Topic shift is very common in multi-turn dialogues, making it a great challenge in the filed of conversational question answering. Existing methods usually select the most adjacent turns as history information, however, it is useless or even harmful in case of topic shift. This paper proposes two explicit history selection models: SHSM and DHSM, to address this issue. The former is a simple history selection model, which only selects $\\boldsymbol{k}$ previous history turns; and the latter is a dependent history selection model, which selects the most relevant $\\boldsymbol{k}$ history turns through a turn-dependent graph. The two models are then trained in a consistency framework. Experimental results on QuAC show that our model can cope with topic shift problem, and it outperforms existing state-of-the-art methods by 0.8 on $\\boldsymbol{F}_{\\mathbf{1}}$ score, 0.7 on HEQ-Q score, and 1.4 on HEQ-D score.","PeriodicalId":354314,"journal":{"name":"2022 IEEE 34th International Conference on Tools with Artificial Intelligence (ICTAI)","volume":"65 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Explicit History Selection for Conversational Question Answering\",\"authors\":\"Zhiyuan Zhang, Qiaoqiao Feng, Yujie Wang\",\"doi\":\"10.1109/ICTAI56018.2022.00212\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Topic shift is very common in multi-turn dialogues, making it a great challenge in the filed of conversational question answering. Existing methods usually select the most adjacent turns as history information, however, it is useless or even harmful in case of topic shift. 
This paper proposes two explicit history selection models: SHSM and DHSM, to address this issue. The former is a simple history selection model, which only selects $\\\\boldsymbol{k}$ previous history turns; and the latter is a dependent history selection model, which selects the most relevant $\\\\boldsymbol{k}$ history turns through a turn-dependent graph. The two models are then trained in a consistency framework. Experimental results on QuAC show that our model can cope with topic shift problem, and it outperforms existing state-of-the-art methods by 0.8 on $\\\\boldsymbol{F}_{\\\\mathbf{1}}$ score, 0.7 on HEQ-Q score, and 1.4 on HEQ-D score.\",\"PeriodicalId\":354314,\"journal\":{\"name\":\"2022 IEEE 34th International Conference on Tools with Artificial Intelligence (ICTAI)\",\"volume\":\"65 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 IEEE 34th International Conference on Tools with Artificial Intelligence (ICTAI)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICTAI56018.2022.00212\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE 34th International Conference on Tools with Artificial Intelligence (ICTAI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICTAI56018.2022.00212","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Explicit History Selection for Conversational Question Answering
Topic shift is very common in multi-turn dialogues, making it a great challenge in the field of conversational question answering. Existing methods usually select the most recent turns as history information, but these turns are useless or even harmful when the topic shifts. This paper proposes two explicit history selection models, SHSM and DHSM, to address this issue. The former is a simple history selection model that selects only the $k$ previous history turns; the latter is a dependent history selection model that selects the $k$ most relevant history turns through a turn-dependent graph. The two models are then trained in a consistency framework. Experimental results on QuAC show that our model copes with the topic-shift problem and outperforms existing state-of-the-art methods by 0.8 on $F_1$ score, 0.7 on HEQ-Q score, and 1.4 on HEQ-D score.
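The contrast between the two selection strategies can be illustrated with a toy sketch. This is not the paper's implementation: SHSM and DHSM are neural models trained jointly in a consistency framework, and DHSM learns relevance through a turn-dependent graph. Here, naive token overlap stands in for learned relevance, purely to show why recency-based selection fails under topic shift while relevance-based selection does not.

```python
def select_history_simple(history, k):
    """SHSM-style selection: take the k most recent history turns."""
    return history[-k:]

def select_history_relevant(history, question, k):
    """DHSM-style selection: take the k history turns most relevant to the
    current question. Relevance is scored here by token overlap, which is
    an illustrative assumption, not the paper's learned graph model."""
    q_tokens = set(question.lower().split())
    scored = sorted(
        enumerate(history),
        key=lambda item: len(q_tokens & set(item[1].lower().split())),
        reverse=True,
    )
    # Restore original dialogue order among the selected turns.
    picked = sorted(i for i, _ in scored[:k])
    return [history[i] for i in picked]

# A dialogue whose topic shifts away from the original question:
history = [
    "Who wrote Hamlet?",
    "Shakespeare.",
    "When was it written?",
    "Around 1600.",
    "What is the capital of France?",
]
question = "Who wrote Macbeth?"

# Recency picks the off-topic tail; relevance recovers the related turn.
print(select_history_simple(history, 2))
print(select_history_relevant(history, question, 2))
```

Under topic shift, the recency-based selector returns the last two (off-topic) turns, while the relevance-based selector surfaces the earlier Shakespeare-related turn, which is the behavior the abstract attributes to DHSM.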