Wenbin Zhang, Jiaju She, Yingqiu Wang, Meng Zhao, Yi Wang, Chao Liu
{"title":"Multi-hop Knowledge Base Q&A in Integrated Energy Services Based on Intermediate Reasoning Attention","authors":"Wenbin Zhang, Jiaju She, Yingqiu Wang, Meng Zhao, Yi Wang, Chao Liu","doi":"10.1109/ICSAI57119.2022.10005492","DOIUrl":null,"url":null,"abstract":"Knowledge base with multiple hops quizzing aims to discover the subject entity in a question at a distance from the knowledge base’s answer entity for multiple hops. The lack of supervised signals for the intermediate phases of multi-hop inference, which leaves a model only able to get input on the final output, is a significant difficulty for the study, where the inference instructions for the intermediate steps cannot be effectively optimized and the forward propagation of inference states is weakened. Most of the existing research approaches use global attention to motivate the model to learn the inference instructions of each hop, which has been shown to fail to achieve effective performance in weakly supervised tasks. To address this challenge, this paper proposes an intermediate inference attention mechanism to handle multi-hop knowledge base quizzing tasks. Inspired by the human execution of multi-hop quizzing where each hop question is influenced by the previous hop answer, in this approach, the model pays more attention to the inference state generated by the previous hop inference instruction when generating each hop inference instruction, prompting a close interaction between the inference state of the intermediate step and the inference instruction, and providing effective attentional feedback for the optimization of the intermediate step inference instruction. On the KBQA dataset in the integrated energy service domain, which is self-constructed in this research, we conduct comprehensive comparison experiments. 
The findings suggest that the technique we provided achieves optimum performance in this study.","PeriodicalId":339547,"journal":{"name":"2022 8th International Conference on Systems and Informatics (ICSAI)","volume":"97 1-4","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-12-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 8th International Conference on Systems and Informatics (ICSAI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICSAI57119.2022.10005492","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Multi-hop knowledge base question answering (KBQA) aims to find an answer entity that lies several hops away in the knowledge base from the subject entity mentioned in a question. A significant difficulty for this task is the lack of supervision signals for the intermediate steps of multi-hop inference: the model receives feedback only on its final output, so the reasoning instructions for intermediate hops cannot be optimized effectively and the forward propagation of reasoning states is weakened. Most existing approaches use global attention to encourage the model to learn a reasoning instruction for each hop, which has been shown to perform poorly on weakly supervised tasks. To address this challenge, this paper proposes an intermediate reasoning attention mechanism for multi-hop KBQA. Inspired by how humans answer multi-hop questions, where each hop's sub-question depends on the previous hop's answer, the model attends to the reasoning state produced by the previous hop when generating the current hop's reasoning instruction. This promotes close interaction between intermediate reasoning states and reasoning instructions, and provides effective attentional feedback for optimizing the intermediate-step instructions. We conduct comprehensive comparison experiments on a KBQA dataset in the integrated energy service domain that we constructed for this study; the results show that the proposed method achieves the best performance among the compared approaches.
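The core idea, as the abstract describes it, is that each hop's reasoning instruction is generated while attending to the previous hop's reasoning state. The paper's actual formulation is not given here, so the following is only a minimal NumPy sketch of that pattern under assumed shapes and names: question tokens are scored against the previous hop's state through a hypothetical bilinear form `W`, and the new instruction is the attention-weighted mixture of the token encodings.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def next_instruction(question_tokens, prev_state, W):
    """One hop of intermediate-reasoning attention (illustrative only).

    question_tokens: (T, d) encodings of the question's T tokens
    prev_state:      (d,)   reasoning state from the previous hop
    W:               (d, d) hypothetical bilinear scoring matrix
    """
    # Score each question token against the previous hop's state,
    # so the new instruction focuses on what that state makes relevant.
    scores = question_tokens @ W @ prev_state          # shape (T,)
    weights = softmax(scores)                          # shape (T,)
    # The hop's instruction is the attention-weighted token mixture.
    return weights @ question_tokens                   # shape (d,)

rng = np.random.default_rng(0)
d, T = 8, 5
tokens = rng.normal(size=(T, d))   # stand-in question encodings
state = rng.normal(size=d)         # stand-in previous reasoning state
W = np.eye(d)                      # identity as a placeholder scorer
q1 = next_instruction(tokens, state, W)
```

Chaining `next_instruction` across hops, with each returned instruction used to update the reasoning state over the knowledge base, would give the per-hop interaction loop the abstract refers to.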