Adaptive knowledge selection in dialogue systems: Accommodating diverse knowledge types, requirements, and generation models

IF 6.3 · CAS Tier 1 (Computer Science) · Q1 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Yao Zhang, Lang Qin, Zhongtian Bao, Hongru Liang, Jun Wang, Zhenglu Yang, Zhe Sun, Andrzej Cichocki
{"title":"Adaptive knowledge selection in dialogue systems: Accommodating diverse knowledge types, requirements, and generation models","authors":"Yao Zhang ,&nbsp;Lang Qin ,&nbsp;Zhongtian Bao ,&nbsp;Hongru Liang ,&nbsp;Jun Wang ,&nbsp;Zhenglu Yang ,&nbsp;Zhe Sun ,&nbsp;Andrzej Cichocki","doi":"10.1016/j.neunet.2025.108133","DOIUrl":null,"url":null,"abstract":"<div><div>Effective knowledge-grounded dialogue systems rely heavily on accurate knowledge selection. This paper begins with an innovative new perspective that categorizes research on knowledge selection based on when knowledge is selected in relation to response generation: pre-, joint-, and post-selection. Among these, pre-selection is of great interest nowadays because they endeavor to provide sufficiently relevant knowledge inputs for downstream response generation models in advance. This reduces the burden of learning, adjusting, and interpreting for the subsequent response generation models, particularly for Large Language Models. Current knowledge pre-selection methods, however, still face three significant challenges: how to cope with different types of knowledge, adapt to the various knowledge requirements in different dialogue contexts, and adapt to different generation models. To resolve the above challenges, we propose ASK, an adaptive knowledge pre-selection method. It unifies various types of knowledge, scores their relevance and contribution to generating desired responses, and adapts the knowledge pool size to ensure the optimal amount is available for generation models. ASK is enhanced by leveraging rewards for selecting appropriate knowledge in both quality and quantity, through a reinforcement learning framework. We perform exhaustive experiments on two benchmarks (WoW and OpenDialKG) and get the following conclusions: 1) ASK has excellent knowledge selection capabilities on diverse knowledge types and requirements. 2) ASK significantly enhances the performance of various downstream generation models, including ChatGPT and GPT-4o. 3) The lightweight improvement of ASK saves 40 % of the computational consumption. Code is available at <span><span>https://github.com/AnonymousCode32213/ASK</span><svg><path></path></svg></span>.</div></div>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":"194 ","pages":"Article 108133"},"PeriodicalIF":6.3000,"publicationDate":"2025-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0893608025010135","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Effective knowledge-grounded dialogue systems rely heavily on accurate knowledge selection. This paper begins with a new perspective that categorizes research on knowledge selection based on when knowledge is selected in relation to response generation: pre-, joint-, and post-selection. Among these, pre-selection is of particular interest because such methods endeavor to provide sufficiently relevant knowledge inputs for downstream response generation models in advance. This reduces the burden of learning, adjusting, and interpreting for the subsequent response generation models, particularly for Large Language Models. Current knowledge pre-selection methods, however, still face three significant challenges: how to cope with different types of knowledge, adapt to the varying knowledge requirements of different dialogue contexts, and adapt to different generation models. To resolve these challenges, we propose ASK, an adaptive knowledge pre-selection method. It unifies various types of knowledge, scores their relevance and contribution to generating desired responses, and adapts the knowledge pool size to ensure the optimal amount is available to generation models. ASK is further enhanced through a reinforcement learning framework that rewards the selection of knowledge that is appropriate in both quality and quantity. We perform extensive experiments on two benchmarks (WoW and OpenDialKG) and draw the following conclusions: 1) ASK has excellent knowledge selection capabilities across diverse knowledge types and requirements. 2) ASK significantly enhances the performance of various downstream generation models, including ChatGPT and GPT-4o. 3) The lightweight design of ASK saves 40% of computational consumption. Code is available at https://github.com/AnonymousCode32213/ASK.
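The adaptive pre-selection idea described in the abstract can be illustrated with a short sketch: unify heterogeneous knowledge (e.g., passages and knowledge-graph triples) into text, score each item against the dialogue context, and keep a variable-size pool rather than a fixed top-k. Everything below is an illustrative assumption rather than the authors' implementation (see the linked repository for the actual code): the toy lexical-overlap scorer, the threshold rule, and names such as `KnowledgeItem` and `adaptive_select` are invented for this sketch, and the reinforcement-learning rewards ASK uses to train its selection policy are omitted.

```python
# Minimal sketch of adaptive knowledge pre-selection in the spirit of ASK.
# NOT the paper's implementation: the scorer, threshold rule, and names here
# are illustrative assumptions; ASK learns its policy with RL rewards.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class KnowledgeItem:
    text: str    # unified textual form (passage sentence or verbalized KG triple)
    source: str  # e.g. "wiki_passage" or "kg_triple"


def verbalize_triple(head: str, relation: str, tail: str) -> str:
    """Unify a KG triple into plain text so it can be scored like a passage."""
    return f"{head} {relation.replace('_', ' ')} {tail}"


def relevance_score(context: str, item: KnowledgeItem) -> float:
    """Toy lexical-overlap scorer standing in for a learned relevance /
    contribution model; a trained encoder would replace this."""
    ctx_tokens = set(context.lower().split())
    item_tokens = set(item.text.lower().split())
    return len(ctx_tokens & item_tokens) / max(len(item_tokens), 1)


def adaptive_select(context: str,
                    pool: List[KnowledgeItem],
                    rel_threshold: float = 0.2,
                    max_items: int = 5) -> List[Tuple[KnowledgeItem, float]]:
    """Keep only items whose score clears a threshold, so the pool size
    adapts to how much relevant knowledge the dialogue context needs."""
    scored = sorted(((item, relevance_score(context, item)) for item in pool),
                    key=lambda pair: pair[1], reverse=True)
    return [(item, s) for item, s in scored if s >= rel_threshold][:max_items]


if __name__ == "__main__":
    context = "Who directed Inception and what else did he make?"
    pool = [
        KnowledgeItem(verbalize_triple("Inception", "directed_by", "Christopher Nolan"), "kg_triple"),
        KnowledgeItem("Christopher Nolan also directed Interstellar and Dunkirk.", "wiki_passage"),
        KnowledgeItem("Paris is the capital of France.", "wiki_passage"),
    ]
    for item, score in adaptive_select(context, pool):
        print(f"{score:.2f}  [{item.source}]  {item.text}")
```

In this toy run the two Nolan-related items clear the threshold while the unrelated sentence is dropped, so the selected pool shrinks or grows with the context instead of always returning a fixed top-k; in ASK the scoring and pool sizing are learned rather than hand-set.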
Source journal: Neural Networks (Engineering Technology - Computer Science: Artificial Intelligence)
CiteScore: 13.90
Self-citation rate: 7.70%
Articles per year: 425
Review time: 67 days
Journal description: Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. The journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, it aims to encourage the development of biologically-inspired artificial intelligence.