K-PERM: Personalized Response Generation Using Dynamic Knowledge Retrieval and Persona-Adaptive Queries

Kanak Raj, Kaushik Roy, Vamshi Bonagiri, Priyanshul Govil, K. Thirunarayan, Raxit Goswami, Manas Gaur
{"title":"K-PERM: Personalized Response Generation Using Dynamic Knowledge Retrieval and Persona-Adaptive Queries","authors":"Kanak Raj, Kaushik Roy, Vamshi Bonagiri, Priyanshul Govil, K. Thirunarayan, Raxit Goswami, Manas Gaur","doi":"10.1609/aaaiss.v3i1.31203","DOIUrl":null,"url":null,"abstract":"Personalizing conversational agents can enhance the quality of conversations and increase user engagement. However, they often lack external knowledge to appropriately tend to a user’s persona. This is crucial for practical applications like mental health support, nutrition planning, culturally sensitive conversations, or reducing toxic behavior in conversational agents. To enhance the relevance and comprehensiveness of personalized responses, we propose using a two-step approach that involves (1) selectively integrating user personas and (2) contextualizing the response by supplementing information from a background knowledge source. We develop K-PERM (Knowledge-guided PErsonalization with Reward Modulation), a dynamic conversational agent that combines these elements. K-PERM achieves state-of-the- art performance on the popular FoCus dataset, containing real-world personalized conversations concerning global landmarks.We show that using responses from K-PERM can improve performance in state-of-the-art LLMs (GPT 3.5) by 10.5%, highlighting the impact of K-PERM for personalizing chatbots.","PeriodicalId":516827,"journal":{"name":"Proceedings of the AAAI Symposium Series","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-05-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the AAAI Symposium Series","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1609/aaaiss.v3i1.31203","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Personalizing conversational agents can enhance the quality of conversations and increase user engagement. However, they often lack external knowledge to appropriately tend to a user’s persona. This is crucial for practical applications like mental health support, nutrition planning, culturally sensitive conversations, or reducing toxic behavior in conversational agents. To enhance the relevance and comprehensiveness of personalized responses, we propose using a two-step approach that involves (1) selectively integrating user personas and (2) contextualizing the response by supplementing information from a background knowledge source. We develop K-PERM (Knowledge-guided PErsonalization with Reward Modulation), a dynamic conversational agent that combines these elements. K-PERM achieves state-of-the-art performance on the popular FoCus dataset, containing real-world personalized conversations concerning global landmarks. We show that using responses from K-PERM can improve performance in state-of-the-art LLMs (GPT 3.5) by 10.5%, highlighting the impact of K-PERM for personalizing chatbots.
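To make the two-step approach concrete, the sketch below illustrates the overall shape of such a pipeline: select the persona sentences most relevant to the user's query, retrieve a supporting passage from a background knowledge source, and assemble both into a generation prompt. This is a minimal illustration only; the token-overlap scorer and all function names are hypothetical stand-ins for K-PERM's learned selection, retrieval, and reward-modulation components, which the abstract does not specify.

```python
# Hypothetical sketch of the two-step pipeline described in the abstract:
# (1) select persona sentences relevant to the query, (2) retrieve a
# background-knowledge passage, then combine both into a prompt for a
# response generator. The similarity scorer is a simple placeholder, not
# the learned modules used by K-PERM.

def jaccard(a: str, b: str) -> float:
    """Token-overlap similarity, used here only as a placeholder scorer."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def select_persona(query: str, persona: list[str], k: int = 2) -> list[str]:
    """Step 1: keep only the persona sentences most relevant to the query."""
    return sorted(persona, key=lambda p: jaccard(query, p), reverse=True)[:k]

def retrieve_knowledge(query: str, passages: list[str]) -> str:
    """Step 2: pick the knowledge passage that best matches the query."""
    return max(passages, key=lambda p: jaccard(query, p))

def build_prompt(query: str, persona: list[str], passages: list[str]) -> str:
    """Combine selected persona and retrieved knowledge into one prompt."""
    chosen = select_persona(query, persona)
    knowledge = retrieve_knowledge(query, passages)
    return (
        "Persona: " + " ".join(chosen) + "\n"
        "Knowledge: " + knowledge + "\n"
        "User: " + query + "\n"
        "Assistant:"
    )

if __name__ == "__main__":
    persona = ["I love hiking in the mountains.", "I am vegetarian.", "I live in Spain."]
    passages = [
        "The Eiffel Tower is a wrought-iron lattice tower in Paris, built in 1889.",
        "Mount Kilimanjaro is the highest mountain in Africa.",
    ]
    print(build_prompt("Tell me about the Eiffel Tower in Paris.", persona, passages))
```

In the paper's setting, the selected persona and retrieved knowledge would condition a trained generator (with reward modulation guiding persona use), rather than a fixed prompt template as shown here.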