A Hybrid Framework for Session Context Modeling

Jia Chen, Jiaxin Mao, Yiqun Liu, Ziyi Ye, Weizhi Ma, Chao Wang, Min Zhang, Shaoping Ma
{"title":"A Hybrid Framework for Session Context Modeling","authors":"Chenjia, Jiaxin Mao, LiuYiqun, YeZiyi, MaWeizhi, wangchao, Zhangmin, MaShaoping","doi":"10.1145/3448127","DOIUrl":null,"url":null,"abstract":"Understanding user intent is essential for various retrieval tasks. By leveraging contextual information within sessions, e.g., query history and user click behaviors, search systems can capture user intent more accurately and thus perform better. However, most existing systems only consider intra-session contexts and may suffer from the problem of lacking contextual information, because short search sessions account for a large proportion in practical scenarios. We believe that in these scenarios, considering more contexts, e.g., cross-session dependencies, may help alleviate the problem and contribute to better performance. Therefore, we propose a novel Hybrid framework for Session Context Modeling (HSCM), which realizes session-level multi-task learning based on the self-attention mechanism. To alleviate the problem of lacking contextual information within current sessions, HSCM exploits the cross-session contexts by sampling user interactions under similar search intents in the historical sessions and further aggregating them into the local contexts. Besides, application of the self-attention mechanism rather than RNN-based frameworks in modeling session-level sequences also helps (1) better capture interactions within sessions, (2) represent the session contexts in parallelization. Experimental results on two practical search datasets show that HSCM not only outperforms strong baseline solutions such as HiNT, CARS, and BERTserini in document ranking, but also performs significantly better than most existing query suggestion methods. According to the results in an additional experiment, we have also found that HSCM is superior to most ranking models in click prediction.","PeriodicalId":6934,"journal":{"name":"ACM Transactions on Information Systems (TOIS)","volume":"30 1","pages":"1 - 35"},"PeriodicalIF":0.0000,"publicationDate":"2021-05-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACM Transactions on Information Systems (TOIS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3448127","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 6

Abstract

Understanding user intent is essential for various retrieval tasks. By leveraging contextual information within sessions, e.g., query history and user click behaviors, search systems can capture user intent more accurately and thus perform better. However, most existing systems consider only intra-session contexts and may suffer from a lack of contextual information, because short search sessions account for a large proportion of traffic in practical scenarios. We believe that in these scenarios, considering additional contexts, e.g., cross-session dependencies, may help alleviate the problem and contribute to better performance. Therefore, we propose a novel Hybrid framework for Session Context Modeling (HSCM), which realizes session-level multi-task learning based on the self-attention mechanism. To alleviate the lack of contextual information within current sessions, HSCM exploits cross-session contexts by sampling user interactions issued under similar search intents in historical sessions and aggregating them into the local contexts. Moreover, applying the self-attention mechanism rather than RNN-based frameworks to model session-level sequences also helps (1) better capture interactions within sessions and (2) represent session contexts in parallel. Experimental results on two practical search datasets show that HSCM not only outperforms strong baselines such as HiNT, CARS, and BERTserini in document ranking, but also performs significantly better than most existing query suggestion methods. In an additional experiment, we also find that HSCM is superior to most ranking models in click prediction.
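The abstract combines two ideas: aggregating interactions sampled from historical sessions with similar search intents into the current session's context, and encoding the session sequence with self-attention rather than an RNN. The following minimal PyTorch sketch illustrates how such a design could look; the module names, dimensions, attention-pooling aggregation, and context-token scheme are illustrative assumptions, not the paper's actual HSCM implementation.

```python
# Illustrative sketch of the two ideas described in the abstract:
# (1) aggregating sampled cross-session interactions into the local context,
# (2) encoding the session sequence with self-attention instead of an RNN.
# All names, dimensions, and the aggregation scheme are assumptions made for
# illustration; this is NOT the authors' HSCM implementation.
import torch
import torch.nn as nn


class SessionContextEncoder(nn.Module):
    """Self-attention encoder over a session's behavior sequence."""

    def __init__(self, d_model: int = 128, n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        # Attention pooling that merges sampled historical interactions
        # into a single cross-session context vector (hypothetical design).
        self.pool = nn.Linear(d_model, 1)

    def aggregate_history(self, history: torch.Tensor) -> torch.Tensor:
        # history: (batch, n_sampled, d_model) -- embeddings of interactions
        # sampled from past sessions with similar search intents.
        weights = torch.softmax(self.pool(history).squeeze(-1), dim=-1)
        return torch.einsum("bn,bnd->bd", weights, history)

    def forward(self, session: torch.Tensor, history: torch.Tensor) -> torch.Tensor:
        # session: (batch, seq_len, d_model) -- current-session queries/clicks.
        cross_ctx = self.aggregate_history(history)  # (batch, d_model)
        # Prepend the cross-session context as an extra token so that
        # self-attention can mix it with the local context in one pass.
        x = torch.cat([cross_ctx.unsqueeze(1), session], dim=1)
        return self.encoder(x)  # (batch, 1 + seq_len, d_model)


if __name__ == "__main__":
    enc = SessionContextEncoder()
    session = torch.randn(2, 5, 128)   # 2 sessions, 5 behaviors each
    history = torch.randn(2, 8, 128)   # 8 sampled historical interactions
    print(enc(session, history).shape)  # torch.Size([2, 6, 128])
```

Prepending the aggregated cross-session vector as an extra token lets self-attention mix historical and local evidence in a single parallel pass over the sequence, which is the parallelization advantage over sequential RNN-style encoders that the abstract points to.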