Multi-interest transfer using contrastive learning for cross-domain recommendation

IF 6.7 · CAS Rank 1 (Computer Science) · JCR Q1, Computer Science, Artificial Intelligence
Yu-Lin Lai, Szu-Hao Huang, Chiao-Ting Chen, Cheng-Jhang Wu
Journal: Decision Support Systems, Volume 195, Article 114473
DOI: 10.1016/j.dss.2025.114473
Published: 2025-05-23
URL: https://www.sciencedirect.com/science/article/pii/S0167923625000740
Citations: 0

Abstract

With advancements in information technology, deep learning techniques have been widely applied to recommendation systems, substantially assisting businesses and users in making better decisions. However, recommendation systems still face intractable limitations such as the cold-start problem and data sparsity. Cross-domain recommendation addresses these problems by drawing on domains with richer data. Existing models usually apply domain- or user-level transfer to exchange information between domains. Domain-level transfer passes information directly through a straightforward transformation without filtering, whereas user-level transfer uses trainable parameters to control the ratio at which user embeddings from the two domains are mixed. The former is insufficiently precise for every user, and the latter suffers from generalization issues; both therefore ameliorate the cold-start problem but introduce a new one: negative transfer. We thus propose an interest-level transfer, called multi-interest transfer, that extracts multiple interests more precisely and transfers only those related to the target items. Because modeling interest correlations across domains is difficult, we devise three self-supervised learning tasks to capture these correlations and extract discriminative information. Experimental results show that the model outperforms other state-of-the-art methods by about 7% to 10%. Through multi-interest and contrastive learning techniques, our approach models the decision-making process in cross-domain recommendation more effectively.
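The abstract describes two core ingredients: pooling a user's behavior into multiple interest vectors, and using contrastive (self-supervised) objectives to align related interests across domains. The sketch below is illustrative only and is not the authors' model; it pools item embeddings into K interest vectors via simple softmax attention and aligns corresponding interests across two domains with an InfoNCE-style loss. All function names, shapes, and the choice of attention pooling are assumptions made for the example.

```python
import numpy as np

def extract_interests(item_embs, interest_queries):
    """Pool a user's item embeddings into K interest vectors using
    softmax attention (one attention distribution per interest query).
    item_embs: (n_items, d); interest_queries: (K, d); returns (K, d)."""
    scores = interest_queries @ item_embs.T           # (K, n_items)
    scores -= scores.max(axis=1, keepdims=True)       # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)     # softmax over items
    return weights @ item_embs                        # weighted interest vectors

def info_nce(source_interests, target_interests, tau=0.1):
    """InfoNCE-style contrastive loss: the k-th source-domain interest is
    the positive for the k-th target-domain interest; all other pairs
    act as in-batch negatives."""
    a = source_interests / np.linalg.norm(source_interests, axis=1, keepdims=True)
    b = target_interests / np.linalg.norm(target_interests, axis=1, keepdims=True)
    logits = (a @ b.T) / tau                          # (K, K) scaled cosine similarities
    logits -= logits.max(axis=1, keepdims=True)       # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))               # positives lie on the diagonal
```

Minimizing this loss pulls each interest toward its counterpart in the other domain while pushing it away from unrelated interests, which is the general mechanism by which contrastive objectives can extract discriminative, transferable representations; the paper's actual three self-supervised tasks are not reproduced here.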
Journal metrics: Decision Support Systems (Engineering/Technology – Computer Science: Artificial Intelligence)
CiteScore: 14.70
Self-citation rate: 6.70%
Articles per year: 119
Review time: 13 months
Journal description: The common thread of articles published in Decision Support Systems is their relevance to theoretical and technical issues in the support of enhanced decision making. The areas addressed may include foundations, functionality, interfaces, implementation, impacts, and evaluation of decision support systems (DSSs).