{"title":"Multi-interest transfer using contrastive learning for cross-domain recommendation","authors":"Yu-Lin Lai , Szu-Hao Huang , Chiao-Ting Chen , Cheng-Jhang Wu","doi":"10.1016/j.dss.2025.114473","DOIUrl":null,"url":null,"abstract":"<div><div>With advancements in information technology, deep learning techniques have been widely applied to recommendation systems, substantially assisting businesses and users in making better decisions. However, it still faces some intractable limitations, such as the cold-start problem and data sparsity. Hence, cross-domain recommendations are proposed to address these problems by referring to the domains with richer data. Existing models usually apply domain- or user-level transferal to exchange information between domains. For domain-level transferals, information is transferred directly using a straightforward transformation without filtering. In contrast, user-level transferal sets trainable parameters to control the ratio of user embedding from two domains. The former is insufficiently precise for every user, and the latter encounters generalization issues. For these reasons, these methods ameliorate the cold-start problem but create a new problem: negative transfer. Thus, we propose an interest-level transferal called multi-interest transferal to more precisely extract multiple interests and transfer related ones according to the target items. Nevertheless, it is not easy to model interest correlations of different domains. We, therefore, devise three self-supervised learning tasks to model the correlations and extract discriminant information. The experimental results reveal that this model outperforms other state-of-the-art methods by about 7% to 10%. 
Through multi-interest and contrastive learning techniques, our approach can model the decision-making process more effectively in cross-domain recommendation.</div></div>","PeriodicalId":55181,"journal":{"name":"Decision Support Systems","volume":"195 ","pages":"Article 114473"},"PeriodicalIF":6.7000,"publicationDate":"2025-05-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Decision Support Systems","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0167923625000740","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
With advancements in information technology, deep learning techniques have been widely applied to recommendation systems, substantially assisting businesses and users in making better decisions. However, recommendation systems still face intractable limitations such as the cold-start problem and data sparsity. Hence, cross-domain recommendation has been proposed to address these problems by drawing on domains with richer data. Existing models usually apply domain- or user-level transferal to exchange information between domains. In domain-level transferal, information is transferred directly through a straightforward transformation without filtering; in contrast, user-level transferal sets trainable parameters to control the ratio of user embeddings from the two domains. The former is insufficiently precise for every user, and the latter encounters generalization issues. Consequently, these methods ameliorate the cold-start problem but create a new one: negative transfer. We therefore propose an interest-level transferal, called multi-interest transferal, to extract multiple interests more precisely and transfer the related ones according to the target items. However, modeling interest correlations across different domains is not easy, so we devise three self-supervised learning tasks to model these correlations and extract discriminant information. Experimental results reveal that our model outperforms other state-of-the-art methods by about 7% to 10%. Through multi-interest and contrastive learning techniques, our approach models the decision-making process in cross-domain recommendation more effectively.
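The abstract does not specify the exact form of the three self-supervised tasks, but contrastive objectives for aligning embeddings across domains are commonly built on an InfoNCE-style loss, where each anchor embedding is pulled toward its paired "positive" view and pushed away from other items in the batch. The sketch below is a generic, hypothetical illustration of that idea in NumPy; the function name, temperature value, and pairing scheme are assumptions for demonstration, not the paper's formulation.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """Generic InfoNCE contrastive loss over a batch.

    Each anchor's positive is the same-index row of `positives`;
    all other rows in the batch serve as in-batch negatives.
    This is an illustrative sketch, not the paper's exact objective.
    """
    # L2-normalize so dot products become cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature               # (B, B) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Diagonal entries correspond to the anchor-positive pairs
    return -np.mean(np.diag(log_prob))

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))           # 8 hypothetical interest embeddings
aligned = info_nce_loss(z, z)          # correctly paired views
shuffled = info_nce_loss(z, z[::-1].copy())  # mismatched pairing
print(aligned < shuffled)              # correct pairing yields lower loss
```

Under this kind of objective, embeddings of related interests from the two domains are trained to score high similarity relative to unrelated ones, which is one way to extract the discriminant information the abstract refers to.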
Journal introduction:
The common thread of articles published in Decision Support Systems is their relevance to theoretical and technical issues in the support of enhanced decision making. The areas addressed may include foundations, functionality, interfaces, implementation, impacts, and evaluation of decision support systems (DSSs).