Review-Based Hierarchical Attention Model Trained with Random Back-Transfer for Cross-Domain Recommendation

Kuan Feng, Yanmin Zhu
{"title":"Review-Based Hierarchical Attention Model Trained with Random Back-Transfer for Cross-Domain Recommendation","authors":"Kuan Feng, Yanmin Zhu","doi":"10.1109/ICPADS53394.2021.00090","DOIUrl":null,"url":null,"abstract":"Cross-domain recommendation aims to leverage the rich interaction information in the source domain to predict interactions between cold-start users and items in the target domain. Since reviews contain users' preferences and items' attributes, many review-based cross-domain recommendation methods are proposed. However, existing methods cannot either 1) select important words and reviews from multiple reviews of users/items, or 2) learn a unified representation space for different domains without enough overlapping users. To address these problems, we propose a Hierarchical Attention model trained with Random Back-Transfer for cross-domain recommendation (HARBT). Specifically, the hierarchical attention extracts text information related to a given user or item which leads to an accurate interaction prediction. The random back-transfer works as a data augmentation algorithm to utilize data of users and items which are in the same domain for better matching of representations in different domains. Extensive experiments on real-world datasets show that our approach outperforms state-of-the-art methods significantly.","PeriodicalId":309508,"journal":{"name":"2021 IEEE 27th International Conference on Parallel and Distributed Systems (ICPADS)","volume":"16 7 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE 27th International Conference on Parallel and Distributed Systems (ICPADS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICPADS53394.2021.00090","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Cross-domain recommendation aims to leverage the rich interaction information in the source domain to predict interactions between cold-start users and items in the target domain. Since reviews contain users' preferences and items' attributes, many review-based cross-domain recommendation methods have been proposed. However, existing methods fail to either 1) select important words and reviews from the multiple reviews of users/items, or 2) learn a unified representation space for different domains when there are not enough overlapping users. To address these problems, we propose a Hierarchical Attention model trained with Random Back-Transfer for cross-domain recommendation (HARBT). Specifically, the hierarchical attention extracts the text information relevant to a given user or item, which leads to accurate interaction prediction. The random back-transfer acts as a data augmentation algorithm that exploits data of users and items within the same domain to better align representations across domains. Extensive experiments on real-world datasets show that our approach significantly outperforms state-of-the-art methods.
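The abstract describes two mechanisms: a word-level/review-level attention hierarchy and a random back-transfer augmentation. Below is a minimal PyTorch sketch of both ideas; all module names, tensor shapes, and the transfer probability are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HierarchicalAttention(nn.Module):
    """Attend over words within each review, then over the reviews
    themselves, to produce one representation for a user (or item)."""
    def __init__(self, embed_dim: int, hidden_dim: int):
        super().__init__()
        self.word_proj = nn.Linear(embed_dim, hidden_dim)
        self.word_query = nn.Parameter(torch.randn(hidden_dim))    # word-level context
        self.review_proj = nn.Linear(embed_dim, hidden_dim)
        self.review_query = nn.Parameter(torch.randn(hidden_dim))  # review-level context

    def forward(self, reviews: torch.Tensor) -> torch.Tensor:
        # reviews: (num_reviews, num_words, embed_dim) word embeddings
        w = torch.tanh(self.word_proj(reviews))                    # (R, W, H)
        word_alpha = F.softmax(w @ self.word_query, dim=1)         # (R, W)
        review_vecs = (word_alpha.unsqueeze(-1) * reviews).sum(1)  # (R, E)
        r = torch.tanh(self.review_proj(review_vecs))              # (R, H)
        review_alpha = F.softmax(r @ self.review_query, dim=0)     # (R,)
        return (review_alpha.unsqueeze(-1) * review_vecs).sum(0)   # (E,)

def random_back_transfer(vec: torch.Tensor,
                         to_target: nn.Module,
                         to_source: nn.Module,
                         p: float = 0.5) -> torch.Tensor:
    # Data-augmentation step (assumed form): with probability p, route a
    # source-domain representation through the target domain's space and
    # back before it is used for a same-domain prediction, so the mapping
    # functions also receive training signal from non-overlapping users.
    if torch.rand(()).item() < p:
        return to_source(to_target(vec))
    return vec
```

The intuition behind the augmentation, as the abstract frames it: same-domain interactions are plentiful, so routing them through the cross-domain mappings supplies extra training signal for aligning the two representation spaces even when overlapping users are scarce.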