A Study of Bidirectional Encoder Representations from Transformers for Sequential Recommendations

Amine Kheldouni, J. Boumhidi
DOI: 10.1109/ISCV54655.2022.9806062
Published in: 2022 International Conference on Intelligent Systems and Computer Vision (ISCV)
Publication date: 2022-05-18
Citations: 0

Abstract

Sequential recommender systems seek to capture user affinities and behaviors from the ordered series of a user's interactions. While earlier models based on Markov Chains and Recurrent Neural Networks predicted future interactions from only the last interaction, sequential recommendation aspires to express more than that. In this paper, we detail BERT4Rec, a sequential recommendation approach based on the bidirectional encoder of the self-attention Transformer. Inspired by recent advances in NLP, the model learns to predict masked items in a user's sequence using the Cloze objective: it recovers randomly masked items by attending to context on both the left and the right, capturing short-term and long-term dependencies. We apply this sequential model to new datasets built from Amazon reviews with differing characteristics. Our main contribution thus consists of testing this approach and its limits on several recommendation datasets.
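The Cloze objective described above can be illustrated with a minimal, framework-free sketch. This is not the paper's implementation; the `MASK` token id, masking probability, and function name are illustrative assumptions, and a real BERT4Rec pipeline would feed the masked sequence to a bidirectional Transformer encoder and train it to predict the held-out items.

```python
import random

MASK = 0  # hypothetical id reserved for the [mask] token (real items use ids >= 1)

def cloze_mask(sequence, mask_prob=0.2, rng=None):
    """Randomly replace items in a user's interaction sequence with MASK.

    Returns (masked_seq, labels), where labels[i] holds the original item id
    at masked positions (the training targets) and None elsewhere.
    """
    rng = rng or random.Random()
    masked, labels = [], []
    for item in sequence:
        if rng.random() < mask_prob:
            masked.append(MASK)   # hide the item; the model must recover it
            labels.append(item)   # supervised target for this position
        else:
            masked.append(item)   # visible context, no loss computed here
            labels.append(None)
    return masked, labels
```

Because the masks are sampled anew at each training step, the model sees many different Cloze tasks per sequence; at inference time, a single mask appended at the end of the sequence turns the same objective into next-item prediction.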