Item based recommendation using matrix-factorization-like embeddings from deep networks

Vaidyanath Areyur Shanthakumar, Clark Barnett, Keith Warnick, P. A. Sudyanti, Vitalii Gerbuz, Tathagata Mukherjee
Proceedings of the 2021 ACM Southeast Conference · April 15, 2021
DOI: 10.1145/3409334.3452041
Cited by: 1

Abstract

In this paper we describe a method for computing item-based recommendations using matrix-factorization-like embeddings of items computed with a neural network. Matrix factorization (MF) computes near-optimal item embeddings by minimizing a loss that measures the discrepancy between the predicted and known values of a sparse user-item rating matrix. Though useful for recommendation tasks, MF is computationally intensive and hard to compute for large sets of users and items. Hence there is a need to compute MF-like embeddings using less computationally intensive methods that can be substituted for the actual ones. In this work we explore the possibility of doing so with a deep neural network (DNN). Our network is trained to learn matrix-factorization-like embeddings from easy-to-compute natural language processing (NLP) based semantic embeddings. The resulting MF-like embeddings are used to compute recommendations on an anonymized user-product engagement dataset from the online retail company Overstock.com. We present the results of using our embeddings to compute recommendations on the Overstock.com production dataset, which consists of ~3.5 million items and ~6 million users. Recommendations from Overstock.com's own recommendation system are compared against those obtained with our MF-like embeddings by comparing the results from both to the ground truth, which in our case is actual user co-click data. Our results show that it is possible to use DNNs to efficiently compute MF-like embeddings, which can then be used in conjunction with the NLP-based embeddings to improve the recommendations obtained from the NLP-based embeddings alone.
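The pipeline the abstract describes can be illustrated with a minimal sketch in plain NumPy. Everything below is a toy stand-in, not the authors' implementation: the rating matrix, the "NLP" item features, the network size, and the learning rates are all assumptions chosen for a runnable example. It shows the three stages in order: (1) factorize a sparse rating matrix by minimizing squared error on observed entries, (2) train a small network to regress NLP-style item features onto the learned MF item embeddings, and (3) compute item-item similarities from the predicted MF-like embeddings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sparse user-item rating matrix; 0 marks an unobserved entry.
# (Stand-in for the real Overstock.com engagement data.)
R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 0, 5, 4]], dtype=float)
mask = R > 0
n_users, n_items = R.shape
k = 2  # embedding dimension

# --- Stage 1: matrix factorization by full-batch gradient descent, ---
# --- minimizing squared error on the observed entries only.        ---
U = 0.1 * rng.standard_normal((n_users, k))
V = 0.1 * rng.standard_normal((n_items, k))
lr, reg = 0.01, 0.01
for _ in range(2000):
    E = mask * (U @ V.T - R)          # residual on observed entries
    U -= lr * (E @ V + reg * U)
    V -= lr * (E.T @ U + reg * V)

# --- Stage 2: train a small network to map NLP-style item features ---
# --- to the MF item embeddings V (hypothetical random features).   ---
X = rng.standard_normal((n_items, 8))
W1 = 0.1 * rng.standard_normal((8, 16)); b1 = np.zeros(16)
W2 = 0.1 * rng.standard_normal((16, k)); b2 = np.zeros(k)
for _ in range(3000):
    H = np.maximum(X @ W1 + b1, 0.0)  # ReLU hidden layer
    Y = H @ W2 + b2                   # predicted MF-like embeddings
    G = (Y - V) / n_items             # gradient of mean squared error
    gW2 = H.T @ G; gb2 = G.sum(0)
    GH = (G @ W2.T) * (H > 0)
    gW1 = X.T @ GH; gb1 = GH.sum(0)
    for p, g in ((W2, gW2), (b2, gb2), (W1, gW1), (b1, gb1)):
        p -= 0.5 * g                  # in-place gradient step

# --- Stage 3: item-based recommendation from the predicted         ---
# --- MF-like embeddings, ranking items by cosine similarity.       ---
Vp = np.maximum(X @ W1 + b1, 0.0) @ W2 + b2
sims = Vp @ Vp.T / (np.linalg.norm(Vp, axis=1, keepdims=True)
                    * np.linalg.norm(Vp, axis=1) + 1e-9)
```

At production scale the point of stage 2 is that, once trained, the network replaces the expensive factorization: new or updated items need only their cheap NLP embedding pushed through the network to obtain an MF-like embedding for similarity lookups.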