Incremental Matrix Factorization via Feature Space Re-learning for Recommender System

Qiang Song, Jian Cheng, Hanqing Lu
{"title":"基于特征空间再学习的增量矩阵分解推荐系统","authors":"Qiang Song, Jian Cheng, Hanqing Lu","doi":"10.1145/2792838.2799668","DOIUrl":null,"url":null,"abstract":"Matrix factorization is widely used in Recommender Systems. Although existing popular incremental matrix factorization methods are effectively in reducing time complexity, they simply assume that the similarity between items or users is invariant. For instance, they keep the item feature matrix unchanged and just update the user matrix without re-training the entire model. However, with the new users growing continuously, the fitting error would be accumulated since the extra distribution information of items has not been utilized. In this paper, we present an alternative and reasonable approach, with a relaxed assumption that the similarity between items (users) is relatively stable after updating. Concretely, utilizing the prediction error of the new data as the auxiliary features, our method updates both feature matrices simultaneously, and thus users' preference can be better modeled than merely adjusting one corresponded feature matrix. Besides, our method maintains the feature dimension in a smaller size through taking advantage of matrix sketching. Experimental results show that our proposal outperforms the existing incremental matrix factorization methods.","PeriodicalId":325637,"journal":{"name":"Proceedings of the 9th ACM Conference on Recommender Systems","volume":"1981 2","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-09-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"20","resultStr":"{\"title\":\"Incremental Matrix Factorization via Feature Space Re-learning for Recommender System\",\"authors\":\"Qiang Song, Jian Cheng, Hanqing Lu\",\"doi\":\"10.1145/2792838.2799668\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Matrix factorization is widely used in Recommender Systems. Although existing popular incremental matrix factorization methods are effectively in reducing time complexity, they simply assume that the similarity between items or users is invariant. For instance, they keep the item feature matrix unchanged and just update the user matrix without re-training the entire model. However, with the new users growing continuously, the fitting error would be accumulated since the extra distribution information of items has not been utilized. In this paper, we present an alternative and reasonable approach, with a relaxed assumption that the similarity between items (users) is relatively stable after updating. Concretely, utilizing the prediction error of the new data as the auxiliary features, our method updates both feature matrices simultaneously, and thus users' preference can be better modeled than merely adjusting one corresponded feature matrix. Besides, our method maintains the feature dimension in a smaller size through taking advantage of matrix sketching. 
Experimental results show that our proposal outperforms the existing incremental matrix factorization methods.\",\"PeriodicalId\":325637,\"journal\":{\"name\":\"Proceedings of the 9th ACM Conference on Recommender Systems\",\"volume\":\"1981 2\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2015-09-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"20\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 9th ACM Conference on Recommender Systems\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/2792838.2799668\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 9th ACM Conference on Recommender Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2792838.2799668","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 20

Abstract

Matrix factorization is widely used in recommender systems. Although existing popular incremental matrix factorization methods are effective in reducing time complexity, they simply assume that the similarity between items or users is invariant. For instance, they keep the item feature matrix unchanged and update only the user matrix, without re-training the entire model. However, as new users keep arriving, the fitting error accumulates because the additional distribution information of the items is never exploited. In this paper, we present an alternative and reasonable approach under the relaxed assumption that the similarity between items (users) is relatively stable after updating. Concretely, using the prediction error on the new data as auxiliary features, our method updates both feature matrices simultaneously, so users' preferences can be modeled better than by merely adjusting the single corresponding feature matrix. In addition, our method keeps the feature dimension small by taking advantage of matrix sketching. Experimental results show that our proposal outperforms the existing incremental matrix factorization methods.
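The abstract contrasts the common incremental strategy (freeze the item matrix and fold new users in) with re-learning both feature matrices from the prediction error on the new data. The snippet below is a minimal NumPy sketch of that contrast under our own assumptions, not the authors' exact algorithm: `fold_in_update` solves a ridge regression for each new user with the item matrix V frozen, while `joint_update` runs a few SGD passes that adjust both the new users' vectors and V. Function names, hyper-parameters, and the SGD formulation are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def fold_in_update(V, new_ratings, reg=0.1):
    """Baseline incremental update: keep the item matrix V fixed and solve a
    ridge-regression problem for each new user's latent vector (fold-in)."""
    k = V.shape[1]
    U_new = []
    for item_ids, ratings in new_ratings:
        V_sub = V[item_ids]                      # (n_rated, k) item factors
        A = V_sub.T @ V_sub + reg * np.eye(k)    # normal equations
        b = V_sub.T @ ratings
        U_new.append(np.linalg.solve(A, b))
    return np.vstack(U_new)                      # item matrix V is untouched

def joint_update(V, new_ratings, lr=0.01, reg=0.1, epochs=20):
    """Sketch of the re-learning idea: a few SGD passes over the new ratings
    that update BOTH the new users' vectors and the item matrix, driven by
    the prediction error on the new data."""
    k = V.shape[1]
    U_new = rng.normal(scale=0.1, size=(len(new_ratings), k))
    V = V.copy()
    for _ in range(epochs):
        for u, (item_ids, ratings) in enumerate(new_ratings):
            for i, r in zip(item_ids, ratings):
                err = r - U_new[u] @ V[i]        # prediction error on new data
                U_new[u] += lr * (err * V[i] - reg * U_new[u])
                V[i] += lr * (err * U_new[u] - reg * V[i])
    return U_new, V

# Toy usage: 100 items with 8-dimensional factors, two new users.
V = rng.normal(scale=0.1, size=(100, 8))
new_ratings = [
    (np.array([3, 17, 42]), np.array([5.0, 3.0, 4.0])),
    (np.array([8, 42, 99]), np.array([2.0, 4.0, 1.0])),
]
U_fold = fold_in_update(V, new_ratings)
U_joint, V_updated = joint_update(V, new_ratings)
print(U_fold.shape, U_joint.shape, V_updated.shape)
```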
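The abstract also mentions matrix sketching as the tool for keeping the feature dimension small, but does not say which sketching algorithm is used. As a purely illustrative example of the technique, here is a minimal NumPy implementation of Frequent Directions (Liberty, 2013), a standard row-streaming sketch; the function name and toy sizes are ours, and it assumes the number of columns exceeds twice the sketch size.

```python
import numpy as np

def frequent_directions(X, ell):
    """Frequent Directions (Liberty, 2013): stream the rows of X into a small
    buffer and periodically shrink it with an SVD, so that the returned
    ell-row sketch B satisfies B.T @ B ~= X.T @ X. Assumes X.shape[1] >= 2*ell."""
    n, d = X.shape
    B = np.zeros((2 * ell, d))                    # working buffer of 2*ell rows
    next_zero = 0
    for row in X:
        if next_zero == 2 * ell:                  # buffer full: shrink it
            _, s, Vt = np.linalg.svd(B, full_matrices=False)
            delta = s[ell] ** 2                   # squared (ell+1)-th singular value
            s = np.sqrt(np.maximum(s**2 - delta, 0.0))
            B = s[:, None] * Vt                   # rows ell..2*ell-1 are now zero
            next_zero = ell
        B[next_zero] = row
        next_zero += 1
    _, s, Vt = np.linalg.svd(B, full_matrices=False)
    delta = s[ell] ** 2
    s = np.sqrt(np.maximum(s**2 - delta, 0.0))
    return (s[:, None] * Vt)[:ell]                # final ell-row sketch

# Toy usage: compress 1000 rows of 64-dimensional features into 16 directions.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 64))
B = frequent_directions(X, ell=16)
print(B.shape)                                    # (16, 64)
print(np.linalg.norm(X.T @ X - B.T @ B, 2))       # spectral-norm sketch error
```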