Accelerating recommender system training 15x with RAPIDS

Sara Rabhi, Wenbo Sun, Julio Perez, M. R. B. Kristensen, Jiwei Liu, Even Oldridge
{"title":"Accelerating recommender system training 15x with RAPIDS","authors":"Sara Rabhi, Wenbo Sun, Julio Perez, M. R. B. Kristensen, Jiwei Liu, Even Oldridge","doi":"10.1145/3359555.3359564","DOIUrl":null,"url":null,"abstract":"In this paper we present the novel aspects of our 15th place solution to the RecSys Challenge 2019 which are focused on the acceleration of feature generation and model training time. In our final solution we sped up training of our model by a factor of 15.6x, from a workflow of 891.8s (14m52s) to 57.2s, through a combination of the RAPIDS.AI cuDF library for preprocessing, a custom batch dataloader, LAMB and extreme batch sizes, and an update to the kernel responsible for calculating the embedding gradient in PyTorch. Using cuDF we also accelerated our feature generation by a factor of 9.7x by performing the computations on the GPU, reducing the time taken to generate the features used in our model from 51 minutes to 5. We demonstrate these optimizations on the fastai tabular model which we relied on extensively in our final ensemble. With training time so drastically reduced the iteration involved in generating new features and training new models is much more fluid, allowing for the rapid prototyping of deep learning based recommender systems in hours as opposed to days.","PeriodicalId":255213,"journal":{"name":"Proceedings of the Workshop on ACM Recommender Systems Challenge","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-09-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the Workshop on ACM Recommender Systems Challenge","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3359555.3359564","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 5

Abstract

In this paper we present the novel aspects of our 15th-place solution to the RecSys Challenge 2019, focused on accelerating feature generation and model training. In our final solution we sped up training of our model by a factor of 15.6x, from a workflow of 891.8s (14m 52s) to 57.2s, through a combination of the RAPIDS.AI cuDF library for preprocessing, a custom batch dataloader, the LAMB optimizer with extreme batch sizes, and an update to the kernel responsible for calculating the embedding gradient in PyTorch. Using cuDF we also accelerated our feature generation by a factor of 9.7x by performing the computations on the GPU, reducing the time taken to generate the features used in our model from 51 minutes to 5. We demonstrate these optimizations on the fastai tabular model, which we relied on extensively in our final ensemble. With training time so drastically reduced, the iteration involved in generating new features and training new models is much more fluid, allowing deep learning based recommender systems to be prototyped in hours rather than days.
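The preprocessing speedup comes from moving pandas-style feature generation onto the GPU with cuDF. The sketch below is an illustration only, not the authors' pipeline: it loads a raw interaction log into GPU memory, computes an aggregate feature with the pandas-like cuDF API, and joins it back onto the table. The file name and column names ("train.csv", "session_id", "item_id", "item_click_count") are hypothetical placeholders.

```python
# Minimal sketch of GPU feature generation with cuDF (RAPIDS.AI).
# Not the authors' code; file and column names are hypothetical.
import cudf

# Read the raw interaction log straight into GPU memory.
interactions = cudf.read_csv("train.csv")

# An item-popularity count feature, computed entirely on the GPU.
# The cuDF API mirrors pandas, so existing preprocessing code usually
# ports with few changes.
item_popularity = (
    interactions.groupby("item_id")
    .agg({"session_id": "count"})
    .rename(columns={"session_id": "item_click_count"})
    .reset_index()
)

# Join the aggregated feature back onto the interaction table.
interactions = interactions.merge(item_popularity, on="item_id", how="left")

# The enriched frame can then be handed to the training dataloader, where
# the remaining speedups in the paper (custom batch dataloader, LAMB with
# extreme batch sizes, embedding-gradient kernel update) apply.
```

With both feature generation and training on the GPU, iterating on a new feature becomes a minutes-long loop rather than an hours-long one, which is the workflow change the abstract emphasizes.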