An Entire Space Multi-gate Mixture-of-Experts Model for Recommender Systems

Zheng Ye, Jun Ge
{"title":"An Entire Space Multi-gate Mixture-of-Experts Model for Recommender Systems","authors":"Zheng Ye, Jun Ge","doi":"10.1109/WI-IAT55865.2022.00047","DOIUrl":null,"url":null,"abstract":"With the development of e-commerce, both advertisers and platforms pay more and more attention to the effectiveness of ads recommendation. In recent years, deep learning approaches with a mulit-task learning framework have shown to be effective in such recommendation systems. One main goal of these systems is to estimate the post-click conversion rate(CVR) accurately. However, higher click-through rate(CTR) for a product does not always lead to higher conversion rate(CVR) due to many reasons (e.g. lower rating). In addition, the overall performance of the recommendation system may not be optimal, since the usage of multi-task models (the CTR and CVR tasks) is often sensitive to the relationships of the tasks. In this paper, we propose a deep neural model under the Mixture-of-Experts framework (MoE), call ES-MMOE, in which a sub-network is used to promote samples with high CVR. The model can also be trained with the entire space by taking advantage of the Entire Space Multi-task Model (ESMM) model. Extensive experiments on a large-scale dataset gathered from traffic logs of Taobao’s recommender system demonstrate that ES-MMOE outperforms a number of the state-of-the-art models, including ESMM, with a relatively large margin.","PeriodicalId":345445,"journal":{"name":"2022 IEEE/WIC/ACM International Joint Conference on Web Intelligence and Intelligent Agent Technology (WI-IAT)","volume":"24 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE/WIC/ACM International Joint Conference on Web Intelligence and Intelligent Agent Technology (WI-IAT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/WI-IAT55865.2022.00047","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

With the development of e-commerce, both advertisers and platforms pay increasing attention to the effectiveness of ad recommendation. In recent years, deep learning approaches built on a multi-task learning framework have been shown to be effective in such recommender systems. One main goal of these systems is to estimate the post-click conversion rate (CVR) accurately. However, a higher click-through rate (CTR) for a product does not always lead to a higher conversion rate (CVR), for many reasons (e.g., a lower rating). In addition, the overall performance of the recommender system may not be optimal, since multi-task models (here, the CTR and CVR tasks) are often sensitive to the relationships between the tasks. In this paper, we propose a deep neural model under the Mixture-of-Experts (MoE) framework, called ES-MMOE, in which a sub-network is used to promote samples with high CVR. The model can also be trained over the entire impression space by taking advantage of the Entire Space Multi-task Model (ESMM). Extensive experiments on a large-scale dataset gathered from traffic logs of Taobao's recommender system demonstrate that ES-MMOE outperforms a number of state-of-the-art models, including ESMM, by a relatively large margin.