A Study on Accuracy, Miscalibration, and Popularity Bias in Recommendations

Dominik Kowald, Gregory Mayr, M. Schedl, E. Lex
DOI: 10.48550/arXiv.2303.00400
Venue: International Workshop on Algorithmic Bias in Search and Recommendation
Published: 2023-03-01
Citations: 1

Abstract

Recent research has suggested different metrics to measure the inconsistency of recommendation performance, including the accuracy difference between user groups, miscalibration, and popularity lift. However, a study that relates miscalibration and popularity lift to recommendation accuracy across different user groups is still missing. Additionally, it is unclear if particular genres contribute to the emergence of inconsistency in recommendation performance across user groups. In this paper, we present an analysis of these three aspects of five well-known recommendation algorithms for user groups that differ in their preference for popular content. Additionally, we study how different genres affect the inconsistency of recommendation performance, and how this is aligned with the popularity of the genres. Using data from LastFm, MovieLens, and MyAnimeList, we present two key findings. First, we find that users with little interest in popular content receive the worst recommendation accuracy, and that this is aligned with miscalibration and popularity lift. Second, our experiments show that particular genres contribute to a different extent to the inconsistency of recommendation performance, especially in terms of miscalibration in the case of the MyAnimeList dataset.
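The two bias metrics named in the abstract can be made concrete. A minimal sketch, assuming the definitions common in the literature: popularity lift as the relative change in mean item popularity from a user's profile to their recommendation list, and miscalibration as the KL divergence between the genre distributions of profile and recommendations (Steck-style calibration). The function names and toy data below are illustrative, not taken from the paper.

```python
from math import log2

def popularity_lift(profile_pop, rec_pop):
    """Relative change in mean item popularity from a user's profile
    to their recommendation list; > 0 means popularity is amplified."""
    gap_profile = sum(profile_pop) / len(profile_pop)
    gap_rec = sum(rec_pop) / len(rec_pop)
    return (gap_rec - gap_profile) / gap_profile

def miscalibration(p, q, eps=1e-10):
    """KL divergence (in bits) between the genre distribution of the
    profile (p) and of the recommendations (q); 0 = fully calibrated."""
    # Smooth q toward p so the divergence stays finite when q has zeros.
    q = [(1 - eps) * qi + eps * pi for pi, qi in zip(p, q)]
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Toy user: recommended items are on average twice as popular as the
# profile items, and the majority genre is over-represented.
print(popularity_lift([0.10, 0.20, 0.30], [0.30, 0.40, 0.50]))  # ≈ 1.0
print(miscalibration([0.7, 0.3], [0.9, 0.1]))                   # ≈ 0.22 bits
```

Under these definitions, the paper's first finding corresponds to users with low-popularity profiles receiving both high popularity lift and high miscalibration alongside the worst accuracy.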