Multi-teacher knowledge distillation for debiasing recommendation with uniform data
Xinxin Yang, Xinwei Li, Zhen Liu, Yafan Yuan, Yannan Wang
Expert Systems with Applications, Volume 273, Article 126808
DOI: 10.1016/j.eswa.2025.126808
Published: 2025-02-14
URL: https://www.sciencedirect.com/science/article/pii/S0957417425004300
Citations: 0
Abstract
Recent studies have highlighted the bias problem in recommender systems, which affects the learning of users' true preferences. One significant cause of bias is that the training data is missing not at random (MNAR). While existing approaches have demonstrated the usefulness of uniform data that is missing at random (MAR) for debiasing, current models lack a comprehensive exploration of the unbiased features within uniform data. Considering the value and limited size of uniform data, this paper proposes a multi-teacher knowledge distillation framework (UKDRec) to extract and transfer more unbiased information from uniform data. The proposed framework consists of two components: a label-based teacher model that leverages supervision signals and a feature-based teacher model that facilitates the transfer of comprehensive unbiased features. To effectively extract unbiased features, we introduce a contrastive learning strategy that combines the uniform data with control data. The framework is trained using a multi-task learning approach, which enhances the transfer of unbiased knowledge. Extensive experiments conducted on real-world datasets demonstrate the superior debiasing performance of our approach compared to competitive baselines.
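The abstract describes a multi-task objective that combines ordinary supervision with knowledge transferred from two teachers: soft targets from a label-based teacher and representation matching against a feature-based teacher. The sketch below illustrates that general multi-teacher distillation pattern only; the function names, loss choices (binary cross-entropy for soft targets, mean squared error for feature matching), and the weights `alpha` and `beta` are illustrative assumptions, not the actual UKDRec formulation.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def bce(preds, targets, eps=1e-7):
    # Mean binary cross-entropy between probabilities and (soft or hard) targets.
    total = 0.0
    for p, t in zip(preds, targets):
        p = min(max(p, eps), 1 - eps)
        total += -(t * math.log(p) + (1 - t) * math.log(1 - p))
    return total / len(preds)

def mse(a, b):
    # Mean squared error between two feature vectors.
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def multi_teacher_distill_loss(student_logits, labels,
                               label_teacher_probs,
                               student_feats, teacher_feats,
                               alpha=0.5, beta=0.5):
    """Illustrative multi-task loss with two teachers:
    - hard: supervised loss on the (biased) training labels,
    - soft: distillation from a label-based teacher's predicted probabilities,
    - feat: matching the student's features to a feature-based teacher's."""
    probs = [sigmoid(z) for z in student_logits]
    hard = bce(probs, labels)
    soft = bce(probs, label_teacher_probs)
    feat = mse(student_feats, teacher_feats)
    return hard + alpha * soft + beta * feat
```

In this pattern the teachers would be trained on the small uniform (MAR) data, while the student trains on the large biased data; the distillation terms are what carry the unbiased knowledge across.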
About the journal
Expert Systems With Applications is an international journal dedicated to the exchange of information on expert and intelligent systems used globally in industry, government, and universities. The journal emphasizes original papers covering the design, development, testing, implementation, and management of these systems, offering practical guidelines. It spans various sectors such as finance, engineering, marketing, law, project management, information management, medicine, and more. The journal also welcomes papers on multi-agent systems, knowledge management, neural networks, knowledge discovery, data mining, and other related areas, excluding applications to military/defense systems.