{"title":"TBformer:多模态客户流失预测的多尺度时间行为关注变压器","authors":"Yushi Li;Yunfei Tao;Ming Zhu;Ziwen Chen;Zhenyu Wen;Bideng Zhu","doi":"10.1109/TCE.2025.3563905","DOIUrl":null,"url":null,"abstract":"In highly competitive market of Internet service platforms, identifying and retaining potential churners through customer churn prediction techniques is crucial for maintaining platform vitality. The sequences of interaction behaviors between customers and platforms are closely related to churn prediction results. However, existing methods focus only on capturing the temporal dependencies in dynamic behavior sequences while ignoring the correlations between different behaviors. Moreover, classical methods apply only to static data, while deep learning-based methods focus on dynamic data, neither leveraging the complementary information between static and dynamic data. To address these issues, we propose a multi-modal customer churn prediction model based on Transformer with multi-scale Time-Behavior attention, TBformer, which adaptively fuses static and dynamic data. Time-Behavior module can capture multi-scale temporal dependencies and behavioral correlations in behavioral time series across time and behavior dimensions. We perform behavior-independent multi-scale dynamic feature fusion through bidirectional connection paths. Furthermore, the multi-modal fusion module based on the attention mechanism adaptively controls the fusion weights of static and dynamic features to improve performance. Extensive experiments on two publicly available datasets, KKBox and KDD, and a private dataset, HOF, demonstrate that our TBformer achieves an average AUC of 91.2% (+2.47%), outperforming the state-of-the-art customer churn prediction methods.","PeriodicalId":13208,"journal":{"name":"IEEE Transactions on Consumer Electronics","volume":"71 2","pages":"3192-3203"},"PeriodicalIF":10.9000,"publicationDate":"2025-04-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"TBformer: Multi-Scale Transformer With Time-Behavior Attention for Multi-Modal Customer Churn Prediction\",\"authors\":\"Yushi Li;Yunfei Tao;Ming Zhu;Ziwen Chen;Zhenyu Wen;Bideng Zhu\",\"doi\":\"10.1109/TCE.2025.3563905\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In highly competitive market of Internet service platforms, identifying and retaining potential churners through customer churn prediction techniques is crucial for maintaining platform vitality. The sequences of interaction behaviors between customers and platforms are closely related to churn prediction results. However, existing methods focus only on capturing the temporal dependencies in dynamic behavior sequences while ignoring the correlations between different behaviors. Moreover, classical methods apply only to static data, while deep learning-based methods focus on dynamic data, neither leveraging the complementary information between static and dynamic data. To address these issues, we propose a multi-modal customer churn prediction model based on Transformer with multi-scale Time-Behavior attention, TBformer, which adaptively fuses static and dynamic data. Time-Behavior module can capture multi-scale temporal dependencies and behavioral correlations in behavioral time series across time and behavior dimensions. We perform behavior-independent multi-scale dynamic feature fusion through bidirectional connection paths. 
Furthermore, the multi-modal fusion module based on the attention mechanism adaptively controls the fusion weights of static and dynamic features to improve performance. Extensive experiments on two publicly available datasets, KKBox and KDD, and a private dataset, HOF, demonstrate that our TBformer achieves an average AUC of 91.2% (+2.47%), outperforming the state-of-the-art customer churn prediction methods.\",\"PeriodicalId\":13208,\"journal\":{\"name\":\"IEEE Transactions on Consumer Electronics\",\"volume\":\"71 2\",\"pages\":\"3192-3203\"},\"PeriodicalIF\":10.9000,\"publicationDate\":\"2025-04-24\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Consumer Electronics\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10976249/\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Consumer Electronics","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10976249/","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
TBformer: Multi-Scale Transformer With Time-Behavior Attention for Multi-Modal Customer Churn Prediction
Abstract:
In the highly competitive market of Internet service platforms, identifying and retaining potential churners through customer churn prediction techniques is crucial for maintaining platform vitality. The sequences of interaction behaviors between customers and platforms are closely related to churn prediction results. However, existing methods focus only on capturing the temporal dependencies in dynamic behavior sequences while ignoring the correlations between different behaviors. Moreover, classical methods apply only to static data, while deep learning-based methods focus on dynamic data, with neither leveraging the complementary information between static and dynamic data. To address these issues, we propose TBformer, a multi-modal customer churn prediction model based on a Transformer with multi-scale Time-Behavior attention that adaptively fuses static and dynamic data. The Time-Behavior module captures multi-scale temporal dependencies and behavioral correlations in behavioral time series across the time and behavior dimensions. We perform behavior-independent multi-scale dynamic feature fusion through bidirectional connection paths. Furthermore, an attention-based multi-modal fusion module adaptively controls the fusion weights of static and dynamic features to improve performance. Extensive experiments on two publicly available datasets, KKBox and KDD, and a private dataset, HOF, demonstrate that TBformer achieves an average AUC of 91.2% (+2.47%), outperforming state-of-the-art customer churn prediction methods.
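To make the fusion idea concrete, the snippet below is a minimal PyTorch sketch of an attention-gated fusion of a static-feature vector and a dynamic (sequence-derived) feature vector, where a learned score determines each modality's weight. The class name, dimensions, and scoring scheme are illustrative assumptions and are not taken from the paper's implementation of TBformer.

```python
# Minimal sketch: attention-weighted fusion of static and dynamic features.
# All names and dimensions are assumptions for illustration only.
import torch
import torch.nn as nn


class AttentionFusion(nn.Module):
    """Adaptively weights static and dynamic feature vectors before fusion."""

    def __init__(self, dim: int):
        super().__init__()
        # Scores each modality's representation; a softmax over the two
        # modalities yields the adaptive fusion weights.
        self.score = nn.Linear(dim, 1)

    def forward(self, static_feat: torch.Tensor, dynamic_feat: torch.Tensor) -> torch.Tensor:
        # static_feat, dynamic_feat: (batch, dim)
        stacked = torch.stack([static_feat, dynamic_feat], dim=1)  # (batch, 2, dim)
        weights = torch.softmax(self.score(stacked), dim=1)        # (batch, 2, 1)
        return (weights * stacked).sum(dim=1)                      # (batch, dim)


if __name__ == "__main__":
    fusion = AttentionFusion(dim=64)
    static_feat = torch.randn(8, 64)   # e.g., encoded customer profile data
    dynamic_feat = torch.randn(8, 64)  # e.g., Transformer-encoded behavior sequence
    print(fusion(static_feat, dynamic_feat).shape)  # torch.Size([8, 64])
```

The softmax-gated weighting lets the model lean more on whichever modality is informative for a given customer, rather than concatenating static and dynamic features with fixed importance.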
Journal introduction:
The main focus of the IEEE Transactions on Consumer Electronics is the engineering and research aspects of the theory, design, construction, manufacture, or end use of mass-market electronics, systems, software, and services for consumers.