{"title":"不平衡企业信用评级的对比预训练","authors":"Bojing Feng, Wenfang Xue","doi":"10.1145/3529836.3529911","DOIUrl":null,"url":null,"abstract":"The corporate credit rating reflects the level of corporate credit and plays a crucial role in modern financial risk control. But real-world credit rating data usually shows long-tail distributions, which means a heavy class imbalanced problem challenging the corporate credit rating system greatly. To tackle that, inspired by the recent advances of pre-train techniques in self-supervised representation learning, we propose a novel framework named Contrastive Pre-training for Corporate Credit Rating (CP4CCR), which utilizes the self-supervision for getting over the class imbalance. Specifically, we propose to, in the first phase, exert contrastive self-supervised pre-training without label information, which aims to learn a better class-agnostic initialization. Furthermore, two self-supervised tasks are developed within CP4CCR: (i) Feature Masking (FM) and (ii) Feature Swapping(FS). In the second phase, we can train any standard corporate credit rating model initialized by the pre-trained network. Extensive experiments conducted on the real public-listed corporate rating dataset, prove that CP4CCR can improve the performance of standard corporate credit rating models, especially for the class with few samples.","PeriodicalId":285191,"journal":{"name":"2022 14th International Conference on Machine Learning and Computing (ICMLC)","volume":"59 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-02-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Contrastive Pre-training for Imbalanced Corporate Credit Ratings\",\"authors\":\"Bojing Feng, Wenfang Xue\",\"doi\":\"10.1145/3529836.3529911\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The corporate credit rating reflects the level of corporate credit and plays a crucial role in modern financial risk control. But real-world credit rating data usually shows long-tail distributions, which means a heavy class imbalanced problem challenging the corporate credit rating system greatly. To tackle that, inspired by the recent advances of pre-train techniques in self-supervised representation learning, we propose a novel framework named Contrastive Pre-training for Corporate Credit Rating (CP4CCR), which utilizes the self-supervision for getting over the class imbalance. Specifically, we propose to, in the first phase, exert contrastive self-supervised pre-training without label information, which aims to learn a better class-agnostic initialization. Furthermore, two self-supervised tasks are developed within CP4CCR: (i) Feature Masking (FM) and (ii) Feature Swapping(FS). In the second phase, we can train any standard corporate credit rating model initialized by the pre-trained network. 
Extensive experiments conducted on the real public-listed corporate rating dataset, prove that CP4CCR can improve the performance of standard corporate credit rating models, especially for the class with few samples.\",\"PeriodicalId\":285191,\"journal\":{\"name\":\"2022 14th International Conference on Machine Learning and Computing (ICMLC)\",\"volume\":\"59 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-02-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 14th International Conference on Machine Learning and Computing (ICMLC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3529836.3529911\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 14th International Conference on Machine Learning and Computing (ICMLC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3529836.3529911","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Contrastive Pre-training for Imbalanced Corporate Credit Ratings
Bojing Feng, Wenfang Xue
2022 14th International Conference on Machine Learning and Computing (ICMLC)
DOI: 10.1145/3529836.3529911
Corporate credit ratings reflect the creditworthiness of a company and play a crucial role in modern financial risk control. However, real-world credit rating data usually follows a long-tail distribution, and the resulting severe class imbalance greatly challenges corporate credit rating systems. To tackle this, inspired by recent advances in pre-training techniques for self-supervised representation learning, we propose a novel framework named Contrastive Pre-training for Corporate Credit Rating (CP4CCR), which uses self-supervision to overcome the class imbalance. Specifically, in the first phase we perform contrastive self-supervised pre-training without label information, aiming to learn a better class-agnostic initialization. Two self-supervised tasks are developed within CP4CCR: (i) Feature Masking (FM) and (ii) Feature Swapping (FS). In the second phase, any standard corporate credit rating model can be trained from the initialization provided by the pre-trained network. Extensive experiments on a real dataset of ratings for publicly listed companies show that CP4CCR improves the performance of standard corporate credit rating models, especially on classes with few samples.
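The abstract does not give implementation details, so the following is only a minimal PyTorch sketch of how the two pre-training views and a contrastive objective might be wired together. The functions feature_masking, feature_swapping, and nt_xent, the mask/swap ratios, the encoder architecture, and num_features are all illustrative assumptions, not the authors' code.

import torch
import torch.nn.functional as F

def feature_masking(x, mask_ratio=0.15):
    # FM view (assumed form): randomly zero out a fraction of feature entries.
    mask = torch.rand_like(x) < mask_ratio
    return x.masked_fill(mask, 0.0)

def feature_swapping(x, swap_ratio=0.15):
    # FS view (assumed form): replace a random subset of entries with the
    # values of another sample in the batch, via a row permutation.
    swap = torch.rand_like(x) < swap_ratio
    perm = torch.randperm(x.size(0))
    return torch.where(swap, x[perm], x)

def nt_xent(z1, z2, temperature=0.5):
    # SimCLR-style contrastive loss over two views of the same batch;
    # sample i in view 1 and sample i in view 2 form the positive pair.
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)
    sim = z @ z.t() / temperature
    eye = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(eye, float('-inf'))      # drop self-similarity
    targets = torch.arange(2 * n, device=z.device).roll(n)
    return F.cross_entropy(sim, targets)

# Phase 1: label-free contrastive pre-training over tabular credit features.
num_features = 32                                  # illustrative dimension
encoder = torch.nn.Sequential(
    torch.nn.Linear(num_features, 128),
    torch.nn.ReLU(),
    torch.nn.Linear(128, 64),
)
optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-3)
x = torch.randn(256, num_features)                 # stand-in for one data batch
loss = nt_xent(encoder(feature_masking(x)), encoder(feature_swapping(x)))
optimizer.zero_grad(); loss.backward(); optimizer.step()
# Phase 2: initialize a standard rating model with the pre-trained encoder
# and train it on the labeled (imbalanced) rating data.

Whether the pre-trained encoder is frozen or fine-tuned in the second phase, and which augmentation pairing is used for the two views, are design choices the abstract leaves open.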