Chuanyun Xu, Yu Zheng, Yang Zhang, Chengjie Sun, Gang Li, Zhaohan Zhu
2022 6th Asian Conference on Artificial Intelligence Technology (ACAIT), published 2022-12-09. DOI: 10.1109/ACAIT56212.2022.10137858
Adaptive Class-Balanced Loss Based on Re-Weighting
As real-world data grows rapidly, the problem of data imbalance has become more prominent, and the long-tail problem in deep learning has consequently received much attention. One solution is to apply a class-rebalancing strategy, such as directly using the inverse of the class sample size for re-weighting. In past studies, however, the weights were determined solely by the number of samples in each class; relying on that single quantity is too crude for a method as sensitive as re-weighting. In this paper, we implement adaptive re-weighting based on three essential attributes of the dataset: the number of classes, the number of samples, and the degree of class imbalance. We conducted experiments on commonly used solutions to the sample-imbalance problem and propose a new sample re-weighting method. Specifically, a novel re-weighting idea is proposed to optimize Class-Balanced Loss based on the effective number of samples. Experiments show that the method is superior for re-weighting imbalanced datasets trained on deep neural networks. We hope our work will stimulate a rethinking of the number-of-samples-based convention in re-weighting.
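The baseline the abstract builds on, Class-Balanced Loss, replaces the raw per-class count n with an "effective number" of samples E_n = (1 − β^n) / (1 − β) and weights each class by its inverse. The sketch below illustrates that baseline only (it is not the paper's adaptive method, whose details are not given in the abstract); the function name, the example counts, and the choice β = 0.9999 are illustrative assumptions.

```python
import numpy as np

def class_balanced_weights(samples_per_class, beta=0.9999):
    """Per-class weights from the effective number of samples.

    E_n = (1 - beta**n) / (1 - beta); weight is proportional to 1 / E_n,
    so rarer classes receive larger weights.
    """
    samples_per_class = np.asarray(samples_per_class, dtype=np.float64)
    effective_num = (1.0 - np.power(beta, samples_per_class)) / (1.0 - beta)
    weights = 1.0 / effective_num
    # Normalize so the weights sum to the number of classes,
    # keeping the overall loss scale comparable to the unweighted case.
    return weights / weights.sum() * len(samples_per_class)

# A hypothetical long-tailed class distribution: head, medium, and tail class.
counts = [5000, 500, 50]
print(class_balanced_weights(counts))
```

Note that as β → 0 every class gets the same weight, while as β → 1 the scheme approaches plain inverse-frequency re-weighting; the paper's contribution is to move beyond tuning this single hyperparameter by also accounting for the number of classes and the degree of imbalance.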