RG4LDL: Renormalization group for label distribution learning
Chao Tan, Sheng Chen, Jiaxi Zhang, Zilong Xu, Xin Geng, Genlin Ji
Knowledge-Based Systems, Volume 320, Article 113666 (published 10 May 2025). DOI: 10.1016/j.knosys.2025.113666
Label distribution learning (LDL) is an effective paradigm to address label ambiguity by modeling the relevance of multiple labels to an instance. However, existing LDL methods suffer from challenges such as high model complexity, slow convergence, and limited availability of label distribution-annotated training data. To tackle these issues, we propose RG4LDL, a novel framework that integrates the renormalization group (RG) principle with LDL for the first time. RG4LDL employs a restricted Boltzmann machine (RBM)-based neural network to iteratively extract relevant degrees of freedom, thereby optimizing feature learning and improving predictive accuracy. By combining unsupervised RG learning and supervised LDL prediction in an end-to-end manner, RG4LDL achieves both efficiency and effectiveness. Experimental results on 13 real-world datasets and a synthetic toy dataset demonstrate that RG4LDL significantly outperforms state-of-the-art LDL methods in terms of predictive accuracy and computational efficiency. These results highlight the potential of RG4LDL as a benchmark solution for label distribution learning tasks.
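The abstract outlines a two-stage idea: RBM layers iteratively coarse-grain the input features (the unsupervised, RG-style step), and a label-distribution predictor is then trained on the coarse-grained representation (the supervised LDL step). The sketch below illustrates that pipeline only in spirit and is not the authors' RG4LDL implementation; the greedy layer-wise training schedule, the layer sizes, the CD-1 updates, and the softmax head with a KL-style loss are all assumptions made for illustration.

```python
# Minimal, illustrative sketch (NOT the authors' RG4LDL implementation).
# Assumptions: toy data, two stacked Bernoulli RBMs trained greedily with
# CD-1 as the "coarse-graining" step, and a softmax regression head trained
# under a cross-entropy/KL objective as the LDL predictor.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Bernoulli RBM trained with one-step contrastive divergence (CD-1)."""
    def __init__(self, n_visible, n_hidden, lr=0.05):
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)
        self.b_h = np.zeros(n_hidden)
        self.lr = lr

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.b_h)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b_v)

    def cd1_step(self, v0):
        # Positive phase: hidden activations given the data.
        h0 = self.hidden_probs(v0)
        # Negative phase: one Gibbs step from a sampled hidden state.
        h0_sample = (h0 > rng.random(h0.shape)).astype(float)
        v1 = self.visible_probs(h0_sample)
        h1 = self.hidden_probs(v1)
        # CD-1 parameter updates.
        n = v0.shape[0]
        self.W += self.lr * (v0.T @ h0 - v1.T @ h1) / n
        self.b_v += self.lr * (v0 - v1).mean(axis=0)
        self.b_h += self.lr * (h0 - h1).mean(axis=0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Toy data: 200 instances, 20 features, 5 labels with random distributions.
X = rng.random((200, 20))
D = rng.random((200, 5))
D /= D.sum(axis=1, keepdims=True)          # ground-truth label distributions

# 1) Unsupervised coarse-graining: two stacked RBMs (20 -> 12 -> 6 units).
layers = [RBM(20, 12), RBM(12, 6)]
H = X
for rbm in layers:
    for _ in range(50):
        rbm.cd1_step(H)
    H = rbm.hidden_probs(H)                # pass coarse-grained features upward

# 2) Supervised LDL head: softmax regression minimizing KL(D || D_hat).
W_out = np.zeros((H.shape[1], D.shape[1]))
for _ in range(500):
    D_hat = softmax(H @ W_out)
    W_out -= 0.1 * H.T @ (D_hat - D) / len(H)   # cross-entropy gradient (KL up to a constant)

D_hat = softmax(H @ W_out)
print("mean KL divergence:",
      np.mean(np.sum(D * (np.log(D + 1e-12) - np.log(D_hat + 1e-12)), axis=1)))
```

Note that RG4LDL is described as combining the unsupervised RG learning and the supervised prediction end-to-end; the sketch separates them into two greedy stages purely to keep the example short.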
Journal description:
Knowledge-Based Systems, an international and interdisciplinary journal in artificial intelligence, publishes original, innovative, and creative research results in the field. It focuses on systems built with knowledge-based and other artificial intelligence techniques. The journal aims to support human prediction and decision-making through data science and computational techniques, provide balanced coverage of theory and practical study, and encourage the development and implementation of knowledge-based intelligence models, methods, systems, and software tools. Applications in business, government, education, engineering, and healthcare are emphasized.