Yuao Zhang , Shuya Ke , Jing Li , Weihua Liu , Jueliang Hu , Kaixiang Yang
DOI: 10.1016/j.knosys.2025.113184
Journal: Knowledge-Based Systems, Volume 314, Article 113184 (published 2025-02-21)
Impact Factor: 7.2; JCR Q1, Computer Science, Artificial Intelligence
URL: https://www.sciencedirect.com/science/article/pii/S095070512500231X
DHR-BLS: A Huber-type robust broad learning system with its distributed version
The broad learning system (BLS) is a recently developed neural network framework recognized for its efficiency and effectiveness in handling high-dimensional data with a flat network architecture. However, traditional BLS models are highly sensitive to outliers and noisy data, which can significantly degrade performance. While incorporating the ℓ1-norm loss function enhances robustness against outliers, it often compromises performance on clean datasets. To address this limitation, we propose the Huber-type robust broad learning system (HR-BLS), which integrates the Huber loss function into BLS, effectively combining the strengths of both the ℓ1-norm and ℓ2-norm loss functions to achieve balanced robustness against data anomalies. Moreover, elastic-net regularization is included to simultaneously enhance model stability and promote sparsity. To effectively manage large-scale and distributed data, we extend HR-BLS by introducing the distributed Huber-type robust broad learning system (DHR-BLS). Given the non-differentiability of the ℓ1-norm, traditional gradient-based optimization methods are insufficient. Therefore, we adopt the alternating direction method of multipliers (ADMM) to train the model, ensuring convergence through the use of appropriate constraints. Experimental results on both synthetic and benchmark datasets show that HR-BLS outperforms traditional BLS and other state-of-the-art robust learning methods in terms of accuracy and robustness. Furthermore, DHR-BLS demonstrates exceptional scalability and effectiveness, making it suitable for distributed learning environments.
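To illustrate the two ingredients the abstract describes — this is a generic sketch, not the paper's exact formulation — the Huber loss behaves like the ℓ2-norm for small residuals and like the ℓ1-norm for large ones, and the soft-thresholding operator is the standard closed-form proximal update used for ℓ1 terms inside ADMM iterations:

```python
import numpy as np

def huber_loss(r, delta=1.0):
    # Quadratic (l2-like) for |r| <= delta, linear (l1-like) beyond it:
    # large residuals from outliers grow only linearly, which is the
    # source of the Huber loss's balanced robustness.
    r = np.asarray(r, dtype=float)
    return np.where(np.abs(r) <= delta,
                    0.5 * r ** 2,
                    delta * (np.abs(r) - 0.5 * delta))

def soft_threshold(x, kappa):
    # Proximal operator of kappa * |x|; the usual closed-form step for
    # handling a non-differentiable l1 term within ADMM.
    x = np.asarray(x, dtype=float)
    return np.sign(x) * np.maximum(np.abs(x) - kappa, 0.0)
```

A small residual such as 0.5 is penalized quadratically (0.125 with delta = 1), while an outlier residual of 3.0 incurs only a linear penalty (2.5 instead of the quadratic 4.5), illustrating the ℓ1/ℓ2 trade-off the abstract refers to.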
About the journal:
Knowledge-Based Systems is an international, interdisciplinary journal in artificial intelligence that publishes original, innovative, and creative research. It focuses on systems built with knowledge-based and other artificial-intelligence techniques. The journal aims to support human prediction and decision-making through data science and computational techniques, to provide balanced coverage of theory and practical study, and to encourage the development and implementation of knowledge-based intelligent models, methods, systems, and software tools. Applications in business, government, education, engineering, and healthcare are emphasized.