Jiayin Zhang;Nan Wu;Tingting Zhang;Bin Li;Qinsiwei Yan;Xiaoli Ma
DOI: 10.1109/TSP.2025.3553915
Journal: IEEE Transactions on Signal Processing, vol. 73, pp. 1626-1642
Publication date: 2025-03-25
URL: https://ieeexplore.ieee.org/document/10937371/
Distributed Graph Learning From Smooth Data: A Bayesian Framework
The emerging field of graph learning, which aims to learn reasonable graph structures from data, plays a vital role in Graph Signal Processing (GSP) and finds applications in various data processing domains. However, existing approaches have primarily focused on learning deterministic graphs and are thus not suitable for applications involving topological stochasticity, such as epidemiological models. In this paper, we develop a hierarchical Bayesian model for the graph learning problem. Specifically, the generative model of smooth signals is formulated by transforming the graph topology into self-expressiveness coefficients and incorporating individual noise for each vertex. Tailored probability distributions are imposed on each edge to characterize the valid graph topology constraints along with edge-level probabilistic information. Building upon this, we derive the Bayesian Graph Learning (BGL) approach to efficiently estimate the graph structure in a distributed manner. In particular, based on the specific probabilistic dependencies, we derive a series of message passing rules that mix Generalized Approximate Message Passing (GAMP) messages and Belief Propagation (BP) messages to iteratively approximate the posterior probabilities. Numerical experiments with both artificial and real data demonstrate that BGL learns more accurate graph structures and improves downstream machine learning tasks compared to state-of-the-art methods.
Journal introduction:
The IEEE Transactions on Signal Processing covers novel theory, algorithms, performance analyses and applications of techniques for the processing, understanding, learning, retrieval, mining, and extraction of information from signals. The term “signal” includes, among others, audio, video, speech, image, communication, geophysical, sonar, radar, medical and musical signals. Examples of topics of interest include, but are not limited to, information processing and the theory and application of filtering, coding, transmitting, estimating, detecting, analyzing, recognizing, synthesizing, recording, and reproducing signals.