Scalable Hypergraph Structure Learning With Diverse Smoothness Priors

Benjamin T. Brown; Haoxiang Zhang; Daniel L. Lau; Gonzalo R. Arce

IEEE Transactions on Signal and Information Processing over Networks, vol. 11, pp. 1072-1086. Published 2025-08-15. DOI: 10.1109/TSIPN.2025.3599780 (https://ieeexplore.ieee.org/document/11126977/). JCR: Q2 (Engineering, Electrical & Electronic), impact factor 3.0. Open access: no.
Citations: 0
Abstract
In graph signal processing, learning weighted connections between nodes from signals is a fundamental task when the underlying relationships are unknown. With the extension of graphs to hypergraphs, where edges can connect more than two nodes, graph learning methods have similarly been generalized to hypergraphs. However, the absence of a unified framework for calculating total variation has led to divergent definitions of smoothness and, consequently, to differing approaches to hyperedge recovery. This work confronts that challenge by generalizing several previously proposed hypergraph total variations so that each can be substituted easily into a vector-based optimization. To this end, a novel hypergraph learning method is proposed that recovers a hypergraph topology from time-series signals using convex optimization based on a smoothness prior. This approach, designated Hypergraph Structure Learning with Smoothness (HSLS), addresses key limitations of prior work, such as hyperedge selection and convergence issues. Additionally, a process is introduced that limits the span of the hyperedge search and maintains a valid hyperedge selection set, yielding a scalable model. Experimental results demonstrate improved performance over state-of-the-art hypergraph inference methods. The method is empirically shown to be robust to the choice of total variation term, biased towards global smoothness, and scalable to larger hypergraphs.
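The abstract's core idea, scoring candidate hyperedges by a total-variation smoothness measure on the observed signals, can be illustrated with a toy sketch. This is not the paper's HSLS method or its convex optimization; it is an illustrative assumption using one common hypergraph total-variation definition (the largest pairwise squared difference among a hyperedge's nodes, summed over signals), with smoother (lower-TV) hyperedges preferred. All function names and the planted-group example data are hypothetical.

```python
from itertools import combinations

import numpy as np

def hyperedge_tv(X, edge):
    """One illustrative hypergraph total variation: for each signal, the
    largest squared difference between any two nodes in the hyperedge,
    summed over all signals. X has shape (n_nodes, n_signals)."""
    sub = X[list(edge), :]                     # signals restricted to the hyperedge
    diffs = sub[:, None, :] - sub[None, :, :]  # all pairwise differences per signal
    return float((diffs ** 2).max(axis=(0, 1)).sum())

def select_hyperedges(X, size=3, k=2):
    """Rank all candidate hyperedges of a fixed size by smoothness
    (smaller TV = smoother on the observed signals) and keep the k best."""
    n = X.shape[0]
    candidates = list(combinations(range(n), size))
    return sorted(candidates, key=lambda e: hyperedge_tv(X, e))[:k]

# Synthetic example: two planted groups of three nodes each, where nodes in
# a group carry near-identical signals plus small noise.
rng = np.random.default_rng(0)
base = rng.normal(size=(2, 50))
X = np.vstack([base[0] + 0.01 * rng.normal(size=(3, 50)),
               base[1] + 0.01 * rng.normal(size=(3, 50))])

# The two smoothest size-3 hyperedges recover the planted groups
# {0, 1, 2} and {3, 4, 5} (in some order).
print(select_hyperedges(X, size=3, k=2))
```

Exhaustively scoring all size-`r` candidates costs O(n^r), which is exactly why the paper introduces a process that limits the span of the hyperedge search to keep the model scalable.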
Journal introduction:
The IEEE Transactions on Signal and Information Processing over Networks publishes high-quality papers that extend the classical notions of processing signals defined over vector spaces (e.g., time and space) to processing signals and information (data) defined over networks, which may vary dynamically. In signal processing over networks, the topology of the network may define structural relationships in the data or may constrain processing of the data. Topics include distributed algorithms for filtering, detection, estimation, adaptation and learning, model selection, data fusion, and diffusion or evolution of information over such networks, as well as applications of distributed signal processing.