Scalable Hypergraph Structure Learning With Diverse Smoothness Priors

Impact Factor: 3.0 · CAS Tier 3 (Computer Science) · JCR Q2 (Engineering, Electrical & Electronic)
Benjamin T. Brown;Haoxiang Zhang;Daniel L. Lau;Gonzalo R. Arce
{"title":"Scalable Hypergraph Structure Learning With Diverse Smoothness Priors","authors":"Benjamin T. Brown;Haoxiang Zhang;Daniel L. Lau;Gonzalo R. Arce","doi":"10.1109/TSIPN.2025.3599780","DOIUrl":null,"url":null,"abstract":"In graph signal processing, learning weighted connections between nodes from signals is a fundamental task when the underlying relationships are unknown. With the extension of graphs to hypergraphs, where edges can connect more than two nodes, graph learning methods have similarly been generalized to hypergraphs. However, the absence of a unified framework for calculating total variation has led to divergent definitions of smoothness and, consequently, differing approaches to hyperedge recovery. This challenge is confronted in this work through generalization of several previously proposed hypergraph total variations, allowing ease of substitution into a vector-based optimization. To this end, a novel hypergraph learning method is proposed that recovers a hypergraph topology from time-series signals using convex optimization based on a smoothness prior. This approach, designated Hypergraph Structure Learning with Smoothness (HSLS), addresses key limitations in prior works such as hyperedge selection and convergence issues. Additionally, a process is introduced that limits the span of the hyperedge search and maintains a valid hyperedge selection set, creating a scalable model. Experimental results demonstrate improved performance over state-of-the-art hypergraph inference methods. 
The method is empirically shown to be robust to total variation terms, biased towards global smoothness, and scalable to larger hypergraphs.","PeriodicalId":56268,"journal":{"name":"IEEE Transactions on Signal and Information Processing over Networks","volume":"11 ","pages":"1072-1086"},"PeriodicalIF":3.0000,"publicationDate":"2025-08-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Signal and Information Processing over Networks","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/11126977/","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Cited by: 0

Abstract

In graph signal processing, learning weighted connections between nodes from signals is a fundamental task when the underlying relationships are unknown. With the extension of graphs to hypergraphs, where edges can connect more than two nodes, graph learning methods have similarly been generalized to hypergraphs. However, the absence of a unified framework for calculating total variation has led to divergent definitions of smoothness and, consequently, differing approaches to hyperedge recovery. This challenge is confronted in this work through generalization of several previously proposed hypergraph total variations, allowing ease of substitution into a vector-based optimization. To this end, a novel hypergraph learning method is proposed that recovers a hypergraph topology from time-series signals using convex optimization based on a smoothness prior. This approach, designated Hypergraph Structure Learning with Smoothness (HSLS), addresses key limitations in prior works such as hyperedge selection and convergence issues. Additionally, a process is introduced that limits the span of the hyperedge search and maintains a valid hyperedge selection set, creating a scalable model. Experimental results demonstrate improved performance over state-of-the-art hypergraph inference methods. The method is empirically shown to be robust to total variation terms, biased towards global smoothness, and scalable to larger hypergraphs.
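The abstract refers to several competing definitions of hypergraph total variation that the paper unifies. As a purely illustrative sketch (the paper's actual generalized formulation is not reproduced here), one common hypergraph total variation scores each hyperedge by the largest squared signal gap among its nodes, so a signal is "smooth" when nodes sharing a hyperedge carry similar values:

```python
import numpy as np

def hypergraph_tv(x, hyperedges, weights):
    """Illustrative hypergraph total variation:
    TV(x) = sum_e w_e * max_{i,j in e} (x_i - x_j)^2.
    This is one example definition, not the paper's unified form."""
    tv = 0.0
    for e, w in zip(hyperedges, weights):
        vals = x[list(e)]
        # max pairwise squared gap within a hyperedge is (max - min)^2
        tv += w * (vals.max() - vals.min()) ** 2
    return tv

# Toy example: 4 nodes, two 3-node hyperedges; node 3 is an outlier,
# so the second hyperedge dominates the total variation.
x = np.array([1.0, 1.1, 0.9, 5.0])
hyperedges = [(0, 1, 2), (1, 2, 3)]
weights = [1.0, 1.0]
print(hypergraph_tv(x, hyperedges, weights))  # ≈ 16.85
```

A smoothness-prior learning method of the kind the abstract describes would use such a TV term inside a convex objective, selecting hyperedge weights so that observed signals have low total variation.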
Source Journal
IEEE Transactions on Signal and Information Processing over Networks
Category: Computer Science - Computer Networks and Communications
CiteScore: 5.80
Self-citation rate: 12.50%
Annual publications: 56
Journal description: The IEEE Transactions on Signal and Information Processing over Networks publishes high-quality papers that extend the classical notions of processing of signals defined over vector spaces (e.g. time and space) to processing of signals and information (data) defined over networks, potentially dynamically varying. In signal processing over networks, the topology of the network may define structural relationships in the data, or may constrain processing of the data. Topics include distributed algorithms for filtering, detection, estimation, adaptation and learning, model selection, data fusion, and diffusion or evolution of information over such networks, and applications of distributed signal processing.