Generic network sparsification via hybrid edge sampling

IF 3.7 | CAS Tier 3, Computer Science | JCR Q2, Automation & Control Systems
Zhen Su , Jürgen Kurths , Henning Meyerhenke
{"title":"Generic network sparsification via hybrid edge sampling","authors":"Zhen Su ,&nbsp;Jürgen Kurths ,&nbsp;Henning Meyerhenke","doi":"10.1016/j.jfranklin.2024.107404","DOIUrl":null,"url":null,"abstract":"<div><div>Network (or graph) sparsification benefits downstream graph mining tasks. Finding a sparsified subgraph <span><math><mover><mrow><mi>G</mi></mrow><mrow><mo>ˆ</mo></mrow></mover></math></span> similar to the original graph <span><math><mi>G</mi></math></span> is, however, challenging due to the requirement of preserving various (or at least representative) network properties. In this paper, we propose a general hybrid edge sampling scheme named LOGA, as the combination of the <u>Lo</u>cal-filtering-based Random Edge sampling (LRE) (Hamann et al., 2016) and the <u>Ga</u>me-theoretic Sparsification with Tolerance (GST) (Su et al., 2022). LOGA fully utilizes the advantages of GST — in preserving complex structural properties by preserving local node properties in expectation – and LRE – in preserving the connectivity of a given network. Specifically, we first prove the existence of multiple equilibria in GST. This insight leads us to propose LOGA and its variant LOGA<span><math><msup><mrow></mrow><mrow><mi>s</mi><mi>c</mi></mrow></msup></math></span> by refining GST. LOGA is obtained by regarding LRE as an empirically good initializer for GST, while LOGA<span><math><msup><mrow></mrow><mrow><mi>s</mi><mi>c</mi></mrow></msup></math></span> is obtained by further including a constrained update for GST. In this way, LOGA/LOGA<span><math><msup><mrow></mrow><mrow><mi>s</mi><mi>c</mi></mrow></msup></math></span> generalize the work on GST to graphs with weights and different densities, without increasing the asymptotic time complexity. 
Extensive experiments on 26 weighted and unweighted networks with different densities demonstrate that LOGA<span><math><msup><mrow></mrow><mrow><mi>s</mi><mi>c</mi></mrow></msup></math></span> performs best for all 26 instances, i.e., they preserve representative network properties better than state-of-the-art sampling methods alone.</div></div>","PeriodicalId":17283,"journal":{"name":"Journal of The Franklin Institute-engineering and Applied Mathematics","volume":"362 1","pages":"Article 107404"},"PeriodicalIF":3.7000,"publicationDate":"2024-11-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of The Franklin Institute-engineering and Applied Mathematics","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0016003224008251","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"AUTOMATION & CONTROL SYSTEMS","Score":null,"Total":0}
Citations: 0

Abstract

Network (or graph) sparsification benefits downstream graph mining tasks. Finding a sparsified subgraph Ĝ similar to the original graph G is, however, challenging due to the requirement of preserving various (or at least representative) network properties. In this paper, we propose a general hybrid edge sampling scheme named LOGA, which combines Local-filtering-based Random Edge sampling (LRE) (Hamann et al., 2016) and Game-theoretic Sparsification with Tolerance (GST) (Su et al., 2022). LOGA fully exploits the advantages of GST, which preserves complex structural properties by preserving local node properties in expectation, and of LRE, which preserves the connectivity of a given network. Specifically, we first prove the existence of multiple equilibria in GST. This insight leads us to propose LOGA and its variant LOGAsc by refining GST: LOGA uses LRE as an empirically good initializer for GST, while LOGAsc additionally includes a constrained update for GST. In this way, LOGA/LOGAsc generalize the work on GST to graphs with weights and different densities, without increasing the asymptotic time complexity. Extensive experiments on 26 weighted and unweighted networks of different densities demonstrate that LOGAsc performs best on all 26 instances, i.e., it preserves representative network properties better than state-of-the-art sampling methods alone.
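To make the hybrid-sampling idea concrete, below is a minimal, self-contained Python sketch of similarity-guided edge sampling on an adjacency-set graph. The Jaccard-based edge score and the top-k selection here are illustrative assumptions only; the actual local filter of LRE (Hamann et al., 2016) and the game-theoretic refinement with tolerance in GST (Su et al., 2022) are more involved and are not reproduced from the paper.

```python
import random

def local_edge_scores(adj):
    """Score each undirected edge (u, v) by the Jaccard similarity of the
    endpoints' neighborhoods -- a simple local filter in the spirit of
    similarity-based sparsification (not the paper's exact scoring)."""
    scores = {}
    for u in adj:
        for v in adj[u]:
            if u < v:  # visit each undirected edge once
                nu, nv = adj[u], adj[v]
                common = len(nu & nv)
                # exclude u and v themselves from the union (they are
                # each other's neighbors because (u, v) is an edge)
                union = len(nu | nv) - 2
                scores[(u, v)] = common / union if union else 1.0
    return scores

def sparsify(adj, keep_ratio=0.5, seed=0):
    """Keep the top-scoring fraction of edges, breaking ties randomly.
    This is a sketch of a locally-filtered sampler that could serve as
    an initializer for a subsequent refinement step."""
    rng = random.Random(seed)
    scores = local_edge_scores(adj)
    ranked = sorted(scores, key=lambda e: (scores[e], rng.random()),
                    reverse=True)
    k = max(1, int(keep_ratio * len(ranked)))
    sub = {u: set() for u in adj}  # sparsified subgraph on the same nodes
    for (u, v) in ranked[:k]:
        sub[u].add(v)
        sub[v].add(u)
    return sub

# Tiny example: a triangle {0, 1, 2} with a pendant node 3 attached to 2.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
sub = sparsify(adj, keep_ratio=0.5, seed=0)
```

Keeping only the highest-scoring edges tends to retain locally well-embedded edges (here, the triangle edge (0, 1) scores 1.0 while the pendant edge (2, 3) scores 0.0), which illustrates why a pure local filter may need a connectivity-aware or game-theoretic refinement on top.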
Source journal
CiteScore: 7.30
Self-citation rate: 14.60%
Articles per year: 586
Review time: 6.9 months
About the journal: The Journal of The Franklin Institute has an established reputation for publishing high-quality papers in the field of engineering and applied mathematics. Its current focus is on control systems, complex networks and dynamic systems, signal processing and communications, and their applications. All submitted papers are peer-reviewed. The Journal publishes original research papers and research review papers of substance. Papers and special focus issues are judged upon possible lasting value, which has been and continues to be the strength of the Journal of The Franklin Institute.