{"title":"Rethinking multi-level information fusion in temporal graphs: Pre-training then distilling for better embedding","authors":"Meng Liu , Yong Liu , Qianqian Ren , Meng Han","doi":"10.1016/j.inffus.2025.103127","DOIUrl":null,"url":null,"abstract":"<div><div>Temporal graphs occupy an important place in graph data, which store node interactions in sequences, thus enabling a more microscopic view of each node’s dynamics. However, many temporal graph methods primarily concentrate on shallow-level temporal or neighborhood information, while acquiring deep-level community or global graph information necessitates increased computational costs, thereby significantly impacting model efficiency. Inspired by this, we rethink how this information is acquired: if it is difficult to acquire it during model training, why not obtain it before training? Consequently, we propose ReMIT, a novel method for temporal graph learning, which incorporates the concepts of feature pre-training and knowledge distillation to Rethink the embedding of Multi-level Information fusion in Temporal graphs. ReMIT facilitates the “remitting” of prior knowledge to model, wherein hard-to-access information is captured and distilled to the train module by introducing a pre-train module. Experimental results on multiple real-world datasets validate the validity and feasibility of our proposed framework. Our method improves performance by up to 10.2% while reducing almost 30% training time.</div></div>","PeriodicalId":50367,"journal":{"name":"Information Fusion","volume":"121 ","pages":"Article 103127"},"PeriodicalIF":14.7000,"publicationDate":"2025-03-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Information Fusion","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1566253525002003","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Abstract
Temporal graphs occupy an important place in graph data: they store node interactions as sequences, enabling a more microscopic view of each node's dynamics. However, many temporal graph methods concentrate primarily on shallow-level temporal or neighborhood information, because acquiring deep-level community or global graph information incurs substantially higher computational costs and thus degrades model efficiency. This motivates us to rethink how such information is acquired: if it is difficult to obtain during model training, why not obtain it before training? We therefore propose ReMIT, a novel method for temporal graph learning that combines feature pre-training and knowledge distillation to Rethink the embedding of Multi-level Information fusion in Temporal graphs. ReMIT "remits" prior knowledge to the model: a pre-train module captures the hard-to-access information and distills it to the train module. Experimental results on multiple real-world datasets validate the effectiveness and feasibility of the proposed framework. Our method improves performance by up to 10.2% while reducing training time by almost 30%.
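The abstract only sketches the mechanism, so the following is a minimal, hypothetical PyTorch illustration of the general pre-train-then-distill pattern it describes: a module pre-trained offline (standing in for the expensive community/global information) is frozen, and the temporal model is trained with its task loss plus a distillation loss pulling its embeddings toward the frozen ones. The GRU encoder, the MSE distillation term, and all names and dimensions are assumptions for illustration; this is not ReMIT's actual architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PretrainedEncoder(nn.Module):
    """Stand-in for a pre-train module (hypothetical): trained offline and
    then frozen, so its cost is paid once, before training."""
    def __init__(self, feat_dim: int, emb_dim: int):
        super().__init__()
        self.proj = nn.Linear(feat_dim, emb_dim)

    @torch.no_grad()  # frozen: no gradients flow into the teacher
    def forward(self, node_feats: torch.Tensor) -> torch.Tensor:
        return self.proj(node_feats)

class TemporalModel(nn.Module):
    """Stand-in for the train module: encodes each node's interaction
    sequence with a GRU (an assumed choice of sequence encoder)."""
    def __init__(self, feat_dim: int, emb_dim: int):
        super().__init__()
        self.gru = nn.GRU(feat_dim, emb_dim, batch_first=True)
        self.head = nn.Linear(emb_dim, 1)  # e.g. a link-prediction score

    def forward(self, interaction_seq: torch.Tensor):
        _, h = self.gru(interaction_seq)   # h: (1, batch, emb_dim)
        emb = h.squeeze(0)                 # (batch, emb_dim)
        return emb, self.head(emb).squeeze(-1)

def train_step(model, teacher, seq, node_feats, labels, optim, alpha=0.5):
    """One step: task loss plus a distillation loss toward the frozen
    teacher's embeddings; `alpha` is an assumed trade-off weight."""
    emb, logits = model(seq)
    task_loss = F.binary_cross_entropy_with_logits(logits, labels)
    distill_loss = F.mse_loss(emb, teacher(node_feats))
    loss = task_loss + alpha * distill_loss
    optim.zero_grad()
    loss.backward()
    optim.step()
    return loss.item()

# Toy usage with random tensors, just to check that the shapes line up.
teacher = PretrainedEncoder(feat_dim=16, emb_dim=32).eval()
model = TemporalModel(feat_dim=16, emb_dim=32)
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
seq = torch.randn(8, 5, 16)   # 8 nodes, 5 interactions each, 16-dim features
node_feats = torch.randn(8, 16)
labels = torch.randint(0, 2, (8,)).float()
print(train_step(model, teacher, seq, node_feats, labels, optim))
```

The efficiency claim in the abstract maps onto this split: the expensive information is computed once in the frozen teacher, so each training step only adds a cheap embedding-matching term rather than recomputing global structure.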
About the journal
Information Fusion serves as a central platform for showcasing advancements in multi-sensor, multi-source, multi-process information fusion, fostering collaboration among the diverse disciplines that drive its progress. It is the leading outlet for research and development in this field, with a focus on architectures, algorithms, and applications. Papers presenting fundamental theoretical analyses, as well as those demonstrating their application to real-world problems, are welcome.