Chao Chen, Haoyu Geng, Nianzu Yang, Xiaokang Yang, Junchi Yan
IEEE Transactions on Pattern Analysis and Machine Intelligence. DOI: 10.1109/TPAMI.2024.3443110. Published online 14 August 2024.
EasyDGL: Encode, Train and Interpret for Continuous-Time Dynamic Graph Learning.
Dynamic graphs arise in many real-world applications, and modeling their dynamics in the continuous-time domain is often preferred for its flexibility. This paper presents an easy-to-use pipeline, EasyDGL (so named in part because it is implemented on the DGL toolkit), composed of three modules that combine strong fitting ability with interpretability: encoding, training, and interpreting. i) A temporal point process (TPP)-modulated attention architecture endows the model with continuous-time resolution of the coupled spatiotemporal dynamics of a graph evolving through edge-addition events; ii) a principled loss combines task-agnostic TPP posterior maximization over observed events with a task-aware loss using a masking strategy on the dynamic graph, where the tasks include dynamic link prediction, dynamic node classification, and node traffic forecasting; iii) the outputs (e.g., representations and predictions) are interpreted via scalable perturbation-based quantitative analysis in the graph Fourier domain, which comprehensively reflects the behavior of the learned model. Empirical results on public benchmarks show superior performance on time-conditioned predictive tasks; in particular, EasyDGL can effectively quantify the predictive power of the frequency content a model learns from evolving graph data.
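To make the encoding module's idea concrete, here is a minimal toy sketch of TPP-modulated attention: scaled dot-product attention scores are modulated by an exponential-decay intensity over the age of each edge event, so recent events receive more weight. All function and parameter names here are illustrative assumptions, not EasyDGL's actual API or architecture.

```python
import numpy as np

def tpp_modulated_attention(q, K, V, event_times, t_now, decay=1.0):
    """Toy sketch (not EasyDGL's real architecture): attention logits are
    shifted by the log of a Hawkes-style exponential-decay intensity over
    event ages, then softmax-normalized to aggregate the values."""
    scores = K @ q / np.sqrt(q.shape[0])          # scaled dot-product scores
    ages = t_now - np.asarray(event_times)        # time since each edge event
    intensity = np.exp(-decay * ages)             # TPP-style decay kernel
    logits = scores + np.log(intensity + 1e-12)   # modulate in log-space
    w = np.exp(logits - logits.max())
    w = w / w.sum()                               # softmax weights
    return w @ V                                  # weighted value aggregation
```

With identical keys, the output reduces to an intensity-weighted average, so the most recent event dominates; the time-decay term is what gives the attention its continuous-time resolution.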
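The interpreting module works in the graph Fourier domain. A brief sketch of that basis, under standard spectral graph theory (the specifics of EasyDGL's perturbation analysis are not reproduced here): the eigenvectors of the graph Laplacian serve as Fourier modes, eigenvalues act as frequencies, and projecting a node signal onto them shows how its energy splits across frequency content.

```python
import numpy as np

# Illustrative 4-node path graph; the signal x stands in for one
# dimension of a learned node representation.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
D = np.diag(A.sum(axis=1))
L = D - A                                  # combinatorial graph Laplacian

# Eigenvectors of L form the graph Fourier basis;
# eigenvalues play the role of frequencies.
eigvals, U = np.linalg.eigh(L)

x = np.array([1.0, 0.8, 0.6, 0.4])         # a node signal
x_hat = U.T @ x                            # graph Fourier transform
energy = x_hat ** 2 / np.sum(x_hat ** 2)   # energy fraction per frequency
```

Perturbing the input and measuring how `energy` shifts across frequencies is one way to quantify which frequency bands carry predictive power, which is the flavor of analysis the abstract describes.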