{"title":"Gated Attention with Asymmetric Regularization for Transformer-based Continual Graph Learning","authors":"Hongxiang Lin, Ruiqi Jia, Xiaoqing Lyu","doi":"10.1145/3539618.3591991","DOIUrl":null,"url":null,"abstract":"Continual graph learning (CGL) aims to mitigate the topological-feature-induced catastrophic forgetting problem (TCF) in graph neural networks, which plays an essential role in the field of information retrieval. The TCF is mainly caused by the forgetting of node features of old tasks and the forgetting of topological features shared by old and new tasks. Existing CGL methods do not pay enough attention to the forgetting of topological features shared between different tasks. In this paper, we propose a transformer-based CGL method (Trans-CGL), thereby taking full advantage of the transformer's properties to mitigate the TCF problem. Specifically, to alleviate forgetting of node features, we introduce a gated attention mechanism for Trans-CGL based on parameter isolation that allows the model to be independent of each other when learning old and new tasks. Furthermore, to address the forgetting of shared parameters that store topological information between different tasks, we propose an asymmetric mask attention regularization module to constrain the shared attention parameters ensuring that the shared topological information is preserved. Comparative experiments show that the method achieves competitive performance on four real-world datasets.","PeriodicalId":425056,"journal":{"name":"Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval","volume":"16 4 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-07-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3539618.3591991","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Continual graph learning (CGL) aims to mitigate the topological-feature-induced catastrophic forgetting (TCF) problem in graph neural networks, which play an essential role in information retrieval. TCF arises mainly from two sources: forgetting the node features of old tasks, and forgetting the topological features shared by old and new tasks. Existing CGL methods pay insufficient attention to the latter. In this paper, we propose a transformer-based CGL method (Trans-CGL) that exploits the properties of the transformer to mitigate the TCF problem. Specifically, to alleviate the forgetting of node features, we introduce a gated attention mechanism based on parameter isolation, which keeps the parameters used for old and new tasks independent of each other. Furthermore, to address the forgetting of shared parameters that store topological information across tasks, we propose an asymmetric mask attention regularization module that constrains the shared attention parameters, ensuring that the shared topological information is preserved. Comparative experiments show that the method achieves competitive performance on four real-world datasets.
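The abstract describes two components: gated attention with per-task parameter isolation, and an asymmetric regularizer on the attention parameters shared across tasks. The sketch below is a rough, hypothetical PyTorch illustration of these two ideas, not the paper's implementation; all names (GatedTaskAttention, asymmetric_mask_regularizer, the importance mask, etc.) are assumptions made for illustration only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GatedTaskAttention(nn.Module):
    """Sketch of self-attention whose projections are split into a shared part
    (topological information reused across tasks) and per-task parts selected
    by a hard gate on the task id, so old-task parameters are left untouched
    while a new task is being learned (parameter isolation)."""

    def __init__(self, dim: int, num_tasks: int):
        super().__init__()
        self.shared_qkv = nn.Linear(dim, 3 * dim)            # shared across tasks
        self.task_qkv = nn.ModuleList(                        # isolated per task
            [nn.Linear(dim, 3 * dim) for _ in range(num_tasks)]
        )
        self.out = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor, task_id: int) -> torch.Tensor:
        # Gate: only the projection of the current task contributes; the
        # modules of the other tasks receive no gradient for this input.
        qkv = self.shared_qkv(x) + self.task_qkv[task_id](x)
        q, k, v = qkv.chunk(3, dim=-1)
        scores = q @ k.transpose(-2, -1) / q.shape[-1] ** 0.5
        return self.out(F.softmax(scores, dim=-1) @ v)


def asymmetric_mask_regularizer(shared_new: torch.Tensor,
                                shared_old: torch.Tensor,
                                importance: torch.Tensor) -> torch.Tensor:
    """Hypothetical asymmetric penalty: drift of the shared attention
    parameters is penalized only where the importance mask says the entry
    mattered for old tasks, while unimportant entries remain free to adapt
    to the new task."""
    drift = shared_new - shared_old
    return (importance * drift.pow(2)).sum()
```

In such a setup, the regularizer would be added to the new task's loss with a weighting coefficient, using a snapshot of the shared projection weights taken before training on the new task; how the importance mask is actually computed in Trans-CGL is not stated in the abstract.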