Cognize Yourself: Graph Pre-Training via Core Graph Cognizing and Differentiating

Tao Yu, Yao Fu, Linghui Hu, Huizhao Wang, Weihao Jiang, Shi Pu

Proceedings of the 31st ACM International Conference on Information & Knowledge Management (CIKM '22), October 17, 2022. DOI: 10.1145/3511808.3557259
While Graph Neural Networks (GNNs) have become the de facto standard in graph representation learning, they still suffer from label scarcity and poor generalization. To alleviate these issues, graph pre-training has been proposed to learn universal patterns from unlabeled data through self-supervised tasks. Most existing graph pre-training methods use only a single self-supervised task, which leads to insufficient knowledge mining. Recently, some works have tried to use multiple self-supervised tasks; however, we argue that these methods still suffer from a serious problem, which we call graph structure impairment: structural gaps exist among the tasks due to their divergent optimization objectives, which means customized graph structures should be provided for different self-supervised tasks. Graph structure impairment not only significantly hurts the generalizability of pre-trained GNNs but also leads to suboptimal solutions, and no study so far has addressed it well. Motivated by meta-cognitive theory, we propose a novel model named Core Graph Cognizing and Differentiating (CORE) to deal with this problem effectively. Specifically, CORE consists of a cognizing network and a differentiating process: the former cognizes a core graph that represents the essential structure of the input graph, and the latter differentiates it into several task-specific graphs, one for each self-supervised task. This is also the first study to combine graph pre-training with cognitive theory to build a cognition-aware model. Several experiments demonstrate the effectiveness of CORE.
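The abstract describes the architecture only at a high level, so the following is a minimal, hypothetical PyTorch sketch of the cognize-then-differentiate idea: a cognizing module learns soft edge weights over the input adjacency (the "core graph"), and per-task differentiating modules gate that core graph into task-specific graphs. The module names, the bilinear and gating scoring functions, and the dense-adjacency formulation are assumptions made for illustration, not the authors' implementation.

```python
# Hypothetical sketch of CORE's cognizing network and differentiating process.
# All parameterizations here are assumptions, not the paper's actual design.
import torch
import torch.nn as nn


class CognizingNetwork(nn.Module):
    """Scores each existing edge to produce a soft 'core graph'."""

    def __init__(self, dim: int):
        super().__init__()
        self.scorer = nn.Bilinear(dim, dim, 1)  # assumed edge-scoring function

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (N, dim) node features; adj: (N, N) binary adjacency.
        n = x.size(0)
        src = x.unsqueeze(1).expand(n, n, -1)  # source node of each pair
        dst = x.unsqueeze(0).expand(n, n, -1)  # target node of each pair
        edge_logits = self.scorer(
            src.reshape(-1, x.size(1)), dst.reshape(-1, x.size(1))
        ).reshape(n, n)
        # Keep only edges present in the input graph, with learned weights.
        return torch.sigmoid(edge_logits) * adj


class Differentiator(nn.Module):
    """Gates the core graph into one task-specific graph."""

    def __init__(self, dim: int):
        super().__init__()
        self.gate = nn.Linear(2 * dim, 1)  # assumed per-task edge gate

    def forward(self, x: torch.Tensor, core_adj: torch.Tensor) -> torch.Tensor:
        n = x.size(0)
        pair = torch.cat(
            [x.unsqueeze(1).expand(n, n, -1), x.unsqueeze(0).expand(n, n, -1)],
            dim=-1,
        )
        gate = torch.sigmoid(self.gate(pair)).squeeze(-1)  # (N, N) in [0, 1]
        return core_adj * gate


# Usage: one shared core graph, differentiated per self-supervised task.
dim, n_nodes, n_tasks = 16, 8, 3
x = torch.randn(n_nodes, dim)
adj = (torch.rand(n_nodes, n_nodes) > 0.5).float()

cognize = CognizingNetwork(dim)
differentiators = nn.ModuleList(Differentiator(dim) for _ in range(n_tasks))

core_adj = cognize(x, adj)
task_adjs = [d(x, core_adj) for d in differentiators]  # one graph per task
```

Under this reading, each task-specific adjacency would feed a shared GNN encoder under its own self-supervised loss, so the tasks no longer compete over a single graph structure; the paper's actual parameterization and training objectives may differ.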