{"title":"图数据的学习和推理:神经和统计关系方法(特邀论文)","authors":"M. Jaeger","doi":"10.4230/OASIcs.AIB.2022.5","DOIUrl":null,"url":null,"abstract":"Graph neural networks (GNNs) have emerged in recent years as a very powerful and popular modeling tool for graph and network data. Though much of the work on GNNs has focused on graphs with a single edge relation, they have also been adapted to multi-relational graphs, including knowledge graphs. In such multi-relational domains, the objectives and possible applications of GNNs become quite similar to what for many years has been investigated and developed in the field of statistical relational learning (SRL). This article first gives a brief overview of the main features of GNN and SRL approaches to learning and reasoning with graph data. It analyzes then in more detail their commonalities and differences with respect to semantics, representation, parameterization, interpretability, and flexibility. A particular focus will be on relational Bayesian networks (RBNs) as the SRL framework that is most closely related to GNNs. We show how common GNN architectures can be directly encoded as RBNs, thus enabling the direct integration of “low level” neural model components with the “high level” symbolic representation and flexible inference capabilities of SRL. 2012 ACM Subject Classification Computing methodologies → Logical and relational learning; Computing methodologies → Neural networks","PeriodicalId":110801,"journal":{"name":"International Research School in Artificial Intelligence in Bergen","volume":"32 ","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Learning and Reasoning with Graph Data: Neural and Statistical-Relational Approaches (Invited Paper)\",\"authors\":\"M. Jaeger\",\"doi\":\"10.4230/OASIcs.AIB.2022.5\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Graph neural networks (GNNs) have emerged in recent years as a very powerful and popular modeling tool for graph and network data. Though much of the work on GNNs has focused on graphs with a single edge relation, they have also been adapted to multi-relational graphs, including knowledge graphs. In such multi-relational domains, the objectives and possible applications of GNNs become quite similar to what for many years has been investigated and developed in the field of statistical relational learning (SRL). This article first gives a brief overview of the main features of GNN and SRL approaches to learning and reasoning with graph data. It analyzes then in more detail their commonalities and differences with respect to semantics, representation, parameterization, interpretability, and flexibility. A particular focus will be on relational Bayesian networks (RBNs) as the SRL framework that is most closely related to GNNs. We show how common GNN architectures can be directly encoded as RBNs, thus enabling the direct integration of “low level” neural model components with the “high level” symbolic representation and flexible inference capabilities of SRL. 
2012 ACM Subject Classification Computing methodologies → Logical and relational learning; Computing methodologies → Neural networks\",\"PeriodicalId\":110801,\"journal\":{\"name\":\"International Research School in Artificial Intelligence in Bergen\",\"volume\":\"32 \",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1900-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Research School in Artificial Intelligence in Bergen\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.4230/OASIcs.AIB.2022.5\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Research School in Artificial Intelligence in Bergen","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.4230/OASIcs.AIB.2022.5","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Learning and Reasoning with Graph Data: Neural and Statistical-Relational Approaches (Invited Paper)
Graph neural networks (GNNs) have emerged in recent years as a powerful and popular modeling tool for graph and network data. Although much of the work on GNNs has focused on graphs with a single edge relation, they have also been adapted to multi-relational graphs, including knowledge graphs. In such multi-relational domains, the objectives and possible applications of GNNs become quite similar to those that have been investigated and developed for many years in the field of statistical relational learning (SRL). This article first gives a brief overview of the main features of GNN and SRL approaches to learning and reasoning with graph data. It then analyzes in more detail their commonalities and differences with respect to semantics, representation, parameterization, interpretability, and flexibility. A particular focus is on relational Bayesian networks (RBNs) as the SRL framework most closely related to GNNs. We show how common GNN architectures can be directly encoded as RBNs, thus enabling the direct integration of “low level” neural model components with the “high level” symbolic representation and flexible inference capabilities of SRL.

2012 ACM Subject Classification: Computing methodologies → Logical and relational learning; Computing methodologies → Neural networks
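To make concrete what the abstract means by a “low level” neural model component, the following is a minimal sketch of a single-relation, mean-aggregation GNN layer of the kind the paper discusses. It is an illustrative assumption, not the paper's own implementation or its RBN encoding; the function name gnn_layer and the NumPy setup are hypothetical.

```python
# Illustrative sketch only: one message-passing step of a simple GNN layer.
# The paper shows how such layers can be encoded as RBN combination functions;
# this standalone NumPy version is an assumption for illustration.
import numpy as np

def gnn_layer(H, adj, W_self, W_neigh):
    """Combine each node's features with the mean of its neighbours' features,
    then apply a ReLU nonlinearity.

    H        : (n_nodes, d_in)    node feature matrix
    adj      : (n_nodes, n_nodes) binary adjacency matrix (single edge relation)
    W_self   : (d_in, d_out)      weights applied to the node's own features
    W_neigh  : (d_in, d_out)      weights applied to the aggregated neighbour features
    """
    deg = adj.sum(axis=1, keepdims=True)           # node degrees
    neigh_mean = (adj @ H) / np.maximum(deg, 1.0)  # mean over neighbours
    return np.maximum(H @ W_self + neigh_mean @ W_neigh, 0.0)  # ReLU

# Tiny usage example on a 3-node path graph with 2-dimensional node features.
rng = np.random.default_rng(0)
H = rng.normal(size=(3, 2))
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
H1 = gnn_layer(H, adj, rng.normal(size=(2, 4)), rng.normal(size=(2, 4)))
print(H1.shape)  # (3, 4)
```

In the multi-relational setting discussed in the paper, one such aggregation would typically be maintained per edge relation; how exactly the per-relation messages are combined is where GNN architectures and SRL frameworks such as RBNs differ in parameterization and flexibility.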