{"title":"在属性和图嵌入表示之间具有结构一致性的解耦图神经网络","authors":"Jinlu Wang;Jipeng Guo;Yanfeng Sun;Junbin Gao;Shaofan Wang;Yachao Yang;Baocai Yin","doi":"10.1109/TBDATA.2024.3489420","DOIUrl":null,"url":null,"abstract":"Graph neural networks (GNNs) exhibit a robust capability for representation learning on graphs with complex structures, demonstrating superior performance across various applications. Most existing GNNs utilize graph convolution operations that integrate both attribute and structural information through coupled way. And these GNNs, from an optimization perspective, seek to learn a consensus and compromised embedding representation that balances attribute and graph information, selectively exploring and retaining valid information in essence. To obtain a more comprehensive embedding representation, a novel GNN framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced. DGNN separately explores distinctive embedding representations from the attribute and graph spaces by decoupled terms. Considering that the semantic graph, derived from attribute feature space, contains different node connection information and provides enhancement for the topological graph, both topological and semantic graphs are integrated by DGNN for powerful embedding representation learning. Further, structural consistency between the attribute embedding and the graph embedding is promoted to effectively eliminate redundant information and establish soft connection. This process involves facilitating factor sharing for adjacency matrices reconstruction, which aims at exploring consensus and high-level correlations. Finally, a more powerful and comprehensive representation is achieved through the concatenation of these embeddings. Experimental results conducted on several graph benchmark datasets demonstrate its superiority in node classification tasks.","PeriodicalId":13106,"journal":{"name":"IEEE Transactions on Big Data","volume":"11 4","pages":"1813-1827"},"PeriodicalIF":5.7000,"publicationDate":"2024-10-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"DGNN: Decoupled Graph Neural Networks With Structural Consistency Between Attribute and Graph Embedding Representations\",\"authors\":\"Jinlu Wang;Jipeng Guo;Yanfeng Sun;Junbin Gao;Shaofan Wang;Yachao Yang;Baocai Yin\",\"doi\":\"10.1109/TBDATA.2024.3489420\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Graph neural networks (GNNs) exhibit a robust capability for representation learning on graphs with complex structures, demonstrating superior performance across various applications. Most existing GNNs utilize graph convolution operations that integrate both attribute and structural information through coupled way. And these GNNs, from an optimization perspective, seek to learn a consensus and compromised embedding representation that balances attribute and graph information, selectively exploring and retaining valid information in essence. To obtain a more comprehensive embedding representation, a novel GNN framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced. DGNN separately explores distinctive embedding representations from the attribute and graph spaces by decoupled terms. 
Considering that the semantic graph, derived from attribute feature space, contains different node connection information and provides enhancement for the topological graph, both topological and semantic graphs are integrated by DGNN for powerful embedding representation learning. Further, structural consistency between the attribute embedding and the graph embedding is promoted to effectively eliminate redundant information and establish soft connection. This process involves facilitating factor sharing for adjacency matrices reconstruction, which aims at exploring consensus and high-level correlations. Finally, a more powerful and comprehensive representation is achieved through the concatenation of these embeddings. Experimental results conducted on several graph benchmark datasets demonstrate its superiority in node classification tasks.\",\"PeriodicalId\":13106,\"journal\":{\"name\":\"IEEE Transactions on Big Data\",\"volume\":\"11 4\",\"pages\":\"1813-1827\"},\"PeriodicalIF\":5.7000,\"publicationDate\":\"2024-10-31\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Big Data\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10740335/\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Big Data","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10740335/","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
DGNN: Decoupled Graph Neural Networks With Structural Consistency Between Attribute and Graph Embedding Representations
Graph neural networks (GNNs) exhibit a robust capability for representation learning on graphs with complex structures, demonstrating superior performance across various applications. Most existing GNNs rely on graph convolution operations that integrate attribute and structural information in a coupled manner. From an optimization perspective, these GNNs seek a consensus, compromised embedding representation that balances attribute and graph information, and in essence only selectively explore and retain the valid information in each. To obtain a more comprehensive embedding representation, a novel GNN framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced. DGNN explores distinctive embedding representations from the attribute and graph spaces separately through decoupled terms. Since the semantic graph, derived from the attribute feature space, contains different node connection information and complements the topological graph, DGNN integrates both the topological and semantic graphs for powerful embedding representation learning. Furthermore, structural consistency between the attribute embedding and the graph embedding is promoted to effectively eliminate redundant information and establish a soft connection between the two spaces. This is realized by sharing factors in the reconstruction of the adjacency matrices, with the aim of exploring consensus and high-level correlations. Finally, a more powerful and comprehensive representation is obtained by concatenating these embeddings. Experiments on several graph benchmark datasets demonstrate the superiority of DGNN in node classification tasks.
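The following is a minimal, illustrative PyTorch sketch of the ideas described in the abstract, not the authors' implementation: a semantic kNN graph built from node features, two decoupled encoders (one on the semantic graph, one on the topological graph), a structural-consistency loss that reuses a shared factor matrix when reconstructing the adjacency matrix from each embedding, and a final concatenation for node classification. All module names, the kNN construction, and the exact loss form are assumptions made for illustration.

# Illustrative sketch only; the real DGNN objective and architecture may differ.
import torch
import torch.nn as nn
import torch.nn.functional as F


def knn_semantic_graph(x: torch.Tensor, k: int = 10) -> torch.Tensor:
    """Build a symmetric, row-normalized kNN graph from cosine similarity of features."""
    sim = F.normalize(x, dim=1) @ F.normalize(x, dim=1).t()
    topk = sim.topk(k + 1, dim=1).indices                  # +1 because each node also picks itself
    adj = torch.zeros_like(sim).scatter_(1, topk, 1.0)
    adj = ((adj + adj.t()) > 0).float()                    # symmetrize
    deg_inv = adj.sum(1).clamp(min=1.0).pow(-1.0)
    return deg_inv.unsqueeze(1) * adj                      # row-normalize


class GCNLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        return F.relu(adj @ self.lin(x))                   # one simple propagation step


class DGNNSketch(nn.Module):
    """Decoupled encoders plus shared-factor adjacency reconstruction (assumed form)."""
    def __init__(self, in_dim: int, hid_dim: int, n_classes: int, rank: int = 16):
        super().__init__()
        self.attr_enc = GCNLayer(in_dim, hid_dim)          # attribute branch on the semantic graph
        self.graph_enc = GCNLayer(in_dim, hid_dim)         # graph branch on the topological graph
        self.shared_factor = nn.Parameter(torch.randn(hid_dim, rank) * 0.01)
        self.classifier = nn.Linear(2 * hid_dim, n_classes)

    def forward(self, x, adj_topo, adj_sem):
        z_attr = self.attr_enc(x, adj_sem)
        z_graph = self.graph_enc(x, adj_topo)
        logits = self.classifier(torch.cat([z_attr, z_graph], dim=1))
        return logits, z_attr, z_graph

    def consistency_loss(self, z_attr, z_graph, adj_topo):
        # Reconstruct the adjacency matrix from both embeddings through the SAME
        # factor matrix, softly tying the attribute and graph spaces together.
        rec_a = torch.sigmoid(z_attr @ self.shared_factor @ self.shared_factor.t() @ z_attr.t())
        rec_g = torch.sigmoid(z_graph @ self.shared_factor @ self.shared_factor.t() @ z_graph.t())
        target = (adj_topo > 0).float()
        return F.binary_cross_entropy(rec_a, target) + F.binary_cross_entropy(rec_g, target)


if __name__ == "__main__":
    n, d, c = 100, 32, 4
    x = torch.randn(n, d)
    adj_topo = knn_semantic_graph(torch.randn(n, d), k=5)  # random stand-in for a real topological graph
    adj_sem = knn_semantic_graph(x, k=10)
    labels = torch.randint(0, c, (n,))

    model = DGNNSketch(d, 64, c)
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    for _ in range(5):
        logits, z_a, z_g = model(x, adj_topo, adj_sem)
        loss = F.cross_entropy(logits, labels) + 0.1 * model.consistency_loss(z_a, z_g, adj_topo)
        opt.zero_grad(); loss.backward(); opt.step()
    print("sketch loss:", float(loss))

The 0.1 weight on the consistency term and the low-rank shared factor are arbitrary choices here; they merely illustrate how a shared reconstruction factor can act as the "soft connection" between the two embedding spaces mentioned in the abstract.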
Journal Introduction:
The IEEE Transactions on Big Data publishes peer-reviewed articles focusing on big data. These articles present innovative research ideas and application results across disciplines, including novel theories, algorithms, and applications. Research areas cover a wide range, such as big data analytics, visualization, curation, management, semantics, infrastructure, standards, performance analysis, intelligence extraction, scientific discovery, security, privacy, and legal issues specific to big data. The journal also prioritizes applications of big data in fields generating massive datasets.