Purity Skeleton Dynamic Hypergraph Neural Network

DOI: 10.1016/j.neucom.2024.128539
Journal: Neurocomputing (Q1, Computer Science, Artificial Intelligence; Impact Factor 5.5)
Publication date: 2024-09-03
Article type: Journal Article
URL: https://www.sciencedirect.com/science/article/pii/S0925231224013109

Abstract: Recently, in the field of Hypergraph Neural Networks (HGNNs), the effectiveness of dynamic hypergraph construction, which aims to reduce structural noise within the hypergraph through embeddings, has been validated. However, existing dynamic construction methods overlook the loss of information contained in the hypergraph during dynamic updates, and this limitation undermines hypergraph quality. Moreover, dynamic hypergraphs are constructed from graphs: several key nodes play a crucial role in the graph, yet they are overlooked in the hypergraph. In this paper, we propose a Purity Skeleton Dynamic Hypergraph Neural Network (PS-DHGNN) to address these issues. First, we leverage the purity skeleton method to dynamically construct hypergraphs from the fused embeddings of features and topology simultaneously; this effectively reduces structural noise and prevents information loss. Second, we employ an incremental training strategy that batches nodes according to their importance, so that the key nodes forming the skeleton of the hypergraph remain highly valued. In addition, we utilize a novel loss function for learning structural information shared between the hypergraph and the graph. Extensive experiments on node classification and clustering tasks demonstrate that PS-DHGNN outperforms state-of-the-art methods. Notably, on real-world traffic flow datasets, PS-DHGNN demonstrates excellent performance, which is highly meaningful in practice.
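The abstract does not specify how the fused embeddings are turned into hyperedges, so the sketch below shows one common reading of dynamic hypergraph construction: fuse feature and topology embeddings with a convex combination, then form one hyperedge per node from its k nearest neighbours. The fusion weight `alpha` and the k-NN grouping are illustrative assumptions, not the authors' purity skeleton criterion.

```python
import numpy as np

def fuse_embeddings(feat_emb, topo_emb, alpha=0.5):
    """Convex combination of feature and topology embeddings (assumed form)."""
    return alpha * feat_emb + (1.0 - alpha) * topo_emb

def knn_hypergraph(emb, k=3):
    """Build an incidence matrix H (nodes x hyperedges): each node's
    k nearest neighbours, plus the node itself, form one hyperedge."""
    n = emb.shape[0]
    # pairwise squared Euclidean distances
    d = ((emb[:, None, :] - emb[None, :, :]) ** 2).sum(-1)
    H = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d[i])[:k + 1]  # node i is its own nearest neighbour
        H[nbrs, i] = 1.0
    return H

rng = np.random.default_rng(0)
feat = rng.normal(size=(6, 4))
topo = rng.normal(size=(6, 4))
H = knn_hypergraph(fuse_embeddings(feat, topo), k=2)
print(H.shape)        # (6, 6): one hyperedge per node
print(H.sum(axis=0))  # each hyperedge contains k + 1 = 3 nodes
```

Rebuilding H from updated embeddings at each training stage is what makes the hypergraph "dynamic"; the paper's contribution is a construction rule that preserves information across these updates.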
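The incremental training strategy is only described at a high level ("a batch training strategy based on the importance of nodes", with skeleton nodes kept highly valued). A hedged sketch of one such scheme follows: node degree stands in for the unspecified importance score, and re-including the top-`n_key` skeleton nodes in every batch is an illustrative assumption.

```python
import numpy as np

def importance_batches(adj, batch_size=3, n_key=2):
    """Yield node-index batches: every batch contains the n_key most
    important (highest-degree) skeleton nodes plus the next most
    important remaining nodes, in decreasing order of importance."""
    degree = adj.sum(axis=1)
    order = np.argsort(-degree)      # most important nodes first
    key, rest = order[:n_key], order[n_key:]
    for start in range(0, len(rest), batch_size):
        yield np.concatenate([key, rest[start:start + batch_size]])

adj = np.array([[0, 1, 1, 1, 0],
                [1, 0, 1, 0, 0],
                [1, 1, 0, 0, 0],
                [1, 0, 0, 0, 1],
                [0, 0, 0, 1, 0]])
for batch in importance_batches(adj, batch_size=2, n_key=1):
    print(batch)  # node 0 (highest degree) appears in every batch
```

Training on batches ordered this way lets the model see the skeleton nodes at every stage while the remaining nodes arrive incrementally.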
Journal introduction:
Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. Neurocomputing theory, practice, and applications are the essential topics covered.