Graph Quaternion-Valued Attention Networks for Node Classification
Jingchao Wang, Tongxu Lin, Guoheng Huang
Proceedings of the 2023 4th International Conference on Computing, Networks and Internet of Things
Published: 2023-05-26 · DOI: 10.1145/3603781.3603900
Citations: 0
Abstract
Node classification is a prominent graph-based task, and various graph neural network (GNN) models have been applied to solve it. In this paper, we introduce a novel GNN architecture for node classification called Graph Quaternion-Valued Attention Networks (GQAT), which enhances the original graph attention networks by replacing the vector multiplication in self-attention with quaternion vector multiplication. A primary advantage of GQAT is a significant reduction in model parameters: a quaternion operation needs only a quarter of the free parameters of the corresponding real-valued weight matrix, yielding a more lightweight model. Moreover, GQAT excels at capturing intricate relationships between nodes, owing to the expressive structure of quaternion operations. We conduct extensive experiments on Cora, Citeseer, and PubMed for node classification. The results demonstrate that GQAT outperforms conventional graph attention networks in node classification accuracy while requiring fewer parameters.
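The quaternion multiplication the abstract refers to is the Hamilton product, whose weight-sharing structure is the source of the claimed 1/4 parameter reduction. The sketch below is not the authors' implementation; it is a minimal illustration of the Hamilton product itself and of why a quaternion weight acting on a 4-dimensional feature uses 4 parameters where a dense real 4×4 matrix uses 16:

```python
import numpy as np

def hamilton_product(q, p):
    """Hamilton product of two quaternions, each given as (r, x, y, z)."""
    r1, x1, y1, z1 = q
    r2, x2, y2, z2 = p
    return np.array([
        r1 * r2 - x1 * x2 - y1 * y2 - z1 * z2,  # real part
        r1 * x2 + x1 * r2 + y1 * z2 - z1 * y2,  # i component
        r1 * y2 - x1 * z2 + y1 * r2 + z1 * x2,  # j component
        r1 * z2 + x1 * y2 - y1 * x2 + z1 * r2,  # k component
    ])

# A quaternion weight w = (r, x, y, z) transforms a 4-d feature with only
# 4 free parameters; the equivalent real-valued map is a 4x4 matrix whose
# 16 entries are all tied to those same 4 values — the parameter saving
# that makes quaternion-valued layers lighter than real-valued ones.
w = np.array([0.5, -0.1, 0.3, 0.2])   # 4 parameters
h = np.array([1.0, 2.0, -1.0, 0.5])   # one 4-d node feature
out = hamilton_product(w, h)
```

In a quaternion-valued attention layer, each group of four feature channels is treated as one quaternion and transformed this way, so a layer's parameter count shrinks by roughly a factor of four relative to a standard real-valued projection.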