Pengda Wang;Mingjie Lu;Weiqing Yan;Dong Yang;Zhaowei Liu
{"title":"基于遗传编程自动搜索超参数的图结构学习","authors":"Pengda Wang;Mingjie Lu;Weiqing Yan;Dong Yang;Zhaowei Liu","doi":"10.1109/TETCI.2024.3386833","DOIUrl":null,"url":null,"abstract":"Graph neural networks (GNNs) rely heavily on graph structures and artificial hyperparameters, which may increase computation and affect performance. Most GNNs use original graphs, but the original graph data has problems with noise and incomplete information, which easily leads to poor GNN performance. For this kind of problem, recent graph structure learning methods consider how to generate graph structures containing label information. The settings of some hyperparameters will also affect the expression of the GNN model. This paper proposes a genetic graph structure learning method (Genetic-GSL). Different from the existing graph structure learning methods, this paper not only optimizes the graph structure but also the hyperparameters. Specifically, different graph structures and different hyperparameters are used as parents; the offspring are cross-mutated through the parents; and then excellent offspring are selected through evaluation to achieve dynamic fitting of the graph structure and hyperparameters. Experiments show that, compared with other methods, Genetic-GSL basically improves the performance of node classification tasks by 1.2%. 
With the increase in evolution algebra, Genetic-GSL has good performance on node classification tasks and resistance to adversarial attacks.","PeriodicalId":13135,"journal":{"name":"IEEE Transactions on Emerging Topics in Computational Intelligence","volume":"8 6","pages":"4155-4164"},"PeriodicalIF":5.3000,"publicationDate":"2024-04-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Graph Structure Learning With Automatic Search of Hyperparameters Based on Genetic Programming\",\"authors\":\"Pengda Wang;Mingjie Lu;Weiqing Yan;Dong Yang;Zhaowei Liu\",\"doi\":\"10.1109/TETCI.2024.3386833\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Graph neural networks (GNNs) rely heavily on graph structures and artificial hyperparameters, which may increase computation and affect performance. Most GNNs use original graphs, but the original graph data has problems with noise and incomplete information, which easily leads to poor GNN performance. For this kind of problem, recent graph structure learning methods consider how to generate graph structures containing label information. The settings of some hyperparameters will also affect the expression of the GNN model. This paper proposes a genetic graph structure learning method (Genetic-GSL). Different from the existing graph structure learning methods, this paper not only optimizes the graph structure but also the hyperparameters. Specifically, different graph structures and different hyperparameters are used as parents; the offspring are cross-mutated through the parents; and then excellent offspring are selected through evaluation to achieve dynamic fitting of the graph structure and hyperparameters. Experiments show that, compared with other methods, Genetic-GSL basically improves the performance of node classification tasks by 1.2%. 
With the increase in evolution algebra, Genetic-GSL has good performance on node classification tasks and resistance to adversarial attacks.\",\"PeriodicalId\":13135,\"journal\":{\"name\":\"IEEE Transactions on Emerging Topics in Computational Intelligence\",\"volume\":\"8 6\",\"pages\":\"4155-4164\"},\"PeriodicalIF\":5.3000,\"publicationDate\":\"2024-04-17\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Emerging Topics in Computational Intelligence\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10504544/\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Emerging Topics in Computational Intelligence","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10504544/","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Graph Structure Learning With Automatic Search of Hyperparameters Based on Genetic Programming
Graph neural networks (GNNs) rely heavily on the graph structure and on manually set hyperparameters, which can increase computation and hurt performance. Most GNNs operate on the original graph, but original graph data often suffers from noise and incomplete information, which easily degrades GNN performance. To address this, recent graph structure learning methods consider how to generate graph structures that incorporate label information. The settings of certain hyperparameters also affect the expressiveness of the GNN model. This paper proposes a genetic graph structure learning method (Genetic-GSL). Unlike existing graph structure learning methods, it optimizes not only the graph structure but also the hyperparameters. Specifically, different graph structures paired with different hyperparameters serve as parents; offspring are produced from the parents through crossover and mutation; and superior offspring are then selected by evaluation, achieving a dynamic fit between the graph structure and the hyperparameters. Experiments show that, compared with other methods, Genetic-GSL improves node classification performance by about 1.2% on average. As the number of evolutionary generations increases, Genetic-GSL maintains good performance on node classification tasks and resistance to adversarial attacks.
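The evolutionary loop the abstract describes (individuals that pair a graph structure with hyperparameters; crossover and mutation to produce offspring; selection by evaluation) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the encoding, operators, and the toy surrogate fitness function are all assumptions. In the paper, fitness would correspond to the validation performance of a GNN trained on the candidate graph with the candidate hyperparameters.

```python
import random

def random_individual(n_nodes, rng):
    """An individual pairs a graph structure (edge set) with hyperparameters."""
    edges = {(i, j) for i in range(n_nodes) for j in range(i + 1, n_nodes)
             if rng.random() < 0.3}
    hparams = {"lr": 10 ** rng.uniform(-4, -1), "dropout": rng.uniform(0.0, 0.6)}
    return edges, hparams

def crossover(p1, p2, rng):
    """Mix edges and hyperparameters from two parents."""
    e1, h1 = p1
    e2, h2 = p2
    child_edges = {e for e in e1 | e2 if rng.random() < 0.5} or set(e1)
    child_h = {k: (h1[k] if rng.random() < 0.5 else h2[k]) for k in h1}
    return child_edges, child_h

def mutate(ind, n_nodes, rng, p=0.05):
    """Randomly flip edges and perturb the learning rate."""
    edges, h = ind
    edges = set(edges)
    for i in range(n_nodes):
        for j in range(i + 1, n_nodes):
            if rng.random() < p:
                edges ^= {(i, j)}  # toggle edge (i, j)
    h = dict(h)
    h["lr"] *= 10 ** rng.uniform(-0.2, 0.2)
    return edges, h

def evolve(fitness, n_nodes=8, pop_size=20, generations=15, seed=0):
    """Select superior offspring by evaluation, generation after generation."""
    rng = random.Random(seed)
    pop = [random_individual(n_nodes, rng) for _ in range(pop_size)]
    for _ in range(generations):
        elite = sorted(pop, key=fitness, reverse=True)[: pop_size // 2]
        children = [mutate(crossover(rng.choice(elite), rng.choice(elite), rng),
                           n_nodes, rng)
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=fitness)

# Toy surrogate fitness: rewards ~10 edges and dropout near 0.3. A real run
# would instead train and validate a GNN per candidate.
def toy_fitness(ind):
    edges, h = ind
    return -abs(len(edges) - 10) - abs(h["dropout"] - 0.3)

best_edges, best_hparams = evolve(toy_fitness)
```

The key design point the abstract emphasizes is that the search space is the *joint* space of structures and hyperparameters, so a child can inherit its graph from one parent and its learning rate from the other.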
Journal introduction:
The IEEE Transactions on Emerging Topics in Computational Intelligence (TETCI) publishes original articles on emerging aspects of computational intelligence, including theory, applications, and surveys.
TETCI is an electronic-only publication and publishes six issues per year.
Authors are encouraged to submit manuscripts on any emerging topic in computational intelligence, especially nature-inspired computing topics not covered by other IEEE Computational Intelligence Society journals. A few illustrative examples are glial cell networks, computational neuroscience, brain-computer interfaces, ambient intelligence, non-fuzzy computing with words, artificial life, cultural learning, artificial endocrine networks, social reasoning, artificial hormone networks, and computational intelligence for IoT and Smart-X technologies.