Temperature-based pruning for input features in Graph Neural Networks

Lapenna Michela, Faglioni Francesco, Fioresi Rita, Bruno Giovanni

The European Physical Journal Plus 140(9), published 2025-09-13
DOI: 10.1140/epjp/s13360-025-06804-0
PDF: https://link.springer.com/content/pdf/10.1140/epjp/s13360-025-06804-0.pdf
Abstract: In the present work, we employ the concept of neural network temperature to prune unimportant input features of a Graph Neural Network (GNN) architecture. In benchmark datasets for node and graph property prediction, each node comes equipped with a vector of numerous features. It is paramount to understand which information is actually necessary and which can be discarded, both for efficiency and for explainability. The temperature is linked to the gradient activity arising from loss function minimization, and it leads to pruning of weight structures associated with small gradients. The study is carried out on two GNN architectures, one for node classification and one for link prediction, using several benchmark datasets. We compare the results with similar experiments previously conducted on the filters of Convolutional Neural Networks. Although still at the proof-of-concept stage, our temperature-based pruning technique stands as a promising alternative to state-of-the-art magnitude-based pruning techniques.
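The mechanism sketched in the abstract (gradient activity accumulated during loss minimization, used to rank input features and discard the least active ones) can be illustrated with a short, self-contained PyTorch snippet. This is only a sketch under assumptions, not the authors' implementation: the toy graph, the two-layer GCN-style model, the definition of temperature as the accumulated absolute gradient of the first-layer weights averaged per input feature, and the 50% keep fraction are all illustrative choices.

# Minimal sketch (not the paper's code): per-input-feature "temperature"
# estimated from the gradient activity of the first GNN layer, then used
# to mask the coldest input features.
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Toy graph: N nodes, F_in input features, random symmetric adjacency with self-loops.
N, F_in, F_hid, n_classes = 100, 16, 32, 3
A = (torch.rand(N, N) < 0.05).float()
A = ((A + A.t() + torch.eye(N)) > 0).float()
deg = A.sum(dim=1)
A_hat = A / deg.unsqueeze(1)          # simple row-normalised propagation matrix
X = torch.randn(N, F_in)
y = torch.randint(0, n_classes, (N,))

# Two-layer GCN-style node classifier; rows of W1 correspond to input features.
W1 = torch.randn(F_in, F_hid, requires_grad=True)
W2 = torch.randn(F_hid, n_classes, requires_grad=True)
opt = torch.optim.Adam([W1, W2], lr=1e-2)

# Accumulate |dL/dW1| over training as a proxy for gradient activity.
grad_activity = torch.zeros_like(W1)
for epoch in range(200):
    opt.zero_grad()
    H = torch.relu(A_hat @ X @ W1)
    logits = A_hat @ H @ W2
    loss = F.cross_entropy(logits, y)
    loss.backward()
    grad_activity += W1.grad.abs()
    opt.step()

# "Temperature" of input feature i: mean accumulated gradient magnitude of the
# weights that read that feature (one possible reading of the abstract, an assumption).
temperature = grad_activity.mean(dim=1)

# Prune the coldest features, i.e. those whose associated weights barely moved.
keep_fraction = 0.5                   # illustrative choice, not from the paper
k = int(keep_fraction * F_in)
kept = temperature.topk(k).indices
mask = torch.zeros(F_in)
mask[kept] = 1.0
X_pruned = X * mask                   # zero out low-temperature input features
print("kept features:", sorted(kept.tolist()))

In this reading, pruning acts on the input side of the network: features whose first-layer weights show little gradient activity are treated as uninformative and masked, in contrast to magnitude-based pruning, which ranks weights by their values rather than by their gradients.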
About the journal:
The aims of this peer-reviewed online journal are to distribute and archive all relevant material required to document, assess, validate and reconstruct in detail the body of knowledge in the physical and related sciences.
The scope of EPJ Plus encompasses a broad landscape of fields and disciplines in the physical and related sciences, such as those covered by the topical EPJ journals, with the explicit addition of geophysics, astrophysics, general relativity and cosmology, mathematical and quantum physics, classical and fluid mechanics, accelerator and medical physics, as well as physics techniques applied to any other topic, including energy, environment and cultural heritage.