{"title":"利用 Kolmogorov-Arnold 网络高效预测势能面和物理特性","authors":"Rui Wang, Hongyu Yu, Yang Zhong, Hongjun Xiang","doi":"arxiv-2409.03430","DOIUrl":null,"url":null,"abstract":"The application of machine learning methodologies for predicting properties\nwithin materials science has garnered significant attention. Among recent\nadvancements, Kolmogorov-Arnold Networks (KANs) have emerged as a promising\nalternative to traditional Multi-Layer Perceptrons (MLPs). This study evaluates\nthe impact of substituting MLPs with KANs within three established machine\nlearning frameworks: Allegro, Neural Equivariant Interatomic Potentials\n(NequIP), and the Edge-Based Tensor Prediction Graph Neural Network (ETGNN).\nOur results demonstrate that the integration of KANs generally yields enhanced\nprediction accuracies. Specifically, replacing MLPs with KANs in the output\nblocks leads to notable improvements in accuracy and, in certain scenarios,\nalso results in reduced training times. Furthermore, employing KANs exclusively\nin the output block facilitates faster inference and improved computational\nefficiency relative to utilizing KANs throughout the entire model. The\nselection of an optimal basis function for KANs is found to be contingent upon\nthe particular problem at hand. Our results demonstrate the strong potential of\nKANs in enhancing machine learning potentials and material property\npredictions.","PeriodicalId":501369,"journal":{"name":"arXiv - PHYS - Computational Physics","volume":"26 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Efficient prediction of potential energy surface and physical properties with Kolmogorov-Arnold Networks\",\"authors\":\"Rui Wang, Hongyu Yu, Yang Zhong, Hongjun Xiang\",\"doi\":\"arxiv-2409.03430\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The application of machine learning methodologies for predicting properties\\nwithin materials science has garnered significant attention. Among recent\\nadvancements, Kolmogorov-Arnold Networks (KANs) have emerged as a promising\\nalternative to traditional Multi-Layer Perceptrons (MLPs). This study evaluates\\nthe impact of substituting MLPs with KANs within three established machine\\nlearning frameworks: Allegro, Neural Equivariant Interatomic Potentials\\n(NequIP), and the Edge-Based Tensor Prediction Graph Neural Network (ETGNN).\\nOur results demonstrate that the integration of KANs generally yields enhanced\\nprediction accuracies. Specifically, replacing MLPs with KANs in the output\\nblocks leads to notable improvements in accuracy and, in certain scenarios,\\nalso results in reduced training times. Furthermore, employing KANs exclusively\\nin the output block facilitates faster inference and improved computational\\nefficiency relative to utilizing KANs throughout the entire model. The\\nselection of an optimal basis function for KANs is found to be contingent upon\\nthe particular problem at hand. 
Our results demonstrate the strong potential of\\nKANs in enhancing machine learning potentials and material property\\npredictions.\",\"PeriodicalId\":501369,\"journal\":{\"name\":\"arXiv - PHYS - Computational Physics\",\"volume\":\"26 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - PHYS - Computational Physics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2409.03430\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - PHYS - Computational Physics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.03430","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
The application of machine learning methodologies for predicting properties
within materials science has garnered significant attention. Among recent
advancements, Kolmogorov-Arnold Networks (KANs) have emerged as a promising
alternative to traditional Multi-Layer Perceptrons (MLPs). This study evaluates
the impact of substituting MLPs with KANs within three established machine
learning frameworks: Allegro, Neural Equivariant Interatomic Potentials
(NequIP), and the Edge-Based Tensor Prediction Graph Neural Network (ETGNN).
Our results demonstrate that the integration of KANs generally yields enhanced
prediction accuracies. Specifically, replacing MLPs with KANs in the output
blocks leads to notable improvements in accuracy and, in certain scenarios,
also results in reduced training times. Furthermore, employing KANs exclusively
in the output block facilitates faster inference and improved computational
efficiency relative to utilizing KANs throughout the entire model. The
selection of an optimal basis function for KANs is found to be contingent upon
the particular problem at hand. Our results demonstrate the strong potential of
KANs in enhancing machine learning potentials and material property
predictions.
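
To make the idea concrete, the sketch below shows a minimal KAN-style layer of the kind that could stand in for an MLP output block, written in PyTorch with a Chebyshev polynomial basis. It is an illustrative assumption only: the paper evaluates several basis functions and does not prescribe this implementation, and the `ChebyKANLayer` name, the tanh input squashing, and the chosen degree are hypothetical.

```python
import torch
import torch.nn as nn


class ChebyKANLayer(nn.Module):
    """KAN-style layer: one learnable univariate function per input-output
    edge, expanded in a Chebyshev polynomial basis (illustrative sketch)."""

    def __init__(self, in_dim: int, out_dim: int, degree: int = 4):
        super().__init__()
        self.degree = degree
        # One coefficient per (input feature, output feature, basis function).
        self.coeffs = nn.Parameter(
            torch.randn(in_dim, out_dim, degree + 1)
            / (in_dim * (degree + 1)) ** 0.5
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Map inputs into [-1, 1], the natural domain of Chebyshev polynomials.
        x = torch.tanh(x)
        # Build T_0 .. T_degree via the recurrence T_k = 2 x T_{k-1} - T_{k-2}.
        cheb = [torch.ones_like(x), x]
        for _ in range(2, self.degree + 1):
            cheb.append(2 * x * cheb[-1] - cheb[-2])
        basis = torch.stack(cheb[: self.degree + 1], dim=-1)  # (batch, in, degree+1)
        # Sum the per-edge univariate functions into each output feature.
        return torch.einsum("bik,iok->bo", basis, self.coeffs)


if __name__ == "__main__":
    # E.g. a scalar readout head on 64-dimensional node or edge features.
    head = ChebyKANLayer(in_dim=64, out_dim=1, degree=4)
    print(head(torch.randn(8, 64)).shape)  # torch.Size([8, 1])
```

In this form, every input-output edge carries its own learnable univariate function (here a degree-4 Chebyshev expansion), which is the structural difference from an MLP's fixed nonlinearity applied after a linear map; swapping the basis (e.g. B-splines or Gaussians) is the problem-dependent choice the abstract highlights.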