Neural network Gaussian processes as efficient models of potential energy surfaces for polyatomic molecules

Impact factor 6.3 · CAS Zone 2 (Physics and Astrophysics) · JCR Q1 (Computer Science, Artificial Intelligence)
Jun Dai, Roman V Krems
{"title":"Neural network Gaussian processes as efficient models of potential energy surfaces for polyatomic molecules","authors":"Jun Dai, Roman V Krems","doi":"10.1088/2632-2153/ad0652","DOIUrl":null,"url":null,"abstract":"Abstract Kernel models of potential energy surfaces (PESs) for polyatomic molecules are often restricted by a specific choice of the kernel function. This can be avoided by optimizing the complexity of the kernel function. For regression problems with very expensive data, the functional form of the model kernels can be optimized in the Gaussian process (GP) setting through compositional function search guided by the Bayesian information criterion. However, the compositional kernel search is computationally demanding and relies on greedy strategies, which may yield sub-optimal kernels. An alternative strategy of increasing complexity of GP kernels treats a GP as a Bayesian neural network (NN) with a variable number of hidden layers, which yields NNGP models. Here, we present a direct comparison of GP models with composite kernels and NNGP models for applications aiming at the construction of global PES for polyatomic molecules. We show that NNGP models of PES can be trained much more efficiently and yield better generalization accuracy without relying on any specific form of the kernel function. We illustrate that NNGP models trained by distributions of energy points at low energies produce accurate predictions of PES at high energies. We also illustrate that NNGP models can extrapolate in the input variable space by building the free energy surface of the Heisenberg model trained in the paramagnetic phase and validated in the ferromagnetic phase. By construction, composite kernels yield more accurate models than kernels with a fixed functional form. Therefore, by illustrating that NNGP models outperform GP models with composite kernels, our work suggests that NNGP models should be a preferred choice of kernel models for PES.","PeriodicalId":33757,"journal":{"name":"Machine Learning Science and Technology","volume":"71 26","pages":"0"},"PeriodicalIF":6.3000,"publicationDate":"2023-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Machine Learning Science and Technology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1088/2632-2153/ad0652","RegionNum":2,"RegionCategory":"物理与天体物理","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
引用次数: 0

Abstract

Kernel models of potential energy surfaces (PESs) for polyatomic molecules are often restricted by a specific choice of the kernel function. This can be avoided by optimizing the complexity of the kernel function. For regression problems with very expensive data, the functional form of the model kernels can be optimized in the Gaussian process (GP) setting through compositional function search guided by the Bayesian information criterion. However, the compositional kernel search is computationally demanding and relies on greedy strategies, which may yield sub-optimal kernels. An alternative strategy of increasing complexity of GP kernels treats a GP as a Bayesian neural network (NN) with a variable number of hidden layers, which yields NNGP models. Here, we present a direct comparison of GP models with composite kernels and NNGP models for applications aiming at the construction of global PES for polyatomic molecules. We show that NNGP models of PES can be trained much more efficiently and yield better generalization accuracy without relying on any specific form of the kernel function. We illustrate that NNGP models trained by distributions of energy points at low energies produce accurate predictions of PES at high energies. We also illustrate that NNGP models can extrapolate in the input variable space by building the free energy surface of the Heisenberg model trained in the paramagnetic phase and validated in the ferromagnetic phase. By construction, composite kernels yield more accurate models than kernels with a fixed functional form. Therefore, by illustrating that NNGP models outperform GP models with composite kernels, our work suggests that NNGP models should be a preferred choice of kernel models for PES.
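To make the two strategies compared in the abstract concrete, the sketch below contrasts (i) a GP with a hand-built composite kernel, scored with a BIC-style criterion of the kind used to guide compositional kernel search, and (ii) an NNGP kernel obtained from an infinitely wide Bayesian neural network whose depth controls model complexity. This is an illustrative toy, not the authors' code: it assumes the scikit-learn and neural-tangents libraries, a random three-dimensional descriptor standing in for molecular coordinates, an arbitrary analytic "energy" function, and an arbitrary two-hidden-layer Erf architecture.

```python
# Toy comparison of a composite-kernel GP and an NNGP model on a synthetic
# 3D "potential energy surface". All settings below are illustrative.
import numpy as np
import jax.numpy as jnp
import neural_tangents as nt
from neural_tangents import stax
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, Matern, RationalQuadratic

rng = np.random.default_rng(0)
x_train = rng.uniform(-1.0, 1.0, size=(200, 3))   # stand-in molecular descriptors
x_test = rng.uniform(-1.0, 1.0, size=(50, 3))
energy = lambda x: np.sin(3.0 * x[:, 0]) + x[:, 1] ** 2 + 0.5 * x[:, 2]
y_train = energy(x_train)[:, None]                # stand-in "energies"

# (1) GP with a hand-built composite kernel. A compositional kernel search
#     would iterate over sums/products of base kernels, scoring each
#     candidate with the Bayesian information criterion (BIC).
composite_kernel = RBF(length_scale=0.5) * RationalQuadratic() + Matern(nu=2.5)
gp = GaussianProcessRegressor(kernel=composite_kernel, alpha=1e-6).fit(x_train, y_train)
y_gp = gp.predict(x_test).ravel()

# BIC-style score: log marginal likelihood penalised by the number of
# kernel hyperparameters (higher is better).
bic = gp.log_marginal_likelihood_value_ - 0.5 * gp.kernel_.n_dims * np.log(len(x_train))

# (2) NNGP: the exact GP kernel of an infinitely wide Bayesian NN; the number
#     of layers, not the kernel's functional form, sets the model complexity.
_, _, kernel_fn = stax.serial(
    stax.Dense(512), stax.Erf(),
    stax.Dense(512), stax.Erf(),
    stax.Dense(1),
)
predict_fn = nt.predict.gradient_descent_mse_ensemble(
    kernel_fn, jnp.asarray(x_train), jnp.asarray(y_train), diag_reg=1e-6)
y_nngp = np.asarray(predict_fn(x_test=jnp.asarray(x_test), get="nngp")).ravel()

rmse = lambda y_pred: np.sqrt(np.mean((y_pred - energy(x_test)) ** 2))
print(f"composite-kernel GP: BIC = {bic:.1f}, test RMSE = {rmse(y_gp):.3f}")
print(f"NNGP (2 hidden layers): test RMSE = {rmse(y_nngp):.3f}")
```

In this setting, the depth of the NNGP (the number of stacked Dense/activation pairs above) plays the role that kernel composition plays for ordinary GP models, so model complexity can be increased without searching over combinations of kernel functions.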
Source journal

Machine Learning: Science and Technology (Computer Science, Artificial Intelligence)
CiteScore: 9.10
Self-citation rate: 4.40%
Articles per year: 86
Review time: 5 weeks

Journal description: Machine Learning Science and Technology is a multidisciplinary open access journal that bridges the application of machine learning across the sciences with advances in machine learning methods and theory as motivated by physical insights. Specifically, articles must fall into one of the following categories: advance the state of machine learning-driven applications in the sciences or make conceptual, methodological or theoretical advances in machine learning with applications to, inspiration from, or motivated by scientific problems.