Extending atomic decomposition and many-body representation with a chemistry-motivated approach to machine learning potentials
Qi Yu, Ruitao Ma, Chen Qu, Riccardo Conte, Apurba Nandi, Priyanka Pandey, Paul L. Houston, Dong H. Zhang, Joel M. Bowman
Nature Computational Science (2025). Published 14 April 2025. DOI: 10.1038/s43588-025-00790-0
Abstract
Most widely used machine learning potentials for condensed-phase applications rely on many-body permutationally invariant polynomials or atom-centered neural networks. However, these approaches face challenges in achieving chemical interpretability through atomistic energy decomposition and in fully matching the computational efficiency of traditional force fields. Here we present a method that combines aspects of both approaches and balances accuracy with force-field-level speed. The method uses a monomer-centered representation in which the potential energy is decomposed into a sum of chemically meaningful monomeric energies. The structural descriptors of each monomer capture one-body and two-body effective interactions and are built from appropriate sets of permutationally invariant polynomials, which serve as inputs to feed-forward neural networks. Systematic assessments of models for the gas-phase water trimer, liquid water, a methane-water cluster and liquid carbon dioxide are performed. The improved accuracy, efficiency and flexibility of this method hold promise for constructing accurate machine learning potentials and enabling large-scale quantum and classical simulations of complex molecular systems.
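To illustrate the monomer-centered decomposition described in the abstract, the sketch below sums per-monomer neural-network energies whose inputs are simple permutationally invariant polynomial (PIP) descriptors of intramolecular (one-body) and pairwise intermolecular (two-body) Morse variables. This is a minimal sketch under stated assumptions, not the authors' implementation: the toy PIP construction, the water-like O,H,H monomer layout, the network sizes and all function names are illustrative; the actual method uses properly generated PIP bases and trained networks.

```python
# Minimal sketch of a monomer-centered ML potential (illustrative assumptions only).
# Assumes water-like monomers (O, H, H), toy low-order PIPs built from Morse
# variables, and one small feed-forward network shared by all monomers.
import torch
import torch.nn as nn


def morse(r, alpha=1.0):
    """Morse-like variable exp(-alpha * r) used to build the toy PIPs."""
    return torch.exp(-alpha * r)


def one_body_pips(mono):
    """Toy intramolecular PIPs for an O,H,H monomer, symmetric under H1 <-> H2."""
    o, h1, h2 = mono
    y1, y2, y3 = (morse(torch.norm(o - h1)),
                  morse(torch.norm(o - h2)),
                  morse(torch.norm(h1 - h2)))
    return torch.stack([y1 + y2, y1 * y2, y3])


def two_body_pips(mono_a, mono_b):
    """Toy intermolecular PIPs: symmetric sums of O-O, O-H and H-H Morse variables."""
    oa, ha1, ha2 = mono_a
    ob, hb1, hb2 = mono_b
    y_oo = morse(torch.norm(oa - ob))
    y_oh = sum(morse(torch.norm(oa - h)) for h in (hb1, hb2)) \
         + sum(morse(torch.norm(ob - h)) for h in (ha1, ha2))
    y_hh = sum(morse(torch.norm(ha - hb))
               for ha in (ha1, ha2) for hb in (hb1, hb2))
    return torch.stack([y_oo, y_oh, y_hh])


class MonomerNet(nn.Module):
    """Feed-forward net mapping 1-body + pooled 2-body descriptors to a monomer energy."""
    def __init__(self, n_in=6, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_in, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, descriptors):
        return self.net(descriptors).squeeze(-1)


def total_energy(monomers, model):
    """E_total = sum_i E_i( PIP_1body(i), sum_{j != i} PIP_2body(i, j) )."""
    energies = []
    for i, mono_i in enumerate(monomers):
        d1 = one_body_pips(mono_i)
        d2 = sum(two_body_pips(mono_i, mono_j)
                 for j, mono_j in enumerate(monomers) if j != i)
        if isinstance(d2, int):  # single monomer: no two-body contributions
            d2 = torch.zeros(3)
        energies.append(model(torch.cat([d1, d2])))
    return torch.stack(energies).sum()


# Example: three water-like monomers with random coordinates (arbitrary units).
monomers = [torch.randn(3, 3) for _ in range(3)]
model = MonomerNet()
print(float(total_energy(monomers, model)))
```

The sketch only shows the structure of the decomposition: the total energy is a sum of per-monomer network outputs, and each monomer's descriptor pools its two-body interactions with all other monomers, which is what gives the force-field-like per-monomer cost scaling discussed in the abstract.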