All-in-one foundational models learning across quantum chemical levels

Yuxinxin Chen, Pavlo O. Dral
{"title":"All-in-one foundational models learning across quantum chemical levels","authors":"Yuxinxin Chen, Pavlo O. Dral","doi":"arxiv-2409.12015","DOIUrl":null,"url":null,"abstract":"Machine learning (ML) potentials typically target a single quantum chemical\n(QC) level while the ML models developed for multi-fidelity learning have not\nbeen shown to provide scalable solutions for foundational models. Here we\nintroduce the all-in-one (AIO) ANI model architecture based on multimodal\nlearning which can learn an arbitrary number of QC levels. Our all-in-one\nlearning approach offers a more general and easier-to-use alternative to\ntransfer learning. We use it to train the AIO-ANI-UIP foundational model with\nthe generalization capability comparable to semi-empirical GFN2-xTB and DFT\nwith a double-zeta basis set for organic molecules. We show that the AIO-ANI\nmodel can learn across different QC levels ranging from semi-empirical to\ndensity functional theory to coupled cluster. We also use AIO models to design\nthe foundational model {\\Delta}-AIO-ANI based on {\\Delta}-learning with\nincreased accuracy and robustness compared to AIO-ANI-UIP. The code and the\nfoundational models are available at https://github.com/dralgroup/aio-ani; they\nwill be integrated into the universal and updatable AI-enhanced QM (UAIQM)\nlibrary and made available in the MLatom package so that they can be used\nonline at the XACS cloud computing platform (see\nhttps://github.com/dralgroup/mlatom for updates).","PeriodicalId":501304,"journal":{"name":"arXiv - PHYS - Chemical Physics","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-09-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - PHYS - Chemical Physics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.12015","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Machine learning (ML) potentials typically target a single quantum chemical (QC) level, while ML models developed for multi-fidelity learning have not been shown to provide scalable solutions for foundational models. Here we introduce the all-in-one (AIO) ANI model architecture, based on multimodal learning, which can learn an arbitrary number of QC levels. Our all-in-one learning approach offers a more general and easier-to-use alternative to transfer learning. We use it to train the AIO-ANI-UIP foundational model with generalization capability comparable to semi-empirical GFN2-xTB and DFT with a double-zeta basis set for organic molecules. We show that the AIO-ANI model can learn across different QC levels ranging from semi-empirical to density functional theory to coupled cluster. We also use AIO models to design the foundational model Δ-AIO-ANI based on Δ-learning, with increased accuracy and robustness compared to AIO-ANI-UIP. The code and the foundational models are available at https://github.com/dralgroup/aio-ani; they will be integrated into the universal and updatable AI-enhanced QM (UAIQM) library and made available in the MLatom package so that they can be used online at the XACS cloud computing platform (see https://github.com/dralgroup/mlatom for updates).
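The central idea of the all-in-one architecture is that a single model serves several QC levels at once. The snippet below is a minimal conceptual sketch of that idea only, assuming a shared descriptor backbone with one energy head per QC level; the class name `MultiLevelPotential`, the `QC_LEVELS` list, and the layer sizes are illustrative and do not come from the AIO-ANI code base.

```python
import torch
import torch.nn as nn

QC_LEVELS = ["GFN2-xTB", "DFT/double-zeta", "CCSD(T)*"]  # illustrative levels only

class MultiLevelPotential(nn.Module):
    """Toy multi-level model: shared backbone, one energy head per QC level."""

    def __init__(self, n_features: int, hidden: int = 128):
        super().__init__()
        # Shared representation of the molecular descriptor (the "all-in-one" part).
        self.backbone = nn.Sequential(
            nn.Linear(n_features, hidden), nn.CELU(),
            nn.Linear(hidden, hidden), nn.CELU(),
        )
        # One scalar-energy head per QC level.
        self.heads = nn.ModuleDict(
            {level: nn.Linear(hidden, 1) for level in QC_LEVELS}
        )

    def forward(self, descriptors: torch.Tensor, level: str) -> torch.Tensor:
        # Route the shared embedding through the head of the requested level.
        return self.heads[level](self.backbone(descriptors)).squeeze(-1)

# The same structures can be evaluated at any of the trained levels.
model = MultiLevelPotential(n_features=64)
x = torch.randn(8, 64)                       # batch of 8 molecular descriptors
e_xtb = model(x, "GFN2-xTB")
e_dft = model(x, "DFT/double-zeta")
```

Training such a model would iterate over a mixed dataset in which each batch carries the label of the QC level its reference energies were computed at, so one set of shared weights absorbs all levels.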
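Δ-AIO-ANI builds on Δ-learning, in which an ML model is trained on the difference between a cheap baseline level and an expensive target level, and predictions add the learned correction back onto the baseline. Below is a minimal sketch of that general idea, assuming a kernel ridge regressor and synthetic arrays; none of the names reflect the actual Δ-AIO-ANI implementation.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

# Hypothetical training data: descriptors X plus baseline energies
# (e.g. GFN2-xTB) and target energies (e.g. coupled cluster) for the
# same structures.
X = np.random.rand(100, 32)
e_baseline = np.random.rand(100)
e_target = e_baseline + 0.01 * np.random.rand(100)

# Fit the correction (the "delta"), not the total energy.
delta_model = KernelRidge(kernel="rbf", alpha=1e-6)
delta_model.fit(X, e_target - e_baseline)

def predict_target(x_new: np.ndarray, e_baseline_new: np.ndarray) -> np.ndarray:
    """Target-level estimate = baseline energy + learned delta correction."""
    return e_baseline_new + delta_model.predict(x_new)
```

Because the correction is typically smoother and smaller in magnitude than the total energy, it can be learned from fewer high-level reference calculations, which is the usual motivation for Δ-learning.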