All-in-one foundational models learning across quantum chemical levels

Pavlo O. Dral, Yuxinxin Chen
{"title":"All-in-one foundational models learning across quantum chemical levels","authors":"Pavlo O., Dral, Yuxinxin, Chen","doi":"10.26434/chemrxiv-2024-ng3ws","DOIUrl":null,"url":null,"abstract":"Machine learning (ML) potentials typically target a single quantum chemical (QC) level while the ML models developed for multi-fidelity learning have not been shown to provide scalable solutions for foundational models. Here we introduce the all-in-one (AIO) ANI model architecture based on multimodal learning which can learn an arbitrary number of QC levels. Our all-in-one learning approach offers a more general and easier-to-use alternative to transfer learning. We use it to train the AIO-ANI-UIP foundational model with the generalization capability comparable to semi-empirical GFN2-xTB and DFT with a double-zeta basis set for organic molecules. We show that the AIO-ANI model can learn across different QC levels ranging from semi-empirical to density functional theory to coupled cluster. We also use AIO models to design the foundational model Δ-AIO-ANI based on Δ-learning with increased accuracy and robustness compared to AIO-ANI-UIP. The code and the foundational models are available at https://github.com/dralgroup/aio-ani; they will be integrated into the universal and updatable AI-enhanced QM (UAIQM) library and made available in the MLatom package so that they can be used online at the XACS cloud computing platform (see https://github.com/dralgroup/mlatom for updates).","PeriodicalId":9813,"journal":{"name":"ChemRxiv","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-09-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ChemRxiv","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.26434/chemrxiv-2024-ng3ws","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Machine learning (ML) potentials typically target a single quantum chemical (QC) level, while the ML models developed for multi-fidelity learning have not been shown to provide scalable solutions for foundational models. Here we introduce the all-in-one (AIO) ANI model architecture, based on multimodal learning, which can learn an arbitrary number of QC levels. Our all-in-one learning approach offers a more general and easier-to-use alternative to transfer learning. We use it to train the AIO-ANI-UIP foundational model, whose generalization capability for organic molecules is comparable to semi-empirical GFN2-xTB and to DFT with a double-zeta basis set. We show that the AIO-ANI model can learn across different QC levels ranging from semi-empirical methods to density functional theory to coupled cluster. We also use AIO models to design the foundational model Δ-AIO-ANI, based on Δ-learning, with increased accuracy and robustness compared to AIO-ANI-UIP. The code and the foundational models are available at https://github.com/dralgroup/aio-ani; they will be integrated into the universal and updatable AI-enhanced QM (UAIQM) library and made available in the MLatom package so that they can be used online on the XACS cloud computing platform (see https://github.com/dralgroup/mlatom for updates).
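To make the "all-in-one" idea concrete, the sketch below shows one common way to let a single network learn several QC levels at once: a shared trunk maps atomic-environment descriptors to a latent representation, and a separate output head is attached for each QC level. This is only a minimal illustration of the general multi-head technique, not the authors' AIO-ANI implementation; the class name, layer sizes, activation, and the assumption of precomputed descriptors are all placeholders.

```python
# Minimal sketch of a multi-head ("all-in-one") model: one shared trunk,
# one output head per QC level, so data from every level trains the trunk.
# Not the AIO-ANI code; names and sizes are illustrative only.
import torch
import torch.nn as nn

class AllInOneNet(nn.Module):
    def __init__(self, n_features: int, qc_levels: list, hidden: int = 128):
        super().__init__()
        # Shared trunk over precomputed atomic-environment descriptors.
        self.trunk = nn.Sequential(
            nn.Linear(n_features, hidden), nn.CELU(),
            nn.Linear(hidden, hidden), nn.CELU(),
        )
        # One linear output head per QC level (e.g. semi-empirical, DFT, CC).
        self.heads = nn.ModuleDict({level: nn.Linear(hidden, 1) for level in qc_levels})

    def forward(self, descriptors: torch.Tensor, level: str) -> torch.Tensor:
        # Route each example to the head of its own QC level.
        return self.heads[level](self.trunk(descriptors))

# Toy usage: random descriptors for 5 atoms, molecular energy as a sum of atomic terms.
model = AllInOneNet(n_features=64, qc_levels=["gfn2-xtb", "dft", "ccsdt"])
x = torch.randn(5, 64)
e_dft = model(x, "dft").sum()
e_cc = model(x, "ccsdt").sum()
```

In the same spirit, the Δ-AIO-ANI variant described in the abstract is based on Δ-learning, where the model targets a correction ΔE = E_target − E_baseline on top of a cheap baseline method (such as GFN2-xTB) rather than the absolute energy, which typically improves accuracy and robustness because the baseline already captures most of the physics.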