Extensive Attention Mechanisms in Graph Neural Networks for Materials Discovery

Guojing Cong, Talia Ben-Naim, Victor Fung, Anshul Gupta, R. Neumann, Mathias Steiner
{"title":"图神经网络在材料发现中的广泛注意机制","authors":"Guojing Cong, Talia Ben-Naim, Victor Fung, Anshul Gupta, R. Neumann, Mathias Steiner","doi":"10.1109/ICDMW58026.2022.00090","DOIUrl":null,"url":null,"abstract":"We present our research where attention mechanism is extensively applied to various aspects of graph neural net- works for predicting materials properties. As a result, surrogate models can not only replace costly simulations for materials screening but also formulate hypotheses and insights to guide further design exploration. We predict formation energy of the Materials Project and gas adsorption of crystalline adsorbents, and demonstrate the superior performance of our graph neural networks. Moreover, attention reveals important substructures that the machine learning models deem important for a material to achieve desired target properties. Our model is based solely on standard structural input files containing atomistic descriptions of the adsorbent material candidates. We construct novel methodological extensions to match the prediction accuracy of state-of-the-art models some of which were built with hundreds of features at much higher computational cost. We show that sophisticated neural networks can obviate the need for elaborate feature engineering. Our approach can be more broadly applied to optimize gas capture processes at industrial scale.","PeriodicalId":146687,"journal":{"name":"2022 IEEE International Conference on Data Mining Workshops (ICDMW)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2022-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Extensive Attention Mechanisms in Graph Neural Networks for Materials Discovery\",\"authors\":\"Guojing Cong, Talia Ben-Naim, Victor Fung, Anshul Gupta, R. Neumann, Mathias Steiner\",\"doi\":\"10.1109/ICDMW58026.2022.00090\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We present our research where attention mechanism is extensively applied to various aspects of graph neural net- works for predicting materials properties. As a result, surrogate models can not only replace costly simulations for materials screening but also formulate hypotheses and insights to guide further design exploration. We predict formation energy of the Materials Project and gas adsorption of crystalline adsorbents, and demonstrate the superior performance of our graph neural networks. Moreover, attention reveals important substructures that the machine learning models deem important for a material to achieve desired target properties. Our model is based solely on standard structural input files containing atomistic descriptions of the adsorbent material candidates. We construct novel methodological extensions to match the prediction accuracy of state-of-the-art models some of which were built with hundreds of features at much higher computational cost. We show that sophisticated neural networks can obviate the need for elaborate feature engineering. 
Our approach can be more broadly applied to optimize gas capture processes at industrial scale.\",\"PeriodicalId\":146687,\"journal\":{\"name\":\"2022 IEEE International Conference on Data Mining Workshops (ICDMW)\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-11-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 IEEE International Conference on Data Mining Workshops (ICDMW)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICDMW58026.2022.00090\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE International Conference on Data Mining Workshops (ICDMW)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICDMW58026.2022.00090","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1

Abstract

We present our research in which attention mechanisms are extensively applied to various aspects of graph neural networks for predicting materials properties. As a result, surrogate models can not only replace costly simulations for materials screening but also formulate hypotheses and insights to guide further design exploration. We predict formation energies of materials from the Materials Project and gas adsorption in crystalline adsorbents, and demonstrate the superior performance of our graph neural networks. Moreover, attention reveals the substructures that the machine learning models deem important for a material to achieve desired target properties. Our model is based solely on standard structural input files containing atomistic descriptions of the adsorbent material candidates. We construct novel methodological extensions that match the prediction accuracy of state-of-the-art models, some of which were built with hundreds of features at much higher computational cost. We show that sophisticated neural networks can obviate the need for elaborate feature engineering. Our approach can be applied more broadly to optimize gas capture processes at industrial scale.
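For orientation only, the sketch below shows one common way to wire attention into a GNN for crystal-property regression, using PyTorch Geometric's GATConv. It is a minimal illustration under stated assumptions, not the authors' architecture: the atomic-number embedding, layer sizes, mean-pool readout, and the toy three-atom graph are all illustrative choices.

```python
# Minimal sketch (not the paper's model): attention-based message passing
# over a crystal graph, regressing one scalar property per structure.
import torch
import torch.nn as nn
from torch_geometric.nn import GATConv, global_mean_pool

class AttentiveCrystalGNN(nn.Module):
    def __init__(self, num_elements: int = 100, hidden: int = 64, heads: int = 4):
        super().__init__()
        # Embed each atom by atomic number; a production model would add
        # geometric features (distances, angles) from the structure file.
        self.embed = nn.Embedding(num_elements, hidden)
        # Multi-head attention assigns a learned weight to every edge, which
        # is also what makes the model's focus on substructures inspectable.
        self.gat1 = GATConv(hidden, hidden, heads=heads, concat=False)
        self.gat2 = GATConv(hidden, hidden, heads=heads, concat=False)
        self.readout = nn.Linear(hidden, 1)

    def forward(self, z, edge_index, batch):
        h = self.embed(z)
        h = torch.relu(self.gat1(h, edge_index))
        h = torch.relu(self.gat2(h, edge_index))
        # Pool per-atom states into one vector per crystal, then regress.
        return self.readout(global_mean_pool(h, batch)).squeeze(-1)

# Toy usage: a three-atom fragment with bidirectional neighbor edges.
z = torch.tensor([8, 13, 13])                      # atomic numbers (O, Al, Al)
edge_index = torch.tensor([[0, 1, 0, 2, 1, 2],
                           [1, 0, 2, 0, 2, 1]])
batch = torch.zeros(3, dtype=torch.long)           # all atoms belong to graph 0
model = AttentiveCrystalGNN()
print(model(z, edge_index, batch))                 # predicted scalar property

# GATConv can also return its attention coefficients, the quantity one
# would inspect to see which neighborhoods the model weighs most heavily:
h = model.embed(z)
_, (ei, alpha) = model.gat1(h, edge_index, return_attention_weights=True)
print(alpha.shape)  # one weight per head for each edge (incl. self-loops)
```

The last two lines illustrate the interpretability angle the abstract emphasizes: attention coefficients are per-edge quantities, so aggregating them over a structure is one way to flag the substructures that drive a prediction.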