Establishing Deep InfoMax as an effective self-supervised learning methodology in materials informatics†

IF 6.2 · Q1 · Chemistry, Multidisciplinary
Michael Moran, Michael W. Gaultois, Vladimir V. Gusev, Dmytro Antypov and Matthew J. Rosseinsky
{"title":"Establishing Deep InfoMax as an effective self-supervised learning methodology in materials informatics†","authors":"Michael Moran, Michael W. Gaultois, Vladimir V. Gusev, Dmytro Antypov and Matthew J. Rosseinsky","doi":"10.1039/D4DD00202D","DOIUrl":null,"url":null,"abstract":"<p >The scarcity of property labels remains a key challenge in materials informatics, whereas materials data without property labels are abundant in comparison. By pre-training supervised property prediction models on self-supervised tasks that depend only on the “intrinsic information” available in any Crystallographic Information File (CIF), there is potential to leverage the large amount of crystal data without property labels to improve property prediction results on small datasets. We apply Deep InfoMax as a self-supervised machine learning framework for materials informatics that explicitly maximises the mutual information between a point set (or graph) representation of a crystal and a vector representation suitable for downstream learning. This allows the pre-training of supervised models on large materials datasets without the need for property labels and without requiring the model to reconstruct the crystal from a representation vector. We investigate the benefits of Deep InfoMax pre-training implemented on the Site-Net architecture to improve the performance of downstream property prediction models with small amounts (&lt;10<small><sup>3</sup></small>) of data, a situation relevant to experimentally measured materials property databases. Using a property label masking methodology, where we perform self-supervised learning on larger supervised datasets and then train supervised models on a small subset of the labels, we isolate Deep InfoMax pre-training from the effects of distributional shift. We demonstrate performance improvements in the contexts of representation learning and transfer learning on the tasks of band gap and formation energy prediction. Having established the effectiveness of Deep InfoMax pre-training in a controlled environment, our findings provide a foundation for extending the approach to address practical challenges in materials informatics.</p>","PeriodicalId":72816,"journal":{"name":"Digital discovery","volume":" 3","pages":" 790-811"},"PeriodicalIF":6.2000,"publicationDate":"2025-01-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://pubs.rsc.org/en/content/articlepdf/2025/dd/d4dd00202d?page=search","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Digital discovery","FirstCategoryId":"1085","ListUrlMain":"https://pubs.rsc.org/en/content/articlelanding/2025/dd/d4dd00202d","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"CHEMISTRY, MULTIDISCIPLINARY","Score":null,"Total":0}
Cited by: 0

Abstract

The scarcity of property labels remains a key challenge in materials informatics, whereas materials data without property labels are abundant in comparison. By pre-training supervised property prediction models on self-supervised tasks that depend only on the “intrinsic information” available in any Crystallographic Information File (CIF), there is potential to leverage the large amount of crystal data without property labels to improve property prediction results on small datasets. We apply Deep InfoMax as a self-supervised machine learning framework for materials informatics that explicitly maximises the mutual information between a point set (or graph) representation of a crystal and a vector representation suitable for downstream learning. This allows the pre-training of supervised models on large materials datasets without the need for property labels and without requiring the model to reconstruct the crystal from a representation vector. We investigate the benefits of Deep InfoMax pre-training implemented on the Site-Net architecture to improve the performance of downstream property prediction models with small amounts (<10³) of data, a situation relevant to experimentally measured materials property databases. Using a property label masking methodology, where we perform self-supervised learning on larger supervised datasets and then train supervised models on a small subset of the labels, we isolate Deep InfoMax pre-training from the effects of distributional shift. We demonstrate performance improvements in the contexts of representation learning and transfer learning on the tasks of band gap and formation energy prediction. Having established the effectiveness of Deep InfoMax pre-training in a controlled environment, our findings provide a foundation for extending the approach to address practical challenges in materials informatics.
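The core idea is a tractable lower bound on the mutual information between local (per-site) features of a crystal and its pooled global representation vector. The PyTorch sketch below is a rough illustration only, not the authors' Site-Net implementation: the class name, projection dimensions, and the choice of an InfoNCE-style contrastive estimator (one common choice in Deep InfoMax variants) are all assumptions. Matched (site, crystal) pairs within a batch are treated as positives and all mismatched pairs as negatives.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class InfoNCEDeepInfoMax(nn.Module):
    """InfoNCE-style lower bound on the mutual information between
    per-site local features and a pooled global crystal vector."""

    def __init__(self, local_dim: int, global_dim: int,
                 proj_dim: int = 128, temperature: float = 0.1):
        super().__init__()
        self.local_proj = nn.Linear(local_dim, proj_dim)
        self.global_proj = nn.Linear(global_dim, proj_dim)
        self.t = temperature

    def forward(self, local_feats: torch.Tensor, global_vec: torch.Tensor) -> torch.Tensor:
        # local_feats: (B, N, local_dim) -- N sites for each of B crystals.
        # global_vec:  (B, global_dim)  -- one pooled vector per crystal.
        B, N, _ = local_feats.shape
        z_l = F.normalize(self.local_proj(local_feats), dim=-1)  # (B, N, D)
        z_g = F.normalize(self.global_proj(global_vec), dim=-1)  # (B, D)
        # Similarity of every site to every crystal's global vector.
        logits = torch.einsum("bnd,cd->bnc", z_l, z_g) / self.t  # (B, N, B)
        # For each site, the "class" to predict is its own crystal's index:
        # matched (site, global) pairs are positives, the rest negatives.
        target = torch.arange(B, device=logits.device).unsqueeze(1).expand(B, N)
        return F.cross_entropy(logits.reshape(B * N, B), target.reshape(-1))


# Toy usage: a batch of 4 crystals with 12 sites each.
loss_fn = InfoNCEDeepInfoMax(local_dim=64, global_dim=256)
local = torch.randn(4, 12, 64)   # per-site features from some encoder
glob = torch.randn(4, 256)       # pooled crystal representations
loss = loss_fn(local, glob)
loss.backward()
```

Minimising this cross-entropy maximises a lower bound on the mutual information between local and global representations, which is why no reconstruction of the crystal from the representation vector is required.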
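The label-masking protocol is likewise simple to state: pre-train on every structure with the labels withheld, then fine-tune on a small labelled subset drawn from the same pool, so that any gain is attributable to pre-training rather than to extra data or a shifted distribution. The following is a minimal runnable sketch under stated assumptions: the toy 64-dimensional descriptors, the MLP encoder, and the SimCLR-like contrastive loss are hypothetical stand-ins for CIF-derived inputs, Site-Net, and the Deep InfoMax objective.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Toy stand-ins for a labelled materials dataset: 5000 "structures" as
# 64-dim descriptors, each with a scalar property label (e.g. band gap).
X = torch.randn(5000, 64)
y = X.sum(dim=1, keepdim=True) + 0.1 * torch.randn(5000, 1)

encoder = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 128))

def contrastive_loss(z1, z2, t=0.1):
    # Reconstruction-free contrastive stand-in for the DIM objective:
    # two views of the same structure are positives, the rest negatives.
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.T / t
    return F.cross_entropy(logits, torch.arange(len(z1)))

# Stage 1: self-supervised pre-training on ALL structures, labels masked.
opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)
for _ in range(100):
    batch = X[torch.randperm(len(X))[:256]]
    loss = contrastive_loss(encoder(batch + 0.1 * torch.randn_like(batch)),
                            encoder(batch + 0.1 * torch.randn_like(batch)))
    opt.zero_grad(); loss.backward(); opt.step()

# Stage 2: supervised fine-tuning on a small (<10^3) labelled subset drawn
# from the SAME pool, so pre-training is isolated from distributional shift.
idx = torch.randperm(len(X))[:500]
head = nn.Linear(128, 1)
opt = torch.optim.Adam([*encoder.parameters(), *head.parameters()], lr=1e-4)
for _ in range(200):
    loss = F.mse_loss(head(encoder(X[idx])), y[idx])
    opt.zero_grad(); loss.backward(); opt.step()
print(f"fine-tune MSE on the labelled subset: {loss.item():.4f}")
```

Comparing this against the same fine-tuning run with a randomly initialised encoder is the controlled experiment the abstract describes.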
