{"title":"InvarNet:通过旋转不变图神经网络预测分子特性","authors":"Danyan Chen , Gaoxiang Duan , Dengbao Miao , Xiaoying Zheng , Yongxin Zhu","doi":"10.1016/j.mlwa.2024.100587","DOIUrl":null,"url":null,"abstract":"<div><div>Predicting molecular properties is crucial in drug synthesis and screening, but traditional molecular dynamics methods are time-consuming and costly. Recently, deep learning methods, particularly Graph Neural Networks (GNNs), have significantly improved efficiency by capturing molecular structures’ invariance under translation, rotation, and permutation. However, current GNN methods require complex data processing, increasing algorithmic complexity. This high complexity leads to several challenges, including increased computation time, higher computational resource demands, increased memory consumption. This paper introduces InvarNet, a GNN-based model trained with a composite loss function that bypasses intricate data processing while maintaining molecular property invariance. By pre-storing atomic feature attributes, InvarNet avoids repeated feature extraction during forward propagation. Experiments on three public datasets (Electronic Materials, QM9, and MD17) demonstrate that InvarNet achieves superior prediction accuracy, excellent stability, and convergence speed. It reaches state-of-the-art performance on the Electronic Materials dataset and outperforms existing models on the <span><math><msup><mrow><mi>R</mi></mrow><mrow><mn>2</mn></mrow></msup></math></span> and <span><math><mrow><mi>a</mi><mi>l</mi><mi>p</mi><mi>h</mi><mi>a</mi></mrow></math></span> properties of the QM9 dataset. On the MD17 dataset, InvarNet excels in energy prediction of benzene without atomic force. Additionally, InvarNet accelerates training time per epoch by 2.24 times compared to SphereNet on the QM9 dataset, simplifying data processing while maintaining acceptable accuracy.</div></div>","PeriodicalId":74093,"journal":{"name":"Machine learning with applications","volume":"18 ","pages":"Article 100587"},"PeriodicalIF":0.0000,"publicationDate":"2024-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"InvarNet: Molecular property prediction via rotation invariant graph neural networks\",\"authors\":\"Danyan Chen , Gaoxiang Duan , Dengbao Miao , Xiaoying Zheng , Yongxin Zhu\",\"doi\":\"10.1016/j.mlwa.2024.100587\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Predicting molecular properties is crucial in drug synthesis and screening, but traditional molecular dynamics methods are time-consuming and costly. Recently, deep learning methods, particularly Graph Neural Networks (GNNs), have significantly improved efficiency by capturing molecular structures’ invariance under translation, rotation, and permutation. However, current GNN methods require complex data processing, increasing algorithmic complexity. This high complexity leads to several challenges, including increased computation time, higher computational resource demands, increased memory consumption. This paper introduces InvarNet, a GNN-based model trained with a composite loss function that bypasses intricate data processing while maintaining molecular property invariance. By pre-storing atomic feature attributes, InvarNet avoids repeated feature extraction during forward propagation. 
Experiments on three public datasets (Electronic Materials, QM9, and MD17) demonstrate that InvarNet achieves superior prediction accuracy, excellent stability, and convergence speed. It reaches state-of-the-art performance on the Electronic Materials dataset and outperforms existing models on the <span><math><msup><mrow><mi>R</mi></mrow><mrow><mn>2</mn></mrow></msup></math></span> and <span><math><mrow><mi>a</mi><mi>l</mi><mi>p</mi><mi>h</mi><mi>a</mi></mrow></math></span> properties of the QM9 dataset. On the MD17 dataset, InvarNet excels in energy prediction of benzene without atomic force. Additionally, InvarNet accelerates training time per epoch by 2.24 times compared to SphereNet on the QM9 dataset, simplifying data processing while maintaining acceptable accuracy.</div></div>\",\"PeriodicalId\":74093,\"journal\":{\"name\":\"Machine learning with applications\",\"volume\":\"18 \",\"pages\":\"Article 100587\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Machine learning with applications\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S266682702400063X\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Machine learning with applications","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S266682702400063X","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
InvarNet: Molecular property prediction via rotation invariant graph neural networks
Predicting molecular properties is crucial in drug synthesis and screening, but traditional molecular dynamics methods are time-consuming and costly. Recently, deep learning methods, particularly Graph Neural Networks (GNNs), have significantly improved efficiency by capturing molecular structures’ invariance under translation, rotation, and permutation. However, current GNN methods require complex data processing, which increases algorithmic complexity and leads to several challenges, including longer computation time, higher computational resource demands, and greater memory consumption. This paper introduces InvarNet, a GNN-based model trained with a composite loss function that bypasses intricate data processing while maintaining molecular property invariance. By pre-storing atomic feature attributes, InvarNet avoids repeated feature extraction during forward propagation. Experiments on three public datasets (Electronic Materials, QM9, and MD17) demonstrate that InvarNet achieves superior prediction accuracy, stability, and convergence speed. It reaches state-of-the-art performance on the Electronic Materials dataset and outperforms existing models on the R² and alpha properties of the QM9 dataset. On the MD17 dataset, InvarNet excels at predicting the energy of benzene without using atomic forces. Additionally, InvarNet trains 2.24 times faster per epoch than SphereNet on the QM9 dataset, simplifying data processing while maintaining acceptable accuracy.
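
The abstract attributes InvarNet's efficiency to two ideas: operating on features that are invariant under translation, rotation, and permutation, and pre-storing atomic feature attributes so they are not re-extracted on every forward pass. The sketch below is not the paper's implementation; it is a minimal PyTorch illustration of how such invariance is commonly obtained, using pairwise distances as edge features and an embedding table as a stand-in for cached atomic attributes. All class and variable names (e.g. InvariantMessagePassing, atom_embedding_cache) are hypothetical.

```python
# Illustrative sketch only: NOT the authors' InvarNet code. It shows one common way
# to get rotation/translation-invariant molecular features in a GNN: edge features
# built from interatomic distances, plus a cached per-atom embedding table.
import torch
import torch.nn as nn


class InvariantMessagePassing(nn.Module):
    """One message-passing layer whose edge features are pairwise distances,
    so its output is invariant to rigid rotations/translations of the molecule."""

    def __init__(self, num_atom_types: int, hidden_dim: int = 64):
        super().__init__()
        # Pre-stored atomic attributes: an embedding table acts as a cache,
        # looked up by atomic number instead of re-extracting features each pass.
        self.atom_embedding_cache = nn.Embedding(num_atom_types, hidden_dim)
        self.edge_mlp = nn.Sequential(nn.Linear(1, hidden_dim), nn.SiLU(),
                                      nn.Linear(hidden_dim, hidden_dim))
        self.update_mlp = nn.Sequential(nn.Linear(2 * hidden_dim, hidden_dim), nn.SiLU())

    def forward(self, atomic_numbers, positions, edge_index):
        # atomic_numbers: (N,), positions: (N, 3), edge_index: (2, E)
        h = self.atom_embedding_cache(atomic_numbers)                         # (N, H)
        src, dst = edge_index
        # Pairwise distances are rotation- and translation-invariant edge features.
        dist = (positions[src] - positions[dst]).norm(dim=-1, keepdim=True)   # (E, 1)
        messages = self.edge_mlp(dist) * h[src]                               # (E, H)
        # Summing messages per destination atom gives permutation invariance.
        agg = torch.zeros_like(h).index_add_(0, dst, messages)
        return self.update_mlp(torch.cat([h, agg], dim=-1))                   # (N, H)


if __name__ == "__main__":
    layer = InvariantMessagePassing(num_atom_types=10)
    z = torch.tensor([6, 1, 1, 1, 1])                  # methane: C + 4 H
    pos = torch.randn(5, 3)
    edges = torch.tensor([[0, 0, 0, 0, 1, 2, 3, 4],
                          [1, 2, 3, 4, 0, 0, 0, 0]])
    out = layer(z, pos, edges)
    # Applying a random orthogonal transform to the coordinates leaves the
    # output unchanged up to floating-point error.
    q, _ = torch.linalg.qr(torch.randn(3, 3))
    out_rot = layer(z, pos @ q.T, edges)
    print(torch.allclose(out, out_rot, atol=1e-5))
```

Because every geometric quantity entering the network is a scalar distance, no rotational data augmentation or equivariant tensor machinery is needed; this is the general flavor of invariant-feature GNNs that the abstract contrasts with more complex pipelines.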