Performance analysis of activation functions in molecular property prediction using Message Passing Graph Neural Networks
Author: Garima Chanana
DOI: 10.1016/j.chemphys.2024.112591
Journal: Chemical Physics, Volume 591, Article 112591, published 2025-01-01 (Journal Article, IF 2.0, JCR Q4, Chemistry, Physical)
URL: https://www.sciencedirect.com/science/article/pii/S0301010424004208
Citations: 0
Abstract
Deep learning has significantly advanced molecular property prediction, with Message-Passing Graph Neural Networks (MPGNNs) standing out as an effective method. This study systematically evaluates ten activation functions (Sigmoid, Tanh, ReLU, Leaky ReLU, ELU, SELU, Softmax, Swish, Mish, and GELU) using an MPGNN model on the QM9 dataset, with the aim of identifying the most suitable activation function for training networks to predict specific molecular properties. The study examines electronic properties, including the HOMO and LUMO energies, the HOMO-LUMO energy gap, dipole moment, and polarizability, as well as thermal properties such as the zero-point vibrational energy (ZPVE) and specific heat capacity. The findings reveal that different activation functions excel for different properties: SELU for HOMO, ELU for LUMO, Sigmoid for the HOMO-LUMO gap, Mish for polarizability, GELU for ZPVE, and Leaky ReLU for dipole moment and specific heat capacity. These insights are valuable for optimizing MPGNN design for targeted molecular property prediction.
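For reference, the standard textbook definitions of several of the activation functions named above can be sketched as follows. This is an illustrative summary of the well-known formulas, not code from the paper, and the SELU constants shown are the commonly published values, not parameters reported by this study.

```python
import math

def sigmoid(x):
    # Logistic function: squashes input to (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    # Rectified linear unit: zero for negative inputs
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but with a small slope for negative inputs
    return x if x > 0 else alpha * x

def elu(x, alpha=1.0):
    # Exponential linear unit: smooth saturation toward -alpha
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # Scaled ELU with the standard self-normalizing constants
    return scale * (x if x > 0 else alpha * (math.exp(x) - 1.0))

def swish(x):
    # Swish (SiLU): x times its own sigmoid gate
    return x * sigmoid(x)

def mish(x):
    # Mish: x * tanh(softplus(x))
    return x * math.tanh(math.log1p(math.exp(x)))

def gelu(x):
    # GELU, tanh approximation
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi)
                                      * (x + 0.044715 * x ** 3)))
```

Smooth, non-monotonic functions such as Swish, Mish, and GELU pass small negative values rather than clipping them, which is one common explanation for their strong performance on regression targets like those in QM9.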
Journal description:
Chemical Physics publishes experimental and theoretical papers on all aspects of chemical physics. In this journal, experiments are related to theory, and in turn theoretical papers are related to present or future experiments. Subjects covered include: spectroscopy and molecular structure, interacting systems, relaxation phenomena, biological systems, materials, fundamental problems in molecular reactivity, molecular quantum theory and statistical mechanics. Computational chemistry studies of routine character are not appropriate for this journal.