A general strategy for improving the performance of PINNs -- Analytical gradients and advanced optimizers in the NeuralSchrödinger framework

Jakob Gamper, Hans Georg Gallmetzer, Alexander K.H. Weiss, Thomas S. Hofer

Artificial Intelligence Chemistry, Volume 2, Issue 1, Article 100047 (published 2024-01-10). DOI: 10.1016/j.aichem.2024.100047
Citations: 0
Abstract
In this work, the previously introduced NeuralSchrödinger PINN is extended to the use of analytical gradient expressions of the loss function. It is shown that the analytical gradients derived in this work improve the convergence behaviour of both the BFGS and ADAM optimizers compared to the previously employed numerical gradient implementation. In addition, the use of parallelised GPU computations via CUDA greatly increases the computational performance over the previous single-core CPU implementation. As a consequence, extending the NeuralSchrödinger PINN to two-dimensional quantum systems became feasible, as also demonstrated in this work.
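The abstract's central claim, that analytical gradients of a Schrödinger-type loss converge better than numerically estimated ones, can be illustrated on a toy problem. The sketch below is not the authors' NeuralSchrödinger implementation; it minimises the variational energy of a hypothetical one-parameter trial wavefunction psi(x; a) = exp(-a x^2) for the 1D harmonic oscillator, comparing an exact (analytical) gradient with a finite-difference stand-in for a numerical gradient:

```python
import numpy as np

# Hedged sketch (not the paper's code): for the 1D harmonic oscillator
# H = -1/2 d^2/dx^2 + 1/2 x^2 and the trial state psi(x; a) = exp(-a x^2),
# the variational energy is E(a) = a/2 + 1/(8a), minimised at a = 1/2, E = 1/2.

def energy(a):
    return a / 2.0 + 1.0 / (8.0 * a)

def grad_analytical(a):
    # exact derivative dE/da, analogous to the analytical loss gradients
    return 0.5 - 1.0 / (8.0 * a**2)

def grad_numerical(a, h=1e-5):
    # central finite difference, a stand-in for a numerical gradient scheme
    return (energy(a + h) - energy(a - h)) / (2.0 * h)

def minimise(grad_fn, a0=2.0, lr=0.5, steps=200):
    # plain gradient descent; the paper instead uses BFGS and ADAM
    a = a0
    for _ in range(steps):
        a -= lr * grad_fn(a)
    return a

a_ana = minimise(grad_analytical)
a_num = minimise(grad_numerical)
# both approach the exact minimum a = 1/2 with ground-state energy 1/2
```

On this smooth one-parameter problem both variants reach the minimum; the analytical gradient avoids the step-size choice and truncation error of the finite-difference estimate, which is the kind of advantage the abstract reports for the full PINN loss.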