{"title":"一个受神经网络启发的化学动力学公式","authors":"S. Barwey, V. Raman","doi":"10.3390/en14092710","DOIUrl":null,"url":null,"abstract":"A method which casts the chemical source term computation into an artificial neural network (ANN)-inspired form is presented. This approach is well-suited for use on emerging supercomputing platforms that rely on graphical processing units (GPUs). The resulting equations allow for a GPU-friendly matrix-multiplication based source term estimation where the leading dimension (batch size) can be interpreted as the number of chemically reacting cells in the domain; as such, the approach can be readily adapted in high-fidelity solvers for which an MPI rank offloads the source term computation task for a given number of cells to the GPU. Though the exact ANN-inspired recasting shown here is optimal for GPU environments as-is, this interpretation allows the user to replace portions of the exact routine with trained, so-called approximate ANNs, where the goal of these approximate ANNs is to increase computational efficiency over the exact routine counterparts. Note that the main objective of this paper is not to use machine learning for developing models, but rather to represent chemical kinetics using the ANN framework. The end result is that little-to-no training is needed, and the GPU-friendly structure of the ANN formulation during the source term computation is preserved. The method is demonstrated using chemical mechanisms of varying complexity on both 0-D auto-ignition and 1-D channel detonation problems, and the details of performance on GPUs are explored.","PeriodicalId":8439,"journal":{"name":"arXiv: Chemical Physics","volume":"139 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2020-07-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"A Neural Network Inspired Formulation of Chemical Kinetics\",\"authors\":\"S. Barwey, V. 
Raman\",\"doi\":\"10.3390/en14092710\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"A method which casts the chemical source term computation into an artificial neural network (ANN)-inspired form is presented. This approach is well-suited for use on emerging supercomputing platforms that rely on graphical processing units (GPUs). The resulting equations allow for a GPU-friendly matrix-multiplication based source term estimation where the leading dimension (batch size) can be interpreted as the number of chemically reacting cells in the domain; as such, the approach can be readily adapted in high-fidelity solvers for which an MPI rank offloads the source term computation task for a given number of cells to the GPU. Though the exact ANN-inspired recasting shown here is optimal for GPU environments as-is, this interpretation allows the user to replace portions of the exact routine with trained, so-called approximate ANNs, where the goal of these approximate ANNs is to increase computational efficiency over the exact routine counterparts. Note that the main objective of this paper is not to use machine learning for developing models, but rather to represent chemical kinetics using the ANN framework. The end result is that little-to-no training is needed, and the GPU-friendly structure of the ANN formulation during the source term computation is preserved. 
The method is demonstrated using chemical mechanisms of varying complexity on both 0-D auto-ignition and 1-D channel detonation problems, and the details of performance on GPUs are explored.\",\"PeriodicalId\":8439,\"journal\":{\"name\":\"arXiv: Chemical Physics\",\"volume\":\"139 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-07-31\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv: Chemical Physics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.3390/en14092710\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv: Chemical Physics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3390/en14092710","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A Neural Network Inspired Formulation of Chemical Kinetics
A method that casts the chemical source term computation into an artificial neural network (ANN)-inspired form is presented. This approach is well suited to emerging supercomputing platforms that rely on graphics processing units (GPUs). The resulting equations allow for a GPU-friendly, matrix-multiplication-based source term estimation in which the leading dimension (the batch size) can be interpreted as the number of chemically reacting cells in the domain; as such, the approach can be readily adopted in high-fidelity solvers in which an MPI rank offloads the source term computation for a given number of cells to the GPU. Though the exact ANN-inspired recasting shown here is well suited to GPU environments as-is, this interpretation allows the user to replace portions of the exact routine with trained, so-called approximate ANNs, whose goal is to improve computational efficiency over their exact counterparts. Note that the main objective of this paper is not to use machine learning to develop models, but rather to represent chemical kinetics within the ANN framework. The end result is that little to no training is needed, and the GPU-friendly structure of the ANN formulation during the source term computation is preserved. The method is demonstrated using chemical mechanisms of varying complexity on both 0-D auto-ignition and 1-D channel detonation problems, and the details of performance on GPUs are explored.
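The general idea of recasting mass-action kinetics as a batched matrix computation can be sketched as follows. This is a minimal illustration of the structural point (log-concentrations pass through a linear map defined by the reactant stoichiometry, an exponential "activation", and a second linear map defined by the net stoichiometry), not the paper's exact routine; the two-reaction mechanism, rate parameters, and function names below are entirely made up for the example.

```python
import numpy as np

# Hypothetical toy mechanism (NOT from the paper): two irreversible reactions
#   R1: A + B -> C        R2: 2C -> D
# Species order in all arrays: [A, B, C, D]
nu_reac = np.array([[1.0, 1.0, 0.0, 0.0],    # reactant stoichiometry nu'
                    [0.0, 0.0, 2.0, 0.0]])
nu_net  = np.array([[-1.0, -1.0, 1.0, 0.0],  # net stoichiometry nu'' - nu'
                    [ 0.0,  0.0, -2.0, 1.0]])
A_pre = np.array([1.0e3, 5.0e2])             # pre-exponential factors (made up)
Ea_R  = np.array([800.0, 1200.0])            # activation energy over R [K] (made up)

def source_terms(C, T):
    """ANN-like batched source-term evaluation.

    C : (n_cells, n_species) molar concentrations
    T : (n_cells,) temperatures
    The batch dimension plays the role of the ANN batch size, i.e. the
    number of chemically reacting cells handled in one GPU-friendly call.
    """
    # "Hidden layer": a linear map in log space followed by an exp
    # activation yields the concentration product prod_i C_i^{nu'_ij}.
    log_prod = np.log(np.maximum(C, 1e-300)) @ nu_reac.T   # (n_cells, n_reactions)
    k = A_pre * np.exp(-Ea_R / T[:, None])                 # Arrhenius rate constants
    rates = k * np.exp(log_prod)                           # reaction progress rates
    # "Output layer": one matrix multiply maps reaction rates to
    # per-species production rates for the whole batch at once.
    return rates @ nu_net                                   # (n_cells, n_species)

# Batched evaluation over three hypothetical cells
C = np.array([[1.0, 2.0, 0.5, 0.0],
              [0.5, 0.5, 1.0, 0.1],
              [2.0, 1.0, 0.0, 0.0]])
T = np.array([1000.0, 1500.0, 1200.0])
omega = source_terms(C, T)
print(omega.shape)  # one source-term vector per cell
```

Because every step is a dense matrix product over the cell batch, the whole evaluation maps naturally onto GPU GEMM kernels, which is the structural property the paper exploits.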