{"title":"Multi-Grid Graph Neural Networks with Self-Attention for Computational Mechanics","authors":"Paul Garnier, Jonathan Viquerat, Elie Hachem","doi":"arxiv-2409.11899","DOIUrl":null,"url":null,"abstract":"Advancement in finite element methods have become essential in various\ndisciplines, and in particular for Computational Fluid Dynamics (CFD), driving\nresearch efforts for improved precision and efficiency. While Convolutional\nNeural Networks (CNNs) have found success in CFD by mapping meshes into images,\nrecent attention has turned to leveraging Graph Neural Networks (GNNs) for\ndirect mesh processing. This paper introduces a novel model merging\nSelf-Attention with Message Passing in GNNs, achieving a 15\\% reduction in RMSE\non the well known flow past a cylinder benchmark. Furthermore, a dynamic mesh\npruning technique based on Self-Attention is proposed, that leads to a robust\nGNN-based multigrid approach, also reducing RMSE by 15\\%. Additionally, a new\nself-supervised training method based on BERT is presented, resulting in a 25\\%\nRMSE reduction. The paper includes an ablation study and outperforms\nstate-of-the-art models on several challenging datasets, promising advancements\nsimilar to those recently achieved in natural language and image processing.\nFinally, the paper introduces a dataset with meshes larger than existing ones\nby at least an order of magnitude. Code and Datasets will be released at\nhttps://github.com/DonsetPG/multigrid-gnn.","PeriodicalId":501301,"journal":{"name":"arXiv - CS - Machine Learning","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-09-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Machine Learning","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.11899","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Advancements in finite element methods have become essential in various
disciplines, and in particular for Computational Fluid Dynamics (CFD), driving
research efforts for improved precision and efficiency. While Convolutional
Neural Networks (CNNs) have found success in CFD by mapping meshes into images,
recent attention has turned to leveraging Graph Neural Networks (GNNs) for
direct mesh processing. This paper introduces a novel model merging
Self-Attention with Message Passing in GNNs, achieving a 15% reduction in RMSE
on the well-known flow-past-a-cylinder benchmark. Furthermore, a dynamic mesh
pruning technique based on Self-Attention is proposed that leads to a robust
GNN-based multigrid approach, also reducing RMSE by 15%. Additionally, a new
self-supervised training method based on BERT is presented, resulting in a 25%
RMSE reduction. The paper includes an ablation study, and the proposed model
outperforms state-of-the-art models on several challenging datasets, promising advancements
similar to those recently achieved in natural language and image processing.
Finally, the paper introduces a dataset with meshes larger than existing ones
by at least an order of magnitude. Code and datasets will be released at
https://github.com/DonsetPG/multigrid-gnn.
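
To illustrate the kind of hybrid layer the abstract describes, below is a minimal sketch of a message-passing block combined with multi-head self-attention over mesh nodes, written in PyTorch. The layer sizes, module names, and the residual/normalization choices are assumptions for illustration, not the paper's actual architecture.

```python
# Minimal sketch: one message-passing round followed by multi-head
# self-attention over node features. Purely illustrative -- layer sizes,
# names, and the way attention is merged with message passing are
# assumptions, not the architecture from the paper.
import torch
import torch.nn as nn


class AttentionMessagePassingBlock(nn.Module):
    def __init__(self, hidden_dim: int = 128, num_heads: int = 4):
        super().__init__()
        # Edge and node MLPs implement one round of message passing.
        self.edge_mlp = nn.Sequential(
            nn.Linear(3 * hidden_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
        )
        self.node_mlp = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
        )
        # Self-attention lets distant mesh nodes exchange information directly.
        self.attention = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(hidden_dim)

    def forward(self, nodes, edges, senders, receivers):
        # nodes: (N, hidden_dim), edges: (E, hidden_dim)
        # senders / receivers: (E,) integer indices into nodes.
        messages = self.edge_mlp(
            torch.cat([nodes[senders], nodes[receivers], edges], dim=-1)
        )
        # Sum incoming messages at each receiver node.
        aggregated = torch.zeros_like(nodes).index_add_(0, receivers, messages)
        nodes = nodes + self.node_mlp(torch.cat([nodes, aggregated], dim=-1))

        # Global self-attention over all mesh nodes, treated as one sequence.
        attended, _ = self.attention(
            nodes.unsqueeze(0), nodes.unsqueeze(0), nodes.unsqueeze(0)
        )
        nodes = self.norm(nodes + attended.squeeze(0))
        return nodes, messages
```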
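Similarly, the BERT-style self-supervised pretraining mentioned in the abstract can be sketched as masking a random subset of node features and training the network to reconstruct them. The masking ratio, loss, and model interface below are assumptions for illustration, not the paper's exact training recipe.

```python
# Minimal sketch of BERT-style self-supervised pretraining on mesh nodes:
# mask a random fraction of node features and reconstruct them.
import torch


def masked_node_pretraining_step(model, nodes, edges, senders, receivers,
                                 mask_ratio: float = 0.15):
    # Choose a random subset of nodes to mask, BERT-style.
    num_nodes = nodes.shape[0]
    mask = torch.rand(num_nodes, device=nodes.device) < mask_ratio
    corrupted = nodes.clone()
    corrupted[mask] = 0.0  # replace masked node features with a null token

    # The model sees the corrupted mesh and must recover the original features.
    predicted, _ = model(corrupted, edges, senders, receivers)
    loss = torch.nn.functional.mse_loss(predicted[mask], nodes[mask])
    return loss
```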