{"title":"Rapid seismic response prediction of steel frames based on Graph Attention Network (GAT) with Mamba blocks","authors":"Hong Peng , Weifa Zheng , Ying Yu , Yaozhi Luo","doi":"10.1016/j.istruc.2025.109390","DOIUrl":null,"url":null,"abstract":"<div><div>Dynamic response prediction is a critical aspect of seismic analysis for structural systems, traditionally addressed through numerical methods. With recent advancements in deep learning, these techniques have emerged as a promising alternative for predicting structural responses. However, designing a neural network that effectively captures the complex variations in structural geometries, parameters, and nonlinear behaviors remains a significant challenge. This study introduces the GAT-Mamba neural network, an innovative framework that integrates Graph Attention Networks with Mamba blocks to enhance nonlinear dynamic analysis. The Graph Attention Networks derive graph embeddings informed by structural properties, while Mamba blocks use deep state-space models to predict response histories. The GAT-Mamba model accurately forecasts the nonlinear responses of randomly generated steel frames under significant ground motions such as acceleration, velocity, displacement, shear force, and bending moments. It addresses a key limitation of many existing deep learning approaches by predicting dynamic responses across diverse structural configurations without retraining, thereby expanding the potential of structural surrogate models for design and analysis. Experimental results demonstrate that GAT-Mamba achieves a substantial reduction in both parameter count and FLOPs compared to all baseline approaches. During deep learning training, which is typically the most computationally demanding stage, the GAT-Mamba method achieves exceptional efficiency, operating about 27 times faster than GAT-LSTM. The model’s predicted seismic time-history responses closely match actual numerical results, even in challenging scenarios. This work advances computational efficiency in structural analysis, paving the way for future hybrid models that integrate physical insights with data-driven approaches.</div></div>","PeriodicalId":48642,"journal":{"name":"Structures","volume":"79 ","pages":"Article 109390"},"PeriodicalIF":4.3000,"publicationDate":"2025-06-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Structures","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2352012425012056","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, CIVIL","Score":null,"Total":0}
Citations: 0
Abstract
Dynamic response prediction is a critical aspect of seismic analysis for structural systems and has traditionally been addressed through numerical methods. With recent advances in deep learning, data-driven techniques have emerged as a promising alternative for predicting structural responses. However, designing a neural network that effectively captures the complex variations in structural geometry, parameters, and nonlinear behavior remains a significant challenge. This study introduces GAT-Mamba, a neural network framework that integrates Graph Attention Networks (GATs) with Mamba blocks to enhance nonlinear dynamic analysis. The GAT component derives graph embeddings informed by structural properties, while the Mamba blocks use deep state-space models to predict response histories. The GAT-Mamba model accurately forecasts the nonlinear responses of randomly generated steel frames under strong ground motions, including acceleration, velocity, displacement, shear force, and bending moment histories. It addresses a key limitation of many existing deep learning approaches by predicting dynamic responses across diverse structural configurations without retraining, thereby expanding the potential of structural surrogate models for design and analysis. Experimental results show that GAT-Mamba achieves a substantial reduction in both parameter count and FLOPs compared with all baseline approaches. During training, typically the most computationally demanding stage, GAT-Mamba runs about 27 times faster than GAT-LSTM. The predicted seismic time-history responses closely match the reference numerical results, even in challenging scenarios. This work advances computational efficiency in structural analysis and paves the way for future hybrid models that integrate physical insight with data-driven approaches.
Journal Introduction:
Structures aims to publish internationally leading research across the full breadth of structural engineering. Papers are particularly welcome where high-quality research will benefit from a wide readership of academics and practitioners, achieving not only high citation rates but also tangible, industry-related pathways to impact.