Rapid seismic response prediction of steel frames based on Graph Attention Network (GAT) with Mamba blocks

Impact Factor: 4.3 · CAS Tier 2 (Engineering & Technology) · JCR Q1 (Engineering, Civil)
Hong Peng, Weifa Zheng, Ying Yu, Yaozhi Luo
{"title":"Rapid seismic response prediction of steel frames based on Graph Attention Network (GAT) with Mamba blocks","authors":"Hong Peng ,&nbsp;Weifa Zheng ,&nbsp;Ying Yu ,&nbsp;Yaozhi Luo","doi":"10.1016/j.istruc.2025.109390","DOIUrl":null,"url":null,"abstract":"<div><div>Dynamic response prediction is a critical aspect of seismic analysis for structural systems, traditionally addressed through numerical methods. With recent advancements in deep learning, these techniques have emerged as a promising alternative for predicting structural responses. However, designing a neural network that effectively captures the complex variations in structural geometries, parameters, and nonlinear behaviors remains a significant challenge. This study introduces the GAT-Mamba neural network, an innovative framework that integrates Graph Attention Networks with Mamba blocks to enhance nonlinear dynamic analysis. The Graph Attention Networks derive graph embeddings informed by structural properties, while Mamba blocks use deep state-space models to predict response histories. The GAT-Mamba model accurately forecasts the nonlinear responses of randomly generated steel frames under significant ground motions such as acceleration, velocity, displacement, shear force, and bending moments. It addresses a key limitation of many existing deep learning approaches by predicting dynamic responses across diverse structural configurations without retraining, thereby expanding the potential of structural surrogate models for design and analysis. Experimental results demonstrate that GAT-Mamba achieves a substantial reduction in both parameter count and FLOPs compared to all baseline approaches. During deep learning training, which is typically the most computationally demanding stage, the GAT-Mamba method achieves exceptional efficiency, operating about 27 times faster than GAT-LSTM. The model’s predicted seismic time-history responses closely match actual numerical results, even in challenging scenarios. This work advances computational efficiency in structural analysis, paving the way for future hybrid models that integrate physical insights with data-driven approaches.</div></div>","PeriodicalId":48642,"journal":{"name":"Structures","volume":"79 ","pages":"Article 109390"},"PeriodicalIF":4.3000,"publicationDate":"2025-06-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Structures","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2352012425012056","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, CIVIL","Score":null,"Total":0}
Citations: 0

Abstract

Dynamic response prediction is a critical aspect of seismic analysis for structural systems, traditionally addressed through numerical methods. With recent advancements in deep learning, these techniques have emerged as a promising alternative for predicting structural responses. However, designing a neural network that effectively captures the complex variations in structural geometries, parameters, and nonlinear behaviors remains a significant challenge. This study introduces the GAT-Mamba neural network, an innovative framework that integrates Graph Attention Networks with Mamba blocks to enhance nonlinear dynamic analysis. The Graph Attention Networks derive graph embeddings informed by structural properties, while the Mamba blocks use deep state-space models to predict response histories. The GAT-Mamba model accurately forecasts the nonlinear responses (accelerations, velocities, displacements, shear forces, and bending moments) of randomly generated steel frames under strong ground motions. It addresses a key limitation of many existing deep learning approaches by predicting dynamic responses across diverse structural configurations without retraining, thereby expanding the potential of structural surrogate models for design and analysis. Experimental results demonstrate that GAT-Mamba achieves a substantial reduction in both parameter count and FLOPs compared to all baseline approaches. During deep learning training, typically the most computationally demanding stage, GAT-Mamba achieves exceptional efficiency, running about 27 times faster than GAT-LSTM. The model's predicted seismic time-history responses closely match the reference numerical results, even in challenging scenarios. This work advances computational efficiency in structural analysis, paving the way for future hybrid models that integrate physical insights with data-driven approaches.
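The abstract describes the architecture only at a high level: a graph-attention encoder turns member and joint properties into node embeddings, and Mamba (deep state-space) blocks map the ground-motion sequence to response histories. For orientation, the sketch below shows one way such a pipeline could be wired in PyTorch; it is not the authors' implementation, and the layer widths, the `torch_geometric`/`mamba_ssm` dependencies, the five output channels, and the way the ground-motion record is concatenated with node embeddings are all assumptions made for illustration.

```python
# Minimal, hypothetical GAT + Mamba surrogate sketch (not the paper's code).
# Assumes torch, torch_geometric, and mamba_ssm are installed; the frame is
# given as node features plus an edge_index, and one ground-motion record
# drives all nodes.
import torch
import torch.nn as nn
from torch_geometric.nn import GATConv  # graph attention layers
from mamba_ssm import Mamba             # selective state-space block


class GATMambaSurrogate(nn.Module):
    def __init__(self, node_dim, gm_dim=1, hidden=64, out_channels=5):
        super().__init__()
        # Graph attention encoder: structural properties -> node embeddings.
        self.gat1 = GATConv(node_dim, hidden, heads=4, concat=False)
        self.gat2 = GATConv(hidden, hidden, heads=4, concat=False)
        # Fuse [node embedding, ground-motion sample] at every time step.
        self.in_proj = nn.Linear(hidden + gm_dim, hidden)
        # Mamba scans the time axis with cost linear in sequence length.
        self.mamba = Mamba(d_model=hidden, d_state=16, d_conv=4, expand=2)
        # Per-step head: acceleration, velocity, displacement, shear, moment.
        self.head = nn.Linear(hidden, out_channels)

    def forward(self, x, edge_index, ground_motion):
        # x: (num_nodes, node_dim)        structural properties per node
        # edge_index: (2, num_edges)      frame connectivity
        # ground_motion: (T, gm_dim)      input acceleration record
        h = torch.relu(self.gat1(x, edge_index))
        h = torch.relu(self.gat2(h, edge_index))            # (N, hidden)
        T = ground_motion.shape[0]
        h_seq = h.unsqueeze(1).expand(-1, T, -1)            # (N, T, hidden)
        gm_seq = ground_motion.unsqueeze(0).expand(h.size(0), -1, -1)
        z = self.in_proj(torch.cat([h_seq, gm_seq], dim=-1))
        z = self.mamba(z)                                   # (N, T, hidden)
        return self.head(z)                                 # (N, T, 5)
```

In practice the `mamba_ssm` kernels target CUDA devices, and the paper's actual feature construction, normalization, and loss are not given in the abstract, so this should be read only as a structural orientation for the GAT-plus-state-space idea, not as a reproduction of the reported model.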
Source Journal
Structures (Engineering, Architecture)
CiteScore: 5.70
Self-citation rate: 17.10%
Articles per year: 1,187
Aims and scope: Structures aims to publish internationally-leading research across the full breadth of structural engineering. Papers for Structures are particularly welcome in which high-quality research will benefit from wide readership of academics and practitioners such that not only high citation rates but also tangible industrial-related pathways to impact are achieved.