Predicting occupant response curves in vehicle crashes via an Attention-enhanced Multimodal Temporal Network

Impact Factor 5.7 · CAS Tier 1 (Engineering & Technology) · JCR Q1 (Ergonomics)
Wenjie Wang, Xiaoyi Tai, Chang Zhou, Zhao Liu, Ping Zhu
{"title":"利用注意力增强多模态时间网络预测车辆碰撞中乘员反应曲线","authors":"Wenjie Wang ,&nbsp;Xiaoyi Tai ,&nbsp;Chang Zhou ,&nbsp;Zhao Liu ,&nbsp;Ping Zhu","doi":"10.1016/j.aap.2025.108140","DOIUrl":null,"url":null,"abstract":"<div><div>Accurately predicting safety responses, especially occupant crash response curves across multiple body regions, plays a crucial role in advancing vehicle crash safety by enabling design optimization and reducing the reliance on costly physical testing and simulations. Machine learning methods have demonstrated good performance in this field, but existing approaches often face challenges in integrating multimodal data and handling multi-task temporal predictions. To address these issues, this work proposes a novel Attention-enhanced Multimodal Temporal Network (AMTN) for predicting occupant crash response curves across multiple body regions during crashes. AMTN integrates numerical parameters and vehicle body crash pulses through a feature extraction module, organically fuses multimodal features via cross-attention mechanisms, and decodes shared features using a modified Temporal Convolutional Network (TCN) with local sliding self-attention. A dynamic adaptive loss and multiple output layers are utilized to evaluate the importance of each task, iteratively update the learning priorities, and finally achieve multi-task prediction. Experiments on the engineering-obtained data demonstrate that the crash response curves predicted by AMTN achieves an average ISO (International Organization of Standards) rating of 0.835 across 11 crash response curves, with critical regions exceeding 0.9. In engineering applications, an ISO rating greater than 0.8 indicates that the predicted curve closely matches the reference curve. Therefore, the experimental results demonstrate that the proposed method has effectively learned the characteristics of the training data and is capable of producing accurate predictions. In summary, this work advances multimodal deep learning for crash safety by enabling efficient, accurate, and interpretable multi-task curve predictions. Consequently, it can be applied to data-driven vehicle safety development, offering significant engineering value by enhancing both development efficiency and quality.</div></div>","PeriodicalId":6926,"journal":{"name":"Accident; analysis and prevention","volume":"220 ","pages":"Article 108140"},"PeriodicalIF":5.7000,"publicationDate":"2025-06-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Predicting occupant response curves in vehicle crashes via Attention-enhanced multimodal temporal Network\",\"authors\":\"Wenjie Wang ,&nbsp;Xiaoyi Tai ,&nbsp;Chang Zhou ,&nbsp;Zhao Liu ,&nbsp;Ping Zhu\",\"doi\":\"10.1016/j.aap.2025.108140\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Accurately predicting safety responses, especially occupant crash response curves across multiple body regions, plays a crucial role in advancing vehicle crash safety by enabling design optimization and reducing the reliance on costly physical testing and simulations. Machine learning methods have demonstrated good performance in this field, but existing approaches often face challenges in integrating multimodal data and handling multi-task temporal predictions. To address these issues, this work proposes a novel Attention-enhanced Multimodal Temporal Network (AMTN) for predicting occupant crash response curves across multiple body regions during crashes. 
AMTN integrates numerical parameters and vehicle body crash pulses through a feature extraction module, organically fuses multimodal features via cross-attention mechanisms, and decodes shared features using a modified Temporal Convolutional Network (TCN) with local sliding self-attention. A dynamic adaptive loss and multiple output layers are utilized to evaluate the importance of each task, iteratively update the learning priorities, and finally achieve multi-task prediction. Experiments on the engineering-obtained data demonstrate that the crash response curves predicted by AMTN achieves an average ISO (International Organization of Standards) rating of 0.835 across 11 crash response curves, with critical regions exceeding 0.9. In engineering applications, an ISO rating greater than 0.8 indicates that the predicted curve closely matches the reference curve. Therefore, the experimental results demonstrate that the proposed method has effectively learned the characteristics of the training data and is capable of producing accurate predictions. In summary, this work advances multimodal deep learning for crash safety by enabling efficient, accurate, and interpretable multi-task curve predictions. Consequently, it can be applied to data-driven vehicle safety development, offering significant engineering value by enhancing both development efficiency and quality.</div></div>\",\"PeriodicalId\":6926,\"journal\":{\"name\":\"Accident; analysis and prevention\",\"volume\":\"220 \",\"pages\":\"Article 108140\"},\"PeriodicalIF\":5.7000,\"publicationDate\":\"2025-06-17\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Accident; analysis and prevention\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S000145752500226X\",\"RegionNum\":1,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ERGONOMICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Accident; analysis and prevention","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S000145752500226X","RegionNum":1,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ERGONOMICS","Score":null,"Total":0}
Citations: 0

Abstract

Accurately predicting safety responses, especially occupant crash response curves across multiple body regions, plays a crucial role in advancing vehicle crash safety by enabling design optimization and reducing the reliance on costly physical testing and simulation. Machine learning methods have demonstrated good performance in this field, but existing approaches often struggle to integrate multimodal data and to handle multi-task temporal predictions. To address these issues, this work proposes a novel Attention-enhanced Multimodal Temporal Network (AMTN) for predicting occupant crash response curves across multiple body regions during crashes. AMTN integrates numerical parameters and vehicle body crash pulses through a feature extraction module, fuses the multimodal features via cross-attention mechanisms, and decodes the shared features using a modified Temporal Convolutional Network (TCN) with local sliding self-attention. A dynamic adaptive loss and multiple output layers are used to evaluate the importance of each task, iteratively update the learning priorities, and ultimately achieve multi-task prediction. Experiments on engineering-derived data show that the curves predicted by AMTN achieve an average ISO (International Organization for Standardization) rating of 0.835 across 11 crash response curves, with critical body regions exceeding 0.9. In engineering applications, an ISO rating greater than 0.8 indicates that a predicted curve closely matches the reference curve, so these results demonstrate that the proposed method has effectively learned the characteristics of the training data and can produce accurate predictions. In summary, this work advances multimodal deep learning for crash safety by enabling efficient, accurate, and interpretable multi-task curve predictions. Consequently, it can be applied to data-driven vehicle safety development, offering significant engineering value by improving both development efficiency and quality.
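The abstract describes the architecture only at a high level. The following is a minimal PyTorch sketch of the kind of pipeline it outlines: scalar design parameters and the vehicle crash pulse are encoded separately, fused with cross-attention, and decoded into per-body-region response curves by a dilated temporal-convolution stack with one output head per curve. All module names, layer sizes, and hyper-parameters here are illustrative assumptions, not the authors' implementation, and the local sliding self-attention inside the TCN decoder is omitted for brevity.

```python
# Hedged sketch of an AMTN-like pipeline: every design choice below is an
# assumption for illustration, not the paper's actual code.
import torch
import torch.nn as nn


class CrossAttentionFusion(nn.Module):
    """Fuse pulse features (queries) with scalar-parameter features (keys/values)."""

    def __init__(self, dim: int, n_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, pulse_feat: torch.Tensor, param_feat: torch.Tensor) -> torch.Tensor:
        # pulse_feat: (B, T, dim); param_feat: (B, 1, dim)
        fused, _ = self.attn(query=pulse_feat, key=param_feat, value=param_feat)
        return self.norm(pulse_feat + fused)  # residual fusion


class TemporalBlock(nn.Module):
    """One dilated, length-preserving convolution block of a TCN-style decoder."""

    def __init__(self, dim: int, dilation: int):
        super().__init__()
        # kernel_size=3 with padding=dilation keeps the sequence length unchanged
        self.conv = nn.Conv1d(dim, dim, kernel_size=3, padding=dilation, dilation=dilation)
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (B, dim, T)
        return x + self.act(self.conv(x))


class AMTNSketch(nn.Module):
    def __init__(self, n_params: int, dim: int = 64, n_curves: int = 11):
        super().__init__()
        self.param_enc = nn.Sequential(nn.Linear(n_params, dim), nn.ReLU())
        self.pulse_enc = nn.Conv1d(1, dim, kernel_size=5, padding=2)
        self.fusion = CrossAttentionFusion(dim)
        self.decoder = nn.Sequential(*[TemporalBlock(dim, d) for d in (1, 2, 4)])
        # one output head per predicted response curve (multi-task outputs)
        self.heads = nn.ModuleList([nn.Conv1d(dim, 1, kernel_size=1) for _ in range(n_curves)])

    def forward(self, params: torch.Tensor, pulse: torch.Tensor):
        # params: (B, n_params) scalar design parameters; pulse: (B, T) crash pulse
        p = self.param_enc(params).unsqueeze(1)                 # (B, 1, dim)
        s = self.pulse_enc(pulse.unsqueeze(1)).transpose(1, 2)  # (B, T, dim)
        h = self.fusion(s, p).transpose(1, 2)                   # (B, dim, T)
        h = self.decoder(h)
        return [head(h).squeeze(1) for head in self.heads]      # n_curves tensors of (B, T)
```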
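The exact form of the dynamic adaptive loss that balances the 11 curve-prediction tasks is not given in the abstract. One common way to realize such adaptive weighting is learnable homoscedastic-uncertainty weighting (Kendall et al., 2018); the sketch below uses that as a stand-in and should not be read as the paper's actual formulation.

```python
# Stand-in for the paper's dynamic adaptive loss: uncertainty-based task
# weighting with one learnable log-variance per curve (an assumption, not
# necessarily the formulation used by the authors).
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdaptiveMultiTaskLoss(nn.Module):
    def __init__(self, n_tasks: int = 11):
        super().__init__()
        # tasks with higher estimated uncertainty receive a smaller effective weight
        self.log_var = nn.Parameter(torch.zeros(n_tasks))

    def forward(self, preds, targets):
        # preds, targets: lists of (B, T) tensors, one per body-region curve
        total = 0.0
        for i, (p, t) in enumerate(zip(preds, targets)):
            mse = F.mse_loss(p, t)
            total = total + torch.exp(-self.log_var[i]) * mse + self.log_var[i]
        return total
```

Because the log-variances are optimized jointly with the network weights, the relative importance of each task changes iteratively during training, which is consistent with the behaviour the abstract attributes to the dynamic adaptive loss.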
Source journal: Accident Analysis & Prevention
CiteScore: 11.90
Self-citation rate: 16.90%
Publication volume: 264
Review time: 48 days
Journal description: Accident Analysis & Prevention provides wide coverage of the general areas relating to accidental injury and damage, including the pre-injury and immediate post-injury phases. Published papers deal with medical, legal, economic, educational, behavioral, theoretical or empirical aspects of transportation accidents, as well as with accidents at other sites. Selected topics within the scope of the Journal may include: studies of human, environmental and vehicular factors influencing the occurrence, type and severity of accidents and injury; the design, implementation and evaluation of countermeasures; biomechanics of impact and human tolerance limits to injury; modelling and statistical analysis of accident data; policy, planning and decision-making in safety.