Resource-efficient artificial intelligence for battery capacity estimation using convolutional FlashAttention fusion networks

Impact Factor: 15.0 · CAS Tier 1 (Engineering & Technology) · JCR Q1 (Energy & Fuels)
Zhilong Lv, Jingyuan Zhao
{"title":"利用卷积闪存融合网络进行电池容量估算的资源节约型人工智能","authors":"Zhilong Lv ,&nbsp;Jingyuan Zhao","doi":"10.1016/j.etran.2024.100383","DOIUrl":null,"url":null,"abstract":"<div><div>Accurate battery capacity estimation is crucial for optimizing lifespan and monitoring health conditions. Deep learning has made notable strides in addressing long-standing issues in the artificial intelligence community. However, large AI models often face challenges such as high computational resource consumption, extended training times, and elevated deployment costs. To address these issues, we developed an efficient end-to-end hybrid fusion neural network model. This model combines FlashAttention-2 with local feature extraction through convolutional neural networks (CNNs), significantly reducing memory usage and computational demands while maintaining precise and efficient health estimation. For practical implementation, the model uses only basic parameters, such as voltage and charge, and employs partial charging data (from 80 % SOC to the upper limit voltage) as features, without requiring complex feature engineering. We evaluated the model using three datasets: 77 lithium iron phosphate (LFP) cells, 16 nickel cobalt aluminum (NCA) cells, and 50 nickel cobalt manganese (NCM) oxide cells. For LFP battery health estimation, the model achieved a root mean square error of 0.109 %, a coefficient of determination of 0.99, and a mean absolute percentage error of 0.096 %. Moreover, the proposed convolutional and flash-attention fusion networks deliver an average inference time of 57 milliseconds for health diagnosis across the full battery life cycle (approximately 1898 cycles per cell). The resource-efficient AI (REAI) model operates at an average of 1.36 billion floating point operations per second (FLOPs), with GPU power consumption of 17W and memory usage of 403 MB. This significantly outperforms the Transformer model with vanilla attention. Furthermore, the multi-fusion model proved to be a powerful tool for evaluating capacity in NCA and NCM cells using transfer learning. The results emphasize its ability to reduce computational complexity, energy consumption, and memory usage, while maintaining high accuracy and robust generalization capabilities.</div></div>","PeriodicalId":36355,"journal":{"name":"Etransportation","volume":"23 ","pages":"Article 100383"},"PeriodicalIF":15.0000,"publicationDate":"2024-11-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Resource-efficient artificial intelligence for battery capacity estimation using convolutional FlashAttention fusion networks\",\"authors\":\"Zhilong Lv ,&nbsp;Jingyuan Zhao\",\"doi\":\"10.1016/j.etran.2024.100383\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Accurate battery capacity estimation is crucial for optimizing lifespan and monitoring health conditions. Deep learning has made notable strides in addressing long-standing issues in the artificial intelligence community. However, large AI models often face challenges such as high computational resource consumption, extended training times, and elevated deployment costs. To address these issues, we developed an efficient end-to-end hybrid fusion neural network model. This model combines FlashAttention-2 with local feature extraction through convolutional neural networks (CNNs), significantly reducing memory usage and computational demands while maintaining precise and efficient health estimation. 
For practical implementation, the model uses only basic parameters, such as voltage and charge, and employs partial charging data (from 80 % SOC to the upper limit voltage) as features, without requiring complex feature engineering. We evaluated the model using three datasets: 77 lithium iron phosphate (LFP) cells, 16 nickel cobalt aluminum (NCA) cells, and 50 nickel cobalt manganese (NCM) oxide cells. For LFP battery health estimation, the model achieved a root mean square error of 0.109 %, a coefficient of determination of 0.99, and a mean absolute percentage error of 0.096 %. Moreover, the proposed convolutional and flash-attention fusion networks deliver an average inference time of 57 milliseconds for health diagnosis across the full battery life cycle (approximately 1898 cycles per cell). The resource-efficient AI (REAI) model operates at an average of 1.36 billion floating point operations per second (FLOPs), with GPU power consumption of 17W and memory usage of 403 MB. This significantly outperforms the Transformer model with vanilla attention. Furthermore, the multi-fusion model proved to be a powerful tool for evaluating capacity in NCA and NCM cells using transfer learning. The results emphasize its ability to reduce computational complexity, energy consumption, and memory usage, while maintaining high accuracy and robust generalization capabilities.</div></div>\",\"PeriodicalId\":36355,\"journal\":{\"name\":\"Etransportation\",\"volume\":\"23 \",\"pages\":\"Article 100383\"},\"PeriodicalIF\":15.0000,\"publicationDate\":\"2024-11-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Etransportation\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S2590116824000730\",\"RegionNum\":1,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENERGY & FUELS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Etransportation","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2590116824000730","RegionNum":1,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENERGY & FUELS","Score":null,"Total":0}
Citations: 0

Abstract

Accurate battery capacity estimation is crucial for optimizing lifespan and monitoring health conditions. Deep learning has made notable strides in addressing long-standing issues in the artificial intelligence community. However, large AI models often face challenges such as high computational resource consumption, extended training times, and elevated deployment costs. To address these issues, we developed an efficient end-to-end hybrid fusion neural network model. This model combines FlashAttention-2 with local feature extraction through convolutional neural networks (CNNs), significantly reducing memory usage and computational demands while maintaining precise and efficient health estimation. For practical implementation, the model uses only basic parameters, such as voltage and charge, and employs partial charging data (from 80 % SOC to the upper limit voltage) as features, without requiring complex feature engineering. We evaluated the model using three datasets: 77 lithium iron phosphate (LFP) cells, 16 nickel cobalt aluminum (NCA) cells, and 50 nickel cobalt manganese (NCM) oxide cells. For LFP battery health estimation, the model achieved a root mean square error of 0.109 %, a coefficient of determination of 0.99, and a mean absolute percentage error of 0.096 %. Moreover, the proposed convolutional and flash-attention fusion networks deliver an average inference time of 57 milliseconds for health diagnosis across the full battery life cycle (approximately 1898 cycles per cell). The resource-efficient AI (REAI) model operates at an average of 1.36 billion floating point operations per second (FLOPs), with GPU power consumption of 17W and memory usage of 403 MB. This significantly outperforms the Transformer model with vanilla attention. Furthermore, the multi-fusion model proved to be a powerful tool for evaluating capacity in NCA and NCM cells using transfer learning. The results emphasize its ability to reduce computational complexity, energy consumption, and memory usage, while maintaining high accuracy and robust generalization capabilities.
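To make the described architecture concrete, the following is a minimal PyTorch sketch of a convolutional + FlashAttention fusion regressor of the kind the abstract outlines: a small CNN extracts local features from the partial charging curve (voltage and charge from 80 % SOC to the upper cut-off voltage), a multi-head attention layer backed by PyTorch's fused `scaled_dot_product_attention` (which dispatches to FlashAttention-style kernels on supported GPUs) captures longer-range dependencies, and a small head regresses capacity. This is not the authors' implementation; the layer widths, kernel sizes, pooling choice, and the 128-sample window length are illustrative assumptions.

```python
# Minimal sketch of a CNN + FlashAttention hybrid regressor for capacity
# estimation from a partial charging curve. Assumes PyTorch >= 2.0; the input
# is a (batch, 2, length) tensor of voltage and charge samples taken from the
# 80% SOC-to-upper-cutoff window. All sizes below are illustrative, not the
# paper's configuration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConvFlashAttentionRegressor(nn.Module):
    """CNN front end for local features, fused attention, regression head."""

    def __init__(self, in_channels: int = 2, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        # Local feature extraction along the partial charging curve.
        self.conv = nn.Sequential(
            nn.Conv1d(in_channels, d_model, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(d_model, d_model, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        self.n_heads = n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.head = nn.Sequential(nn.Linear(d_model, 32), nn.ReLU(), nn.Linear(32, 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 2, length) holding voltage and charge samples.
        h = self.conv(x).transpose(1, 2)            # -> (batch, length, d_model)
        b, length, d = h.shape
        q, k, v = self.qkv(h).chunk(3, dim=-1)
        # Reshape to (batch, heads, length, head_dim) for multi-head attention.
        q, k, v = (t.view(b, length, self.n_heads, d // self.n_heads).transpose(1, 2)
                   for t in (q, k, v))
        # On supported GPUs this dispatches to a fused FlashAttention-style
        # kernel, avoiding materializing the full attention matrix in memory.
        attn = F.scaled_dot_product_attention(q, k, v)
        attn = attn.transpose(1, 2).reshape(b, length, d)
        # Global average pooling over the curve, then regress capacity/SOH.
        return self.head(attn.mean(dim=1)).squeeze(-1)


if __name__ == "__main__":
    model = ConvFlashAttentionRegressor()
    curves = torch.randn(8, 2, 128)   # 8 cells, 128 (V, Q) samples per partial charge
    print(model(curves).shape)        # torch.Size([8])
```

In practice, each partial charging segment would be resampled to a fixed length and the target normalized (e.g., measured capacity relative to nominal) before training with an MSE loss; RMSE, R², and MAPE on held-out cells would then be computed from the resulting predictions.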
Source journal
eTransportation (Engineering – Automotive Engineering)
CiteScore: 19.80
Self-citation rate: 12.60%
Articles per year: 57
Review time: 39 days
Journal introduction: eTransportation is a scholarly journal that aims to advance knowledge in the field of electric transportation. It focuses on all modes of transportation that use electricity as their primary source of energy, including electric vehicles, trains, ships, and aircraft. The journal covers all stages of research, development, and testing of new technologies, systems, and devices related to electrical transportation, and welcomes simulation and analysis tools at the system, transport, or device level. Its primary emphasis is on the electrical and electronic aspects of transportation systems; research on mechanical parts or subsystems of vehicles is considered only where there is a clear interaction with electrical or electronic equipment. Sociological, political, regulatory, and environmental aspects are outside the journal's scope.