Resource-efficient artificial intelligence for battery capacity estimation using convolutional FlashAttention fusion networks

Zhilong Lv, Jingyuan Zhao

eTransportation, Volume 23, Article 100383. DOI: 10.1016/j.etran.2024.100383. Published 2024-11-22 (Journal Article). JCR Q1, Energy & Fuels; Impact Factor 15.0.
Accurate battery capacity estimation is crucial for optimizing lifespan and monitoring health conditions. Deep learning has made notable strides in addressing long-standing issues in the artificial intelligence community. However, large AI models often face challenges such as high computational resource consumption, extended training times, and elevated deployment costs. To address these issues, we developed an efficient end-to-end hybrid fusion neural network model. This model combines FlashAttention-2 with local feature extraction through convolutional neural networks (CNNs), significantly reducing memory usage and computational demands while maintaining precise and efficient health estimation. For practical implementation, the model uses only basic parameters, such as voltage and charge, and employs partial charging data (from 80 % SOC to the upper voltage limit) as features, without requiring complex feature engineering. We evaluated the model using three datasets: 77 lithium iron phosphate (LFP) cells, 16 nickel cobalt aluminum (NCA) cells, and 50 nickel cobalt manganese (NCM) oxide cells. For LFP battery health estimation, the model achieved a root mean square error of 0.109 %, a coefficient of determination of 0.99, and a mean absolute percentage error of 0.096 %. Moreover, the proposed convolutional and FlashAttention fusion networks deliver an average inference time of 57 milliseconds for health diagnosis across the full battery life cycle (approximately 1898 cycles per cell). The resource-efficient AI (REAI) model operates at an average of 1.36 billion floating point operations per second (FLOPs), with GPU power consumption of 17 W and memory usage of 403 MB. This significantly outperforms the Transformer model with vanilla attention. Furthermore, the multi-fusion model proved to be a powerful tool for evaluating capacity in NCA and NCM cells using transfer learning.
The results emphasize its ability to reduce computational complexity, energy consumption, and memory usage, while maintaining high accuracy and robust generalization capabilities.
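The accuracy figures reported in the abstract (RMSE of 0.109 %, R² of 0.99, MAPE of 0.096 %) follow the standard definitions of these metrics. A minimal sketch of how they are computed from true and predicted capacities; the capacity values below are illustrative placeholders, not data from the paper:

```python
import math

def rmse(y_true, y_pred):
    # Root mean square error of the residuals
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def mape(y_true, y_pred):
    # Mean absolute percentage error, expressed in percent
    return 100.0 * sum(abs((t - p) / t) for t, p in zip(y_true, y_pred)) / len(y_true)

def r2(y_true, y_pred):
    # Coefficient of determination: 1 - SS_res / SS_tot
    mean_t = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

# Illustrative cell capacities in Ah (hypothetical, not the paper's datasets)
true_cap = [1.10, 1.08, 1.05, 1.02, 0.98]
pred_cap = [1.09, 1.08, 1.06, 1.01, 0.99]
print(rmse(true_cap, pred_cap), mape(true_cap, pred_cap), r2(true_cap, pred_cap))
```

Note that the paper reports RMSE as a percentage, which suggests errors are computed on normalized capacity (state of health) rather than raw Ah values.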
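The memory saving the abstract attributes to FlashAttention-2 comes from how attention is evaluated, not from a different operator: FlashAttention-2 produces exactly the output of standard scaled dot-product attention, but computes it in tiles so the full n-by-n score matrix is never materialized. A NumPy sketch of the reference computation, with toy shapes unrelated to the paper's network:

```python
import numpy as np

def attention(q, k, v):
    """Standard scaled dot-product attention over sequences of feature vectors.

    FlashAttention-2 reproduces this result exactly; its savings come from
    tiling the computation so the (n, n) score matrix below is never stored.
    """
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                   # (n, n) -- the memory bottleneck
    scores -= scores.max(axis=-1, keepdims=True)    # subtract row max for stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ v

# Toy input: 4 time steps of 8-dimensional features, e.g. CNN outputs (random)
rng = np.random.default_rng(0)
q = rng.standard_normal((4, 8))
k = rng.standard_normal((4, 8))
v = rng.standard_normal((4, 8))
out = attention(q, k, v)
print(out.shape)  # (4, 8)
```

In a hybrid design like the one described, the CNN extracts local features from the partial charging curve and attention then mixes them across the sequence; the exact fusion layout here is the paper's contribution and is not reproduced in this sketch.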
About the journal:
eTransportation is a scholarly journal that aims to advance knowledge in the field of electric transportation. It focuses on all modes of transportation that utilize electricity as their primary source of energy, including electric vehicles, trains, ships, and aircraft. The journal covers all stages of research, development, and testing of new technologies, systems, and devices related to electrical transportation.
The journal welcomes the use of simulation and analysis tools at the system, transport, or device level. Its primary emphasis is on the study of the electrical and electronic aspects of transportation systems. However, it also considers research on mechanical parts or subsystems of vehicles if there is a clear interaction with electrical or electronic equipment.
Please note that sociological, political, regulatory, and environmental aspects fall outside the journal's scope.