Ruizhi Hun, Song Zhang, Gurjeet Singh, Jian Qian, Yun Chen, Patrick Chiang
{"title":"一个可转移的轻型多头自关注神经网络模型,用于锂离子电池的充电状态估计","authors":"Ruizhi Hun, Song Zhang, Gurjeet Singh, Jian Qian, Yun Chen, Patrick Chiang","doi":"10.1109/SPIES52282.2021.9633780","DOIUrl":null,"url":null,"abstract":"Deep learning has drawn wide attention in state-of-charge (SOC) estimation for batteries in electric vehicles, owning to its model-free feature and powerful ability of nonlinear regression. However, previous studies focused on improving estimation accuracy, with less research devoted to computational efficiency and hardware implementation. This paper proposes a light-weight multi-head self-attention neural network with only 298 trainable parameters and 1.18K floating point operations (FLOPs), while achieving 2.10% RMSE at all conditions. This proposed self-attention and simple Recurrent Neural Networks (RNN) can operate independently in the battery model, and can be greatly accelerated by on-chip parallel execution. Furthermore, a transfer learning (TL) method was applied to the proposed model. The proposed model with transfer learning performs significantly better on small training subsets and stopped with fewer epochs using an early stopping strategy. 
To the best of our knowledge, it is the first model using Transformer and self-attention for SOC estimation.","PeriodicalId":411512,"journal":{"name":"2021 3rd International Conference on Smart Power & Internet Energy Systems (SPIES)","volume":"262 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-09-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"LISA: A Transferable Light-weight Multi-Head Self-Attention Neural Network Model for Lithium-Ion Batteries State-of-Charge Estimation\",\"authors\":\"Ruizhi Hun, Song Zhang, Gurjeet Singh, Jian Qian, Yun Chen, Patrick Chiang\",\"doi\":\"10.1109/SPIES52282.2021.9633780\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Deep learning has drawn wide attention in state-of-charge (SOC) estimation for batteries in electric vehicles, owning to its model-free feature and powerful ability of nonlinear regression. However, previous studies focused on improving estimation accuracy, with less research devoted to computational efficiency and hardware implementation. This paper proposes a light-weight multi-head self-attention neural network with only 298 trainable parameters and 1.18K floating point operations (FLOPs), while achieving 2.10% RMSE at all conditions. This proposed self-attention and simple Recurrent Neural Networks (RNN) can operate independently in the battery model, and can be greatly accelerated by on-chip parallel execution. Furthermore, a transfer learning (TL) method was applied to the proposed model. The proposed model with transfer learning performs significantly better on small training subsets and stopped with fewer epochs using an early stopping strategy. 
To the best of our knowledge, it is the first model using Transformer and self-attention for SOC estimation.\",\"PeriodicalId\":411512,\"journal\":{\"name\":\"2021 3rd International Conference on Smart Power & Internet Energy Systems (SPIES)\",\"volume\":\"262 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-09-25\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 3rd International Conference on Smart Power & Internet Energy Systems (SPIES)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/SPIES52282.2021.9633780\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 3rd International Conference on Smart Power & Internet Energy Systems (SPIES)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SPIES52282.2021.9633780","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
LISA: A Transferable Light-weight Multi-Head Self-Attention Neural Network Model for Lithium-Ion Batteries State-of-Charge Estimation
Deep learning has drawn wide attention in state-of-charge (SOC) estimation for batteries in electric vehicles, owing to its model-free nature and powerful nonlinear regression capability. However, previous studies have focused on improving estimation accuracy, with less research devoted to computational efficiency and hardware implementation. This paper proposes a light-weight multi-head self-attention neural network with only 298 trainable parameters and 1.18K floating-point operations (FLOPs), while achieving 2.10% RMSE across all test conditions. The proposed self-attention heads and the simple recurrent neural network (RNN) can operate independently within the battery model and can be greatly accelerated by on-chip parallel execution. Furthermore, a transfer learning (TL) method was applied to the proposed model. With transfer learning, the model performs significantly better on small training subsets and converges in fewer epochs under an early-stopping strategy. To the best of our knowledge, this is the first model to use a Transformer and self-attention for SOC estimation.
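For readers unfamiliar with the core building block, the following is a minimal NumPy sketch of scaled dot-product multi-head self-attention of the kind the abstract describes. It is an illustration of the general mechanism only: the dimensions, projection matrices, and head count here are hypothetical and are not taken from the paper (the paper's 298-parameter architecture is not reproduced).

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(x, wq, wk, wv, num_heads):
    """Scaled dot-product self-attention split across heads.

    x:          (seq_len, d_model) input sequence, e.g. a window of
                voltage/current/temperature features per time step.
    wq, wk, wv: (d_model, d_model) projection matrices (hypothetical;
                chosen here for illustration only).
    """
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    q, k, v = x @ wq, x @ wk, x @ wv

    # Split each projection into heads: (num_heads, seq_len, d_head).
    def split(t):
        return t.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    q, k, v = split(q), split(k), split(v)

    # Attention weights per head: (num_heads, seq_len, seq_len).
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    out = softmax(scores) @ v  # (num_heads, seq_len, d_head)

    # Concatenate heads back to (seq_len, d_model).
    return out.transpose(1, 0, 2).reshape(seq_len, d_model)

rng = np.random.default_rng(0)
d_model, seq_len, heads = 8, 5, 2
x = rng.standard_normal((seq_len, d_model))
wq, wk, wv = (rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(3))
y = multi_head_self_attention(x, wq, wk, wv, num_heads=heads)
print(y.shape)  # (5, 8)
```

Because each head's attention (and, per the abstract, the parallel RNN branch) operates on independent slices of the projections, the per-head matrix products can be dispatched concurrently, which is what makes on-chip parallel execution attractive for such a model.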