LISA: A Transferable Light-weight Multi-Head Self-Attention Neural Network Model for Lithium-Ion Batteries State-of-Charge Estimation

Ruizhi Hun, Song Zhang, Gurjeet Singh, Jian Qian, Yun Chen, Patrick Chiang
{"title":"一个可转移的轻型多头自关注神经网络模型,用于锂离子电池的充电状态估计","authors":"Ruizhi Hun, Song Zhang, Gurjeet Singh, Jian Qian, Yun Chen, Patrick Chiang","doi":"10.1109/SPIES52282.2021.9633780","DOIUrl":null,"url":null,"abstract":"Deep learning has drawn wide attention in state-of-charge (SOC) estimation for batteries in electric vehicles, owning to its model-free feature and powerful ability of nonlinear regression. However, previous studies focused on improving estimation accuracy, with less research devoted to computational efficiency and hardware implementation. This paper proposes a light-weight multi-head self-attention neural network with only 298 trainable parameters and 1.18K floating point operations (FLOPs), while achieving 2.10% RMSE at all conditions. This proposed self-attention and simple Recurrent Neural Networks (RNN) can operate independently in the battery model, and can be greatly accelerated by on-chip parallel execution. Furthermore, a transfer learning (TL) method was applied to the proposed model. The proposed model with transfer learning performs significantly better on small training subsets and stopped with fewer epochs using an early stopping strategy. To the best of our knowledge, it is the first model using Transformer and self-attention for SOC estimation.","PeriodicalId":411512,"journal":{"name":"2021 3rd International Conference on Smart Power & Internet Energy Systems (SPIES)","volume":"262 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-09-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"LISA: A Transferable Light-weight Multi-Head Self-Attention Neural Network Model for Lithium-Ion Batteries State-of-Charge Estimation\",\"authors\":\"Ruizhi Hun, Song Zhang, Gurjeet Singh, Jian Qian, Yun Chen, Patrick Chiang\",\"doi\":\"10.1109/SPIES52282.2021.9633780\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Deep learning has drawn wide attention in state-of-charge (SOC) estimation for batteries in electric vehicles, owning to its model-free feature and powerful ability of nonlinear regression. However, previous studies focused on improving estimation accuracy, with less research devoted to computational efficiency and hardware implementation. This paper proposes a light-weight multi-head self-attention neural network with only 298 trainable parameters and 1.18K floating point operations (FLOPs), while achieving 2.10% RMSE at all conditions. This proposed self-attention and simple Recurrent Neural Networks (RNN) can operate independently in the battery model, and can be greatly accelerated by on-chip parallel execution. Furthermore, a transfer learning (TL) method was applied to the proposed model. The proposed model with transfer learning performs significantly better on small training subsets and stopped with fewer epochs using an early stopping strategy. 
To the best of our knowledge, it is the first model using Transformer and self-attention for SOC estimation.\",\"PeriodicalId\":411512,\"journal\":{\"name\":\"2021 3rd International Conference on Smart Power & Internet Energy Systems (SPIES)\",\"volume\":\"262 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-09-25\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 3rd International Conference on Smart Power & Internet Energy Systems (SPIES)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/SPIES52282.2021.9633780\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 3rd International Conference on Smart Power & Internet Energy Systems (SPIES)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SPIES52282.2021.9633780","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1

Abstract

Deep learning has drawn wide attention in state-of-charge (SOC) estimation for batteries in electric vehicles, owing to its model-free nature and powerful nonlinear regression ability. However, previous studies focused on improving estimation accuracy, with less research devoted to computational efficiency and hardware implementation. This paper proposes a light-weight multi-head self-attention neural network with only 298 trainable parameters and 1.18K floating-point operations (FLOPs), while achieving 2.10% RMSE across all test conditions. The proposed self-attention mechanism and a simple recurrent neural network (RNN) operate independently within the battery model and can therefore be greatly accelerated by on-chip parallel execution. Furthermore, a transfer learning (TL) method was applied to the proposed model; with transfer learning, the model performs significantly better on small training subsets and stops after fewer epochs under an early-stopping strategy. To the best of our knowledge, this is the first model to use the Transformer and self-attention for SOC estimation.
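The abstract describes the architecture only at a high level: a multi-head self-attention block and a simple RNN that run as independent branches, with roughly 298 trainable parameters in total. As a rough illustration of how such a light-weight estimator could be wired up, here is a minimal PyTorch sketch; the input features (voltage, current, temperature), layer widths, and head count are illustrative assumptions, not the authors' published configuration.

```python
import torch
import torch.nn as nn

class TinySOCNet(nn.Module):
    """Minimal sketch of a light-weight self-attention + RNN SOC estimator.

    The hyperparameters (d_model=4, 2 heads, rnn_hidden=4) are guesses chosen
    to keep the parameter count tiny; they are NOT the paper's values.
    """
    def __init__(self, n_features=3, d_model=4, n_heads=2, rnn_hidden=4):
        super().__init__()
        self.proj = nn.Linear(n_features, d_model)           # embed V, I, T samples
        self.attn = nn.MultiheadAttention(d_model, n_heads,
                                          batch_first=True)  # multi-head self-attention
        self.rnn = nn.RNN(n_features, rnn_hidden,
                          batch_first=True)                   # simple (Elman) RNN branch
        self.head = nn.Linear(d_model + rnn_hidden, 1)        # fuse both branches -> SOC

    def forward(self, x):                  # x: (batch, seq_len, n_features)
        z = self.proj(x)
        a, _ = self.attn(z, z, z)          # attention branch
        r, _ = self.rnn(x)                 # RNN branch; no dependency on the attention
        fused = torch.cat([a[:, -1], r[:, -1]], dim=-1)   # last time step of each branch
        return self.head(fused).squeeze(-1)               # SOC estimate

model = TinySOCNet()
n_params = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"trainable parameters: {n_params}")  # small, though not exactly the paper's 298
```

Because the two branches share no intermediate state, they can be evaluated concurrently, which is the property the authors exploit for on-chip parallel execution.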
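The abstract also reports that, with transfer learning, the model trains well on small target subsets and stops after fewer epochs via early stopping. A generic fine-tuning loop of that shape is sketched below; the pretrained checkpoint name, patience value, and optimizer settings are assumptions, not details from the paper.

```python
import copy
import torch

def fine_tune(model, train_loader, val_loader, epochs=100, patience=5, lr=1e-3):
    """Transfer-learning loop with patience-based early stopping (illustrative).

    Assumes `model` already carries weights pretrained on a source battery
    dataset, e.g. model.load_state_dict(torch.load("pretrained_soc.pt")).
    """
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = torch.nn.MSELoss()
    best_val, best_state, stale = float("inf"), None, 0

    for epoch in range(epochs):
        model.train()
        for x, y in train_loader:            # small target-domain training subset
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()

        model.eval()
        with torch.no_grad():                # aggregate validation loss
            val = sum(loss_fn(model(x), y).item() for x, y in val_loader)

        if val < best_val:                   # improvement: checkpoint and reset counter
            best_val, best_state, stale = val, copy.deepcopy(model.state_dict()), 0
        else:
            stale += 1
            if stale >= patience:            # early stopping: patience exhausted
                break

    if best_state is not None:
        model.load_state_dict(best_state)    # restore the best checkpoint
    return model
```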
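The 2.10% RMSE figure is presumably the root-mean-square error between predicted and true SOC, expressed in percentage points. A standard way to compute it is sketched below; how the paper pools samples across operating conditions is an assumption here.

```python
import torch

def soc_rmse_percent(model, loader):
    """RMSE of SOC predictions in percentage points (SOC assumed in [0, 1])."""
    model.eval()
    sq_err, n = 0.0, 0
    with torch.no_grad():
        for x, y in loader:                  # y: ground-truth SOC
            err = (model(x) - y) * 100.0     # fraction -> percentage points
            sq_err += (err ** 2).sum().item()
            n += y.numel()
    return (sq_err / n) ** 0.5               # e.g. 2.10 means 2.10% RMSE
```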