{"title":"Extracting key temporal and cyclic features from VIT data to predict lithium-ion battery knee points using attention mechanisms","authors":"Jaewook Lee , Seongmin Heo , Jay H. Lee","doi":"10.1016/j.compchemeng.2024.108931","DOIUrl":null,"url":null,"abstract":"<div><div>Accurate prediction of lithium-ion battery lifespan is crucial for mitigating risks, as battery cycling experiments are time-consuming and costly. Despite this, few studies have effectively leveraged cycling data with minimal information loss and optimized input size. To bridge this gap, we propose three models that integrate attention layers into a foundational model. Temporal attention helps address the vanishing gradient problem inherent in recurrent neural networks, enabling a manageable input size for subsequent networks. Self-attention applied to context vectors, termed cyclic attention, allows models to efficiently identify key cycles that capture the majority of information across cycles, strategically reducing input size. By employing multi-head attention, the required input size is reduced from 100 to 30 cycles, a significant reduction compared to single-head approaches, as each head accentuates distinct key cycles from various perspectives. Our enhanced model shows a 39.6% improvement in regression performance using only the first 30 cycles, significantly advancing our previous method.</div></div>","PeriodicalId":286,"journal":{"name":"Computers & Chemical Engineering","volume":"193 ","pages":"Article 108931"},"PeriodicalIF":3.9000,"publicationDate":"2024-11-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computers & Chemical Engineering","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0098135424003491","RegionNum":2,"RegionCategory":"Engineering & Technology","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Citations: 0
Abstract
Accurate prediction of lithium-ion battery lifespan is crucial for mitigating risks, as battery cycling experiments are time-consuming and costly. Despite this, few studies have effectively leveraged cycling data with minimal information loss and optimized input size. To bridge this gap, we propose three models that integrate attention layers into a foundational model. Temporal attention helps address the vanishing gradient problem inherent in recurrent neural networks, enabling a manageable input size for subsequent networks. Self-attention applied to context vectors, termed cyclic attention, allows models to efficiently identify key cycles that capture the majority of information across cycles, strategically reducing input size. By employing multi-head attention, the required input size is reduced from 100 to 30 cycles, a significant reduction compared to single-head approaches, as each head accentuates distinct key cycles from various perspectives. Our enhanced model shows a 39.6% improvement in regression performance using only the first 30 cycles, significantly advancing our previous method.
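The cyclic-attention idea described above — self-attention over per-cycle context vectors, with multiple heads each highlighting different key cycles — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the projection weights here are random stand-ins for learned Q/K matrices, and the function names (`multi_head_cycle_attention`, `select_key_cycles`) and the aggregation of attention mass into per-cycle scores are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_cycle_attention(context, n_heads=4, seed=0):
    """Score cycles via multi-head self-attention over per-cycle context vectors.

    context: (n_cycles, d_model) array, one context vector per cycle
    (e.g. produced upstream by a temporal-attention RNN encoder).
    Returns per-cycle importance scores averaged over heads.
    """
    n_cycles, d_model = context.shape
    d_head = d_model // n_heads
    rng = np.random.default_rng(seed)
    scores = np.zeros(n_cycles)
    for _ in range(n_heads):
        # Random projections stand in for learned query/key weights.
        Wq = rng.standard_normal((d_model, d_head)) / np.sqrt(d_model)
        Wk = rng.standard_normal((d_model, d_head)) / np.sqrt(d_model)
        Q, K = context @ Wq, context @ Wk
        attn = softmax(Q @ K.T / np.sqrt(d_head))  # (n_cycles, n_cycles)
        # A cycle that attracts attention from many queries is a "key cycle";
        # each head votes from its own perspective.
        scores += attn.sum(axis=0)
    return scores / n_heads

def select_key_cycles(context, k=30, n_heads=4):
    """Keep the k highest-scoring cycles, shrinking the downstream input size."""
    scores = multi_head_cycle_attention(context, n_heads=n_heads)
    return np.sort(np.argsort(scores)[::-1][:k])
```

With 100 cycles of context vectors, `select_key_cycles(context, k=30)` mirrors the paper's reduction from 100 to 30 input cycles: only the retained key cycles are passed to the subsequent regression network.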
Journal overview:
Computers & Chemical Engineering is primarily a journal of record for new developments in the application of computing and systems technology to chemical engineering problems.