Bidirectional Attention LSTM Networks for Non-instructive Load Monitoring

Yuwei Fan, Chao Liu, Tengbo Guo, D. Jiang
DOI: 10.1109/PHM2022-London52454.2022.00076
Published in: 2022 Prognostics and Health Management Conference (PHM-2022 London), May 2022

Abstract

Non-instructive load monitoring (NILM) is a data processing method that disaggregates total energy consumption and estimates the power of individual electrical appliances. NILM can provide additional information for optimal control strategies in the smart grid, enabling energy savings through fine-grained management. However, traditional NILM methods do not achieve high accuracy in the disaggregated power values. In this work, we apply long short-term memory (LSTM) networks and achieve good accuracy by enhancing the LSTM model with bidirectional and attention mechanisms, as well as kernel density estimation. The model first normalizes the total energy consumption and converts the normalized data into time series of fixed length. The LSTM extracts features from these time series, with the bidirectional mechanism operating over both forward and reverse order and the attention mechanism computing attention weights for the different time steps. In addition, kernel density estimation is used to fit the training data and to adjust the output of the deep learning model, which improves the disaggregation accuracy. The proposed model is evaluated on the UK-DALE dataset.
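The abstract outlines a pipeline: normalize the aggregate power signal, cut it into fixed-length windows, run a bidirectional attention LSTM over the windows, and post-process the predictions with a kernel density estimate fitted to the training data. The following is a minimal NumPy sketch of the windowing, attention-pooling, and KDE post-processing steps only; the recurrent network itself is omitted, and the window length, kernel bandwidth, and mode-snapping heuristic are illustrative assumptions, not parameters stated in the paper.

```python
import numpy as np

def normalize(x):
    # Min-max normalize the aggregate power signal to [0, 1]
    return (x - x.min()) / (x.max() - x.min())

def make_windows(x, length, stride=1):
    # Convert the (normalized) series into fixed-length sliding windows
    n = (len(x) - length) // stride + 1
    return np.stack([x[i * stride : i * stride + length] for i in range(n)])

def attention_pool(h, w):
    # Attention over time steps: h is (T, D) hidden states (e.g. from a
    # BiLSTM), w is a (D,) scoring vector; softmax the scores and return
    # the attention-weighted sum of the hidden states
    scores = h @ w
    a = np.exp(scores - scores.max())
    a /= a.sum()
    return a @ h

def kde_modes(samples, grid, bandwidth=10.0):
    # Gaussian kernel density estimate of appliance power levels (watts)
    # seen in training, evaluated on a fixed power grid
    diffs = (grid[:, None] - samples[None, :]) / bandwidth
    density = np.exp(-0.5 * diffs ** 2).sum(axis=1)
    return density / density.sum()

def snap_to_density(pred, grid, density, top_k=3):
    # Adjust model outputs by snapping each predicted power value to the
    # nearest of the k highest-density power levels from training
    modes = grid[np.argsort(density)[-top_k:]]
    return modes[np.abs(modes[:, None] - pred[None, :]).argmin(axis=0)]
```

For example, with training samples clustered at 0 W (off) and 2000 W (on), `snap_to_density` would pull a noisy prediction of 1900 W onto the 2000 W state, which is the kind of output correction the abstract attributes to the KDE step.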