Mitigating Drift in Time Series Data with Noise Augmentation

Tonya Fields, G. Hsieh, Jules Chenou
{"title":"用噪声增强方法抑制时间序列数据的漂移","authors":"Tonya Fields, G. Hsieh, Jules Chenou","doi":"10.1109/CSCI49370.2019.00046","DOIUrl":null,"url":null,"abstract":"Machine leaning (ML) models must be accurate to produce quality AI solutions. There must be high accuracy in the data and with the model that is built using the data. Online machine learning algorithms fits naturally with use cases that involves time series data. In online environments the data distribution can change over time producing what is known as concept drift. Real-life, real-time, machine learning algorithms operating in dynamic environments must be able to detect any drift or changes in the data distribution and adapt and update the ML model in the face of data that changes over time. In this paper we present the work of a simulated drift added to time series ML models. We simulate drift on Multiplayer perceptron (MLP), Long Short Term Memory (LSTM), Convolution Neural Networks (CNN) and Gated Recurrent Unit (GRU). Results show ML models with flavors of recurrent neural network (RNN) are less sensitive to drift compared to other models. By adding noise to the training set, we can recover accuracy of the model in the face of drift.","PeriodicalId":103662,"journal":{"name":"2019 International Conference on Computational Science and Computational Intelligence (CSCI)","volume":"43 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"21","resultStr":"{\"title\":\"Mitigating Drift in Time Series Data with Noise Augmentation\",\"authors\":\"Tonya Fields, G. Hsieh, Jules Chenou\",\"doi\":\"10.1109/CSCI49370.2019.00046\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Machine leaning (ML) models must be accurate to produce quality AI solutions. There must be high accuracy in the data and with the model that is built using the data. Online machine learning algorithms fits naturally with use cases that involves time series data. In online environments the data distribution can change over time producing what is known as concept drift. Real-life, real-time, machine learning algorithms operating in dynamic environments must be able to detect any drift or changes in the data distribution and adapt and update the ML model in the face of data that changes over time. In this paper we present the work of a simulated drift added to time series ML models. We simulate drift on Multiplayer perceptron (MLP), Long Short Term Memory (LSTM), Convolution Neural Networks (CNN) and Gated Recurrent Unit (GRU). Results show ML models with flavors of recurrent neural network (RNN) are less sensitive to drift compared to other models. 
By adding noise to the training set, we can recover accuracy of the model in the face of drift.\",\"PeriodicalId\":103662,\"journal\":{\"name\":\"2019 International Conference on Computational Science and Computational Intelligence (CSCI)\",\"volume\":\"43 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"21\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2019 International Conference on Computational Science and Computational Intelligence (CSCI)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CSCI49370.2019.00046\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 International Conference on Computational Science and Computational Intelligence (CSCI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CSCI49370.2019.00046","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 21

Abstract

Machine learning (ML) models must be accurate to produce quality AI solutions: there must be high accuracy both in the data and in the model built from that data. Online machine learning algorithms fit naturally with use cases that involve time series data. In online environments the data distribution can change over time, producing what is known as concept drift. Real-life, real-time machine learning algorithms operating in dynamic environments must be able to detect any drift or change in the data distribution, and adapt and update the ML model as the data changes over time. In this paper we present work on simulated drift added to time series ML models. We simulate drift on Multilayer Perceptron (MLP), Long Short-Term Memory (LSTM), Convolutional Neural Network (CNN), and Gated Recurrent Unit (GRU) models. Results show that ML models based on recurrent neural networks (RNNs) are less sensitive to drift than the other models. By adding noise to the training set, we can recover the accuracy of the model in the face of drift.
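To make the two ideas in the abstract concrete, simulating concept drift on a time series and augmenting the training set with noise, here is a minimal sketch in Python/NumPy. The linear mean-shift form of the drift, the Gaussian jitter level (sigma = 0.1), the window width, and all function names are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (not the authors' code): simulate concept drift on a
# synthetic time series and augment the training windows with Gaussian noise.
# The drift form (linear mean shift) and noise level (sigma) are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def make_series(n=1000):
    """Synthetic seasonal series standing in for the real data."""
    t = np.arange(n)
    return np.sin(2 * np.pi * t / 50) + 0.1 * rng.standard_normal(n)

def add_drift(series, magnitude=1.0):
    """Simulate gradual concept drift as a linear mean shift over time."""
    ramp = np.linspace(0.0, magnitude, len(series))
    return series + ramp

def to_windows(series, width=20):
    """Slice a 1-D series into overlapping fixed-width training windows."""
    return np.stack([series[i:i + width]
                     for i in range(len(series) - width)])

def augment_with_noise(windows, sigma=0.1, copies=3):
    """Noise augmentation: jitter each training window with Gaussian noise."""
    noisy = [windows + sigma * rng.standard_normal(windows.shape)
             for _ in range(copies)]
    return np.concatenate([windows] + noisy, axis=0)

train = to_windows(make_series())
train_aug = augment_with_noise(train)                 # larger, noise-robust training set
test_drifted = to_windows(add_drift(make_series()))   # evaluation stream with drift
print(train.shape, train_aug.shape, test_drifted.shape)
```

In this setup, train_aug would be used to fit a forecasting model (for example, the MLP, LSTM, CNN, or GRU variants mentioned above), which would then be evaluated on the drifted windows to measure how much accuracy the noise augmentation recovers.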