[SoK] The Great GAN Bake Off, An Extensive Systematic Evaluation of Generative Adversarial Network Architectures for Time Series Synthesis

J. Syst. Res., Pub Date: 2022-03-05, DOI: 10.5070/sr32159045
Mark Leznik, Arne Lochner, S. Wesner, Jörg Domaschka
{"title":"[j] .大GAN烘烤,时间序列合成生成对抗网络体系结构的广泛系统评估","authors":"Mark Leznik, Arne Lochner, S. Wesner, Jörg Domaschka","doi":"10.5070/sr32159045","DOIUrl":null,"url":null,"abstract":"There is no standard approach to compare the success of different neural network architectures utilized for time series synthesis. This hinders the evaluation and decision, which architecture should be leveraged for an unknown data set. We propose a combination of metrics, which empirically evaluate the performance of neural network architectures trained for time series synthesis. With these measurements we are able to account for temporal correlations, spatial correlations and mode collapse issues within the generated time series. We further investigate the interaction of different generator and discriminator architectures between each other. The considered architectures include recurrent neural networks, temporal convolutional networks and transformer-based networks. So far, the application of transformer-based models is limited for time series synthesis. Hence, we propose a new transformer-based architecture, which is able to synthesise time series. We evaluate the proposed architectures and their comobinations in over 500 experiments, amounting to over 2500 computing hours. We provide results for four data data sets, one univariate and three multivariate. The data sets vary with regard to length, patterns in temporal and spatial correlations. We use our metrics to compare the performance of generative adversarial network architectures for time series synthesis. To verify our findings we utilize quantitative and qualitative evaluations. Our results indicate that temporal convolutional networks outperform recurrent neural network and transformer based approaches with regard to fidelity and flexibility of the generated data. Temporal convolutional network architecture are the most stable architecture for a mode collapse prone data set. The performance of the transformer models strongly depends on the data set characteristics, it struggled to synthesise data sets with high temporal and spatial correlations. Discriminators with recurrent network architectures suffered immensely from vanishing gradients. We also show, that the performance of the generative adversarial networks depends more on the discriminator part rather than the generator part.","PeriodicalId":363427,"journal":{"name":"J. Syst. Res.","volume":"7 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-03-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"[SoK] The Great GAN Bake Off, An Extensive Systematic Evaluation of Generative Adversarial Network Architectures for Time Series Synthesis\",\"authors\":\"Mark Leznik, Arne Lochner, S. Wesner, Jörg Domaschka\",\"doi\":\"10.5070/sr32159045\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"There is no standard approach to compare the success of different neural network architectures utilized for time series synthesis. This hinders the evaluation and decision, which architecture should be leveraged for an unknown data set. We propose a combination of metrics, which empirically evaluate the performance of neural network architectures trained for time series synthesis. With these measurements we are able to account for temporal correlations, spatial correlations and mode collapse issues within the generated time series. 
We further investigate the interaction of different generator and discriminator architectures between each other. The considered architectures include recurrent neural networks, temporal convolutional networks and transformer-based networks. So far, the application of transformer-based models is limited for time series synthesis. Hence, we propose a new transformer-based architecture, which is able to synthesise time series. We evaluate the proposed architectures and their comobinations in over 500 experiments, amounting to over 2500 computing hours. We provide results for four data data sets, one univariate and three multivariate. The data sets vary with regard to length, patterns in temporal and spatial correlations. We use our metrics to compare the performance of generative adversarial network architectures for time series synthesis. To verify our findings we utilize quantitative and qualitative evaluations. Our results indicate that temporal convolutional networks outperform recurrent neural network and transformer based approaches with regard to fidelity and flexibility of the generated data. Temporal convolutional network architecture are the most stable architecture for a mode collapse prone data set. The performance of the transformer models strongly depends on the data set characteristics, it struggled to synthesise data sets with high temporal and spatial correlations. Discriminators with recurrent network architectures suffered immensely from vanishing gradients. We also show, that the performance of the generative adversarial networks depends more on the discriminator part rather than the generator part.\",\"PeriodicalId\":363427,\"journal\":{\"name\":\"J. Syst. Res.\",\"volume\":\"7 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-03-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"J. Syst. Res.\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.5070/sr32159045\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"J. Syst. Res.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.5070/sr32159045","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

There is no standard approach to comparing the success of different neural network architectures used for time series synthesis. This hinders evaluating and deciding which architecture should be leveraged for an unknown data set. We propose a combination of metrics that empirically evaluates the performance of neural network architectures trained for time series synthesis. With these measurements we are able to account for temporal correlations, spatial correlations and mode collapse issues within the generated time series. We further investigate how different generator and discriminator architectures interact with each other. The considered architectures include recurrent neural networks, temporal convolutional networks and transformer-based networks. So far, transformer-based models have seen limited use in time series synthesis. Hence, we propose a new transformer-based architecture that is able to synthesise time series. We evaluate the proposed architectures and their combinations in over 500 experiments, amounting to over 2500 computing hours. We provide results for four data sets, one univariate and three multivariate. The data sets vary in length and in their temporal and spatial correlation patterns. We use our metrics to compare the performance of generative adversarial network architectures for time series synthesis. To verify our findings, we use quantitative and qualitative evaluations. Our results indicate that temporal convolutional networks outperform recurrent neural network and transformer-based approaches with regard to the fidelity and flexibility of the generated data. The temporal convolutional network architecture is also the most stable on a data set prone to mode collapse. The performance of the transformer models strongly depends on the data set characteristics; they struggled to synthesise data sets with high temporal and spatial correlations. Discriminators with recurrent network architectures suffered heavily from vanishing gradients. We also show that the performance of the generative adversarial networks depends more on the discriminator than on the generator.
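
The abstract names three properties that the proposed metrics account for (temporal correlations, spatial correlations and mode collapse) without spelling out how they are measured. The NumPy sketch below illustrates one plausible way to quantify each property for batches of real and synthetic series of shape (n_samples, seq_len, n_features); the function names and distance choices are assumptions made here for illustration, not the authors' evaluation code.

```python
import numpy as np


def autocorrelation(x, max_lag):
    """Mean autocorrelation per lag, averaged over samples and features.

    x has shape (n_samples, seq_len, n_features); max_lag must be < seq_len."""
    x = x - x.mean(axis=1, keepdims=True)
    denom = (x ** 2).sum(axis=1)                     # per sample and feature
    acf = []
    for lag in range(1, max_lag + 1):
        num = (x[:, :-lag, :] * x[:, lag:, :]).sum(axis=1)
        acf.append((num / (denom + 1e-8)).mean())
    return np.array(acf)


def temporal_score(real, synth, max_lag=20):
    """Distance between autocorrelation profiles (temporal correlations)."""
    return np.abs(
        autocorrelation(real, max_lag) - autocorrelation(synth, max_lag)
    ).mean()


def spatial_score(real, synth):
    """Distance between cross-feature correlation matrices (spatial correlations)."""
    def corr(x):
        flat = x.reshape(-1, x.shape[-1])            # pool time steps of all samples
        return np.corrcoef(flat, rowvar=False)
    return np.abs(corr(real) - corr(synth)).mean()


def mode_collapse_score(synth):
    """Mean pairwise distance between synthetic samples; a value close to zero
    means the generator keeps emitting (almost) the same series."""
    flat = synth.reshape(synth.shape[0], -1)
    dists = np.linalg.norm(flat[:, None, :] - flat[None, :, :], axis=-1)
    return dists[np.triu_indices(len(flat), k=1)].mean()
```

Read this way, a temporal_score and spatial_score near zero would suggest the generator reproduces the correlation structure of the training data, while a mode_collapse_score much smaller for the synthetic batch than for a comparable real batch would flag collapsed output.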
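
For context on the headline result, a temporal convolutional network is built from causal, dilated 1-D convolutions with residual connections, so each output step depends only on past inputs. The PyTorch snippet below sketches one such residual block; the layer sizes and block layout are assumptions chosen for illustration and do not reproduce the specific architectures evaluated in the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CausalConv1d(nn.Conv1d):
    """1-D convolution that only sees past time steps (left padding)."""

    def __init__(self, in_ch, out_ch, kernel_size, dilation=1):
        super().__init__(in_ch, out_ch, kernel_size, dilation=dilation)
        self.left_pad = (kernel_size - 1) * dilation

    def forward(self, x):                            # x: (batch, channels, time)
        return super().forward(F.pad(x, (self.left_pad, 0)))


class TCNBlock(nn.Module):
    """Residual block of two causal dilated convolutions, the building block
    commonly used in temporal convolutional networks."""

    def __init__(self, channels, kernel_size=3, dilation=1):
        super().__init__()
        self.conv1 = CausalConv1d(channels, channels, kernel_size, dilation)
        self.conv2 = CausalConv1d(channels, channels, kernel_size, dilation)
        self.act = nn.ReLU()

    def forward(self, x):
        h = self.act(self.conv1(x))
        h = self.act(self.conv2(h))
        return x + h                                 # residual connection


# Example: one 64-channel block applied to a batch of 8 series of length 128.
x = torch.randn(8, 64, 128)
y = TCNBlock(64, kernel_size=3, dilation=2)(x)       # same shape as x
```

Stacking such blocks with dilations 1, 2, 4, 8, ... grows the receptive field exponentially with depth, which is one common explanation for why convolutional generators can capture long-range temporal correlations without the vanishing-gradient issues the abstract reports for recurrent discriminators.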