FoldFormer: sequence folding and seasonal attention for fine-grained long-term FaaS forecasting

L. N. Darlow, Artjom Joosen, Martin Asenov, Qiwen Deng, Jianfeng Wang, Adam Barker
DOI: 10.1145/3578356.3592582
Published in: Proceedings of the 3rd Workshop on Machine Learning and Systems
Publication date: 2023-05-08

Abstract

Fine-grained long-term (FGLT) time series forecasting is a fundamental challenge in Function as a Service (FaaS) platforms. The data that FaaS function requests produce are fine-grained (per-second/minute), often have daily periodicity, and are persistent over the long term. Forecasting in the FGLT data regime is challenging, and Transformer models can scale poorly for long sequences. We propose FoldFormer, which combines several novel elements (time-to-latent folding, seasonal attention, and convolutions over FFT representations) as a new solution for FGLT forecasting of FaaS function requests. FoldFormer is designed to efficiently consume very fine-grained multi-day data with nearly no additional model, memory, or compute overhead compared to consuming coarse-grained data. We show either state-of-the-art or competitive performance for per-minute function requests on the top 5 most requested functions for three data sources, including two in-house Huawei Cloud sources and Azure 2019. We also show state-of-the-art performance at per-second granularity, a regime that critically limits most other methods.
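The core idea behind sequence folding can be illustrated with a minimal sketch. Assuming a daily period, a long fine-grained series is reshaped into a (days, steps-per-day) grid so that each column holds the same time-of-day across days, which is the axis along which season-aware attention would compare values. The function name and shapes below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def fold_sequence(series: np.ndarray, steps_per_day: int) -> np.ndarray:
    """Fold a 1-D series of length days * steps_per_day into (days, steps_per_day)."""
    assert series.size % steps_per_day == 0, "series must cover whole days"
    return series.reshape(-1, steps_per_day)

# Example: 3 days of per-minute request counts (1440 steps per day).
series = np.arange(3 * 1440, dtype=np.float32)
folded = fold_sequence(series, steps_per_day=1440)
print(folded.shape)  # (3, 1440)

# Column t now holds minute-of-day t for every day, so attending along
# axis 0 compares the same time of day across days.
print(folded[:, 0])  # [0., 1440., 2880.]
```

Folding this way keeps the sequence length seen by attention at the number of days rather than the number of fine-grained steps, which is consistent with the abstract's claim of near-zero extra memory or compute when moving from coarse- to fine-grained inputs.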