{"title":"当小波分解遇到外部关注:轻量级云服务器负载预测模型","authors":"Zhen Zhang, Chen Xu, Jinyu Zhang, Zhe Zhu, Shaohua Xu","doi":"10.1186/s13677-024-00698-6","DOIUrl":null,"url":null,"abstract":"Load prediction tasks aim to predict the dynamic trend of future load based on historical performance sequences, which are crucial for cloud platforms to make timely and reasonable task scheduling. However, existing prediction models are limited while capturing complicated temporal patterns from the load sequences. Besides, the frequently adopted global weighting strategy (e.g., the self-attention mechanism) in temporal modeling schemes has quadratic computational complexity, hindering the immediate response of cloud servers in complex real-time scenarios. To address the above limitations, we propose a Wavelet decomposition-enhanced External Transformer (WETformer) to provide accurate yet efficient load prediction for cloud servers. Specifically, we first incorporate discrete wavelet transform to progressively extract long-term trends, highlighting the intrinsic attributes of temporal sequences. Then, we propose a lightweight multi-head External Attention (EA) mechanism to simultaneously consider the inter-element relationships within load sequences and the correlations across different sequences. Such an external component has linear computational complexity, mitigating the encoding redundancy prevalent and enhancing prediction efficiency. Extensive experiments conducted on Alibaba Cloud’s cluster tracking dataset demonstrate that WETformer achieves superior prediction accuracy and the shortest inference time compared to several state-of-the-art baseline methods.","PeriodicalId":501257,"journal":{"name":"Journal of Cloud Computing","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-08-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"When wavelet decomposition meets external attention: a lightweight cloud server load prediction model\",\"authors\":\"Zhen Zhang, Chen Xu, Jinyu Zhang, Zhe Zhu, Shaohua Xu\",\"doi\":\"10.1186/s13677-024-00698-6\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Load prediction tasks aim to predict the dynamic trend of future load based on historical performance sequences, which are crucial for cloud platforms to make timely and reasonable task scheduling. However, existing prediction models are limited while capturing complicated temporal patterns from the load sequences. Besides, the frequently adopted global weighting strategy (e.g., the self-attention mechanism) in temporal modeling schemes has quadratic computational complexity, hindering the immediate response of cloud servers in complex real-time scenarios. To address the above limitations, we propose a Wavelet decomposition-enhanced External Transformer (WETformer) to provide accurate yet efficient load prediction for cloud servers. Specifically, we first incorporate discrete wavelet transform to progressively extract long-term trends, highlighting the intrinsic attributes of temporal sequences. Then, we propose a lightweight multi-head External Attention (EA) mechanism to simultaneously consider the inter-element relationships within load sequences and the correlations across different sequences. Such an external component has linear computational complexity, mitigating the encoding redundancy prevalent and enhancing prediction efficiency. 
Extensive experiments conducted on Alibaba Cloud’s cluster tracking dataset demonstrate that WETformer achieves superior prediction accuracy and the shortest inference time compared to several state-of-the-art baseline methods.\",\"PeriodicalId\":501257,\"journal\":{\"name\":\"Journal of Cloud Computing\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-08-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Cloud Computing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1186/s13677-024-00698-6\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Cloud Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1186/s13677-024-00698-6","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
When wavelet decomposition meets external attention: a lightweight cloud server load prediction model
Load prediction tasks aim to predict the dynamic trend of future load based on historical performance sequences, a capability that is crucial for cloud platforms to perform timely and reasonable task scheduling. However, existing prediction models are limited in their ability to capture complicated temporal patterns in load sequences. Moreover, the global weighting strategy frequently adopted in temporal modeling schemes (e.g., the self-attention mechanism) has quadratic computational complexity, hindering the immediate response of cloud servers in complex real-time scenarios. To address these limitations, we propose a Wavelet decomposition-enhanced External Transformer (WETformer) to provide accurate yet efficient load prediction for cloud servers. Specifically, we first incorporate the discrete wavelet transform to progressively extract long-term trends, highlighting the intrinsic attributes of temporal sequences. We then propose a lightweight multi-head External Attention (EA) mechanism that simultaneously considers the inter-element relationships within load sequences and the correlations across different sequences. Such an external component has linear computational complexity, mitigating prevalent encoding redundancy and enhancing prediction efficiency. Extensive experiments conducted on Alibaba Cloud's cluster trace dataset demonstrate that WETformer achieves superior prediction accuracy and the shortest inference time compared to several state-of-the-art baseline methods.
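
To make the decomposition step concrete, the following is a minimal sketch of long-term trend extraction via the discrete wavelet transform, assuming the PyWavelets library. The wavelet basis ("db4") and decomposition level are illustrative choices, not details taken from the paper.

```python
# Minimal sketch: DWT-based trend extraction from a load sequence.
# The wavelet basis and level are assumptions, not the paper's settings.
import numpy as np
import pywt

def extract_trend(load_series: np.ndarray, wavelet: str = "db4", level: int = 3) -> np.ndarray:
    """Keep the coarse approximation coefficients, zero the detail bands,
    and reconstruct to obtain a smoothed long-term trend."""
    coeffs = pywt.wavedec(load_series, wavelet, level=level)
    # coeffs[0] is the level-`level` approximation; coeffs[1:] are details.
    coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    trend = pywt.waverec(coeffs, wavelet)
    return trend[: len(load_series)]  # waverec may pad by one sample

# Example: a noisy synthetic CPU-utilization trace
t = np.linspace(0, 10, 512)
cpu = 0.5 + 0.3 * np.sin(0.5 * t) + 0.05 * np.random.randn(512)
trend = extract_trend(cpu)
```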
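The external attention component can be sketched as follows, following Guo et al.'s general formulation of external attention with two small learnable memory units. The dimensions, head count, and double-normalization details below are assumptions about a typical implementation, not the paper's exact configuration.

```python
# Minimal sketch: multi-head external attention (after Guo et al.),
# with illustrative hyperparameters rather than the paper's configuration.
import torch
import torch.nn as nn

class MultiHeadExternalAttention(nn.Module):
    """Queries attend to small learnable memories M_k and M_v shared across
    all samples, giving O(N) cost in sequence length N instead of O(N^2)."""
    def __init__(self, d_model: int = 64, n_heads: int = 4, mem_size: int = 32):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads, self.d_head = n_heads, d_model // n_heads
        self.mk = nn.Linear(self.d_head, mem_size, bias=False)  # external key memory M_k
        self.mv = nn.Linear(mem_size, self.d_head, bias=False)  # external value memory M_v
        self.proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, N, D = x.shape
        q = x.view(B, N, self.n_heads, self.d_head).transpose(1, 2)  # (B, H, N, d_head)
        attn = self.mk(q)                                            # (B, H, N, S)
        attn = attn.softmax(dim=2)                                   # normalize over tokens
        attn = attn / (attn.sum(dim=-1, keepdim=True) + 1e-9)        # l1-norm over memory slots
        out = self.mv(attn)                                          # (B, H, N, d_head)
        out = out.transpose(1, 2).reshape(B, N, D)
        return self.proj(out)

# Shape check on a toy batch of load-sequence embeddings
x = torch.randn(8, 128, 64)   # (batch, sequence length, model dim)
y = MultiHeadExternalAttention()(x)
assert y.shape == x.shape
```

Because the memory size S is a fixed hyperparameter, the attention map has shape N x S rather than the N x N score matrix of self-attention, which is the source of the linear complexity claimed in the abstract.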