AS-ODB: Multivariate Attention Supervised Learning Based Optimized DBN Approach for Cloud Workload Prediction
Authors: G. M. Kiran, A. Aparna Rajesh, D. Basavesha
Journal: Optical Memory and Neural Networks, Vol. 34, No. 3, pp. 389–401 (JCR Q4, Optics; IF 0.8)
Published: 2025-09-17
DOI: 10.3103/S1060992X25700122
URL: https://link.springer.com/article/10.3103/S1060992X25700122
Citations: 0
Abstract
On-demand cloud computing makes it feasible to access a centralized shared pool of computing resources. Accurate estimation of cloud workload is necessary for optimal performance and effective use of cloud computing resources. Because cloud workloads are dynamic and unpredictable, this is a challenging problem. Deep learning, when trained appropriately, can provide a reliable foundation for workload prediction in data centres. In the proposed model, efficient workload prediction is carried out using a novel deep learning approach; efficient management of its hyperparameters can significantly improve the neural network model’s performance. Using the data centre’s workload traces at many consecutive time steps, the suggested approach is shown to be able to estimate Central Processing Unit (CPU) utilization. The model collects raw data retrieved from storage, including the number and type of requests, virtual machine (VM) costs, and resource usage. The data are preprocessed to reveal patterns and oscillations in the workload trace and thereby increase the model’s prediction efficacy. During preprocessing, the KCR approach, min-max normalization, and data cleaning are used to select the important properties from the raw data samples, eliminate noise, and normalize them. A sliding window then converts the multivariate data into a time series suitable for supervised deep learning. Finally, a deep belief network based on green anaconda optimization (GrA-DBN) is used to attain precise workload forecasting. Experimental results show that, compared with existing models, the suggested methodology provides a better trade-off between accuracy and training time, achieving an execution time of 28.5 s and an accuracy of 93.60%. According to the simulation results, the GrA-DBN workload prediction method outperforms other algorithms.
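The two preprocessing steps the abstract names in standard form, min-max normalization and sliding-window conversion to a supervised dataset, can be sketched as follows. This is a minimal illustration of those generic techniques, not the paper's implementation; the window length, feature layout, and the choice of next-step CPU utilization as the target are assumptions for the example.

```python
import numpy as np

def min_max_normalize(x):
    # Scale each feature column to [0, 1] (standard min-max normalization).
    x_min = x.min(axis=0)
    x_max = x.max(axis=0)
    return (x - x_min) / (x_max - x_min + 1e-12)  # epsilon guards constant columns

def sliding_window(series, window=4):
    # Convert a multivariate trace of shape (T, F) into supervised pairs:
    # each X sample holds `window` consecutive steps, y is the next step's
    # first feature (here assumed to be CPU utilization).
    X, y = [], []
    for t in range(len(series) - window):
        X.append(series[t:t + window])
        y.append(series[t + window, 0])
    return np.array(X), np.array(y)

# Toy trace: 10 time steps, 3 features (e.g. requests, VM cost, resource use).
trace = np.arange(30, dtype=float).reshape(10, 3)
norm = min_max_normalize(trace)
X, y = sliding_window(norm, window=4)
print(X.shape, y.shape)  # (6, 4, 3) (6,)
```

With a window of 4 over 10 time steps, 6 supervised samples result; each sample is a (4, 3) block of normalized history that a sequence model (here, the paper's DBN) can be trained on.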
About the journal
The journal covers a wide range of issues in information optics, such as optical memory, mechanisms for optical data recording and processing, photosensitive materials, optical, optoelectronic and holographic nanostructures, and many other related topics. Papers on memory systems using holographic and biological structures and on concepts of brain operation are also included. The journal pays particular attention to research on neural-network systems that may lead to a new generation of computational technologies by endowing them with intelligence.