{"title":"基于散度的云系统服务需求估计方法","authors":"Salvatore Dipietro, G. Casale","doi":"10.1109/FiCloud.2019.00035","DOIUrl":null,"url":null,"abstract":"Estimating performance models parameters of cloud systems presents several challenges due to the distributed nature of the applications, the chains of interactions of requests with architectural nodes, and the parallelism and coordination mechanisms implemented within these systems. In this work, we present a new inference algorithm for model parameters, called state divergence (SD) algorithm, to accurately estimate resource demands in a complex cloud application. Differently from existing approaches, SD attempts to minimize the divergence between observed and modeled marginal state probabilities for individual nodes within an application, therefore requiring the availability of probabilistic measures from both the system and the underpinning model. Validation against a case study using the Apache Cassandra NoSQL database and random experiments show that SD can accurately predict demands and improve system behavior modeling and prediction.","PeriodicalId":268882,"journal":{"name":"2019 7th International Conference on Future Internet of Things and Cloud (FiCloud)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2019-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"SD: A Divergence-Based Estimation Method for Service Demands in Cloud Systems\",\"authors\":\"Salvatore Dipietro, G. Casale\",\"doi\":\"10.1109/FiCloud.2019.00035\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Estimating performance models parameters of cloud systems presents several challenges due to the distributed nature of the applications, the chains of interactions of requests with architectural nodes, and the parallelism and coordination mechanisms implemented within these systems. In this work, we present a new inference algorithm for model parameters, called state divergence (SD) algorithm, to accurately estimate resource demands in a complex cloud application. Differently from existing approaches, SD attempts to minimize the divergence between observed and modeled marginal state probabilities for individual nodes within an application, therefore requiring the availability of probabilistic measures from both the system and the underpinning model. 
Validation against a case study using the Apache Cassandra NoSQL database and random experiments show that SD can accurately predict demands and improve system behavior modeling and prediction.\",\"PeriodicalId\":268882,\"journal\":{\"name\":\"2019 7th International Conference on Future Internet of Things and Cloud (FiCloud)\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-08-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2019 7th International Conference on Future Internet of Things and Cloud (FiCloud)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/FiCloud.2019.00035\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 7th International Conference on Future Internet of Things and Cloud (FiCloud)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/FiCloud.2019.00035","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
SD: A Divergence-Based Estimation Method for Service Demands in Cloud Systems
Estimating performance model parameters of cloud systems presents several challenges due to the distributed nature of the applications, the chains of interactions between requests and architectural nodes, and the parallelism and coordination mechanisms implemented within these systems. In this work, we present a new inference algorithm for model parameters, called the state divergence (SD) algorithm, to accurately estimate resource demands in a complex cloud application. Unlike existing approaches, SD attempts to minimize the divergence between observed and modeled marginal state probabilities for individual nodes within an application, and therefore requires the availability of probabilistic measures from both the system and the underpinning model. Validation against a case study using the Apache Cassandra NoSQL database and random experiments shows that SD can accurately predict demands and improve system behavior modeling and prediction.
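The abstract does not spell out the algorithmic details, but the core idea of matching observed and modeled marginal state probabilities of a node can be illustrated with a minimal sketch. The code below is an assumption-laden illustration, not the paper's method: it models a single node as an M/M/1 queue with a known arrival rate, uses the Kullback-Leibler divergence as the divergence measure, and fits the service demand by minimizing that divergence; all function names, the queueing model, and the choice of divergence are hypothetical.

# Illustrative sketch only: fit a service demand D for a single node by
# minimizing the divergence between an observed queue-length (state)
# distribution and the one predicted by an assumed M/M/1 model.
# The M/M/1 assumption, the KL divergence, and all names are illustrative
# choices, not the SD algorithm as published.
import numpy as np
from scipy.optimize import minimize_scalar

def mm1_state_probs(arrival_rate, demand, n_states):
    """Truncated marginal queue-length distribution of an M/M/1 node."""
    rho = arrival_rate * demand               # utilization, must stay below 1
    n = np.arange(n_states)
    probs = (1.0 - rho) * rho ** n
    return probs / probs.sum()                # renormalize after truncation

def kl_divergence(p_obs, p_model, eps=1e-12):
    """KL(p_obs || p_model), guarding against zero probabilities."""
    p_obs = np.clip(p_obs, eps, None)
    p_model = np.clip(p_model, eps, None)
    return float(np.sum(p_obs * np.log(p_obs / p_model)))

def estimate_demand(p_obs, arrival_rate):
    """Return the demand whose modeled state distribution is closest to p_obs."""
    n_states = len(p_obs)
    upper = (1.0 / arrival_rate) * 0.999      # keep utilization below 1
    result = minimize_scalar(
        lambda d: kl_divergence(p_obs, mm1_state_probs(arrival_rate, d, n_states)),
        bounds=(1e-6, upper), method="bounded")
    return result.x

if __name__ == "__main__":
    true_demand, arrival_rate, n_states = 0.05, 10.0, 30
    # Synthetic "observed" marginal state probabilities generated from the true demand.
    p_obs = mm1_state_probs(arrival_rate, true_demand, n_states)
    print("estimated demand:", estimate_demand(p_obs, arrival_rate))  # ~0.05

In this toy setting the estimate recovers the demand that generated the observed distribution; the paper instead applies the same divergence-minimization idea to the per-node marginal state probabilities of a multi-node cloud application such as a Cassandra cluster.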