Olaf Elzinga, Spiros Koulouzis, Arie Taal, Junchao Wang, Yang Hu, Huan Zhou, Paul Martin, C. D. Laat, Zhiming Zhao
Automatic Collector for Dynamic Cloud Performance Information
2017 International Conference on Networking, Architecture, and Storage (NAS), August 2017
DOI: 10.1109/NAS.2017.8026845
Citations: 8
Abstract
When deploying an application in the cloud, a developer often wants to know which of the wide variety of cloud resources is best to use. Most cloud providers publish only static information about their resources, which is often insufficient: static information does not account for the underlying hardware and software or for the policies applied by the provider. Dynamic benchmarking of cloud resources is therefore needed to find out how a given workload will behave on a given instance. However, benchmarking many cloud resources is a time-consuming process, so a tool that benchmarks them automatically is of great use. To maximize the effectiveness of such a tool, it helps to maintain an up-to-date cloud information catalogue, so that users can share their benchmark results and compare them with those of other users. In this paper, we present the Cloud Performance Collector, a modular cloud benchmarking tool aimed at automatically benchmarking a wide variety of applications. To demonstrate the benefit of the tool, we conducted three experiments with three synthetic benchmark applications and one real-world application on the ExoGENI testbed.
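The abstract describes a modular tool that runs benchmarks automatically and collects the results into a shareable catalogue. A minimal sketch of that idea is shown below; the function names and toy workloads are illustrative assumptions, not the Cloud Performance Collector's actual API.

```python
import time

def cpu_workload():
    # Tiny synthetic CPU benchmark: sum of squares.
    return sum(i * i for i in range(100_000))

def memory_workload():
    # Tiny synthetic memory benchmark: build and discard a large list.
    return len(list(range(100_000)))

# Benchmarks are pluggable functions keyed by name, mirroring the
# tool's modular design at a very small scale.
BENCHMARKS = {"cpu": cpu_workload, "memory": memory_workload}

def run_benchmarks(benchmarks):
    """Run each benchmark, record its wall-clock time in seconds,
    and return a small performance catalogue (name -> runtime)."""
    catalogue = {}
    for name, workload in benchmarks.items():
        start = time.perf_counter()
        workload()
        catalogue[name] = time.perf_counter() - start
    return catalogue

results = run_benchmarks(BENCHMARKS)
for name, seconds in results.items():
    print(f"{name}: {seconds:.4f} s")
```

In the paper's setting, each entry would instead be a full benchmark run on a specific cloud instance type, and the catalogue would be published so users can compare results across providers.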