UPSARA: A Model-Driven Approach for Performance Analysis of Cloud-Hosted Applications

Yogesh D. Barve, Shashank Shekhar, S. Khare, Anirban Bhattacharjee, A. Gokhale

2018 IEEE/ACM 11th International Conference on Utility and Cloud Computing (UCC), December 2018. DOI: 10.1109/UCC.2018.00009
Accurately analyzing the sources of performance anomalies in cloud-based applications is a hard problem, due both to the multi-tenant nature of cloud deployments and to changing application workloads. To that end, many different resource instrumentation and application performance modeling frameworks have been developed in recent years to support effective deployment and resource management decisions. Yet these frameworks differ significantly in their APIs, in their ability to instrument resources at different levels of granularity, and in how the collected information is interpreted, which makes them extremely hard to use effectively. Not addressing these complexities can result in operators supplying incompatible or incorrect configurations, leading to inaccurate diagnosis of performance issues and hence incorrect resource management. To address these challenges, we present UPSARA, a model-driven generative framework that provides extensible, lightweight, and scalable performance monitoring, analysis, and testing for cloud-hosted applications. UPSARA helps alleviate the accidental complexities of configuring the right resource monitoring and performance testing strategies for the underlying instrumentation frameworks in use. We evaluate the effectiveness of UPSARA in the context of representative use cases that highlight its features and benefits.
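As context for the kind of low-level resource instrumentation that such monitoring frameworks wrap behind higher-level configuration, the sketch below is purely illustrative and not drawn from the paper: it assumes the psutil Python library and hypothetical function and field names, and simply samples coarse-grained host-level CPU and memory utilization.

```python
# Illustrative only (not UPSARA's API): sampling host-level utilization
# metrics with psutil, the sort of instrumentation call that monitoring
# frameworks expose at different granularities and behind different APIs.
import psutil


def sample_host_metrics(interval_s=1.0, samples=5):
    """Collect a few host-level CPU and memory utilization samples."""
    readings = []
    for _ in range(samples):
        # cpu_percent blocks for interval_s and returns utilization over that window
        cpu_pct = psutil.cpu_percent(interval=interval_s)
        mem_pct = psutil.virtual_memory().percent
        readings.append({"cpu_percent": cpu_pct, "mem_percent": mem_pct})
    return readings


if __name__ == "__main__":
    for reading in sample_host_metrics():
        print(reading)
```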