{"title":"Towards Automating Representative Load Testing in Continuous Software Engineering","authors":"Henning Schulz, Tobias Angerstein, A. Hoorn","doi":"10.1145/3185768.3186288","DOIUrl":null,"url":null,"abstract":"As an application's performance can significantly impact the user satisfaction and, consequently, the business success, companies need to test performance before delivery. Though load testing allows for testing the performance under representative load by simulating user behavior, it typically entails high maintenance and execution overhead, hindering application in practice. With regard to the trend of continuous software engineering with its parallel and frequently executed delivery pipelines, load testing is even harder to be applied. In this paper, we present our vision of automated, context-specific and low-overhead load testing in continuous software engineering. First, we strive for reducing the maintenance overhead by evolving manual adjustments to generated workload models over a changing environment. Early evaluation results show a seamless evolution over changing user behavior. Building on this, we intend to significantly reduce the execution time and required resources by introducing online-generated load tests that precisely address the relevant context and services under test. Finally, we investigate minimizing the amount of components to be deployed by utilizing load-test-capable performance stubs.","PeriodicalId":10596,"journal":{"name":"Companion of the 2018 ACM/SPEC International Conference on Performance Engineering","volume":"20 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2018-04-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"16","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Companion of the 2018 ACM/SPEC International Conference on Performance Engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3185768.3186288","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 16
Abstract
As an application's performance can significantly impact user satisfaction and, consequently, business success, companies need to test performance before delivery. Although load testing allows for testing performance under representative load by simulating user behavior, it typically entails high maintenance and execution overhead, hindering its application in practice. In continuous software engineering, with its parallel and frequently executed delivery pipelines, load testing is even harder to apply. In this paper, we present our vision of automated, context-specific, and low-overhead load testing in continuous software engineering. First, we strive to reduce the maintenance overhead by evolving manual adjustments to generated workload models as the environment changes. Early evaluation results show a seamless evolution under changing user behavior. Building on this, we intend to significantly reduce the execution time and required resources by introducing online-generated load tests that precisely address the relevant context and services under test. Finally, we investigate minimizing the number of components to be deployed by utilizing load-test-capable performance stubs.
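To make the notion of a generated workload model that simulates user behavior more concrete, the following Python snippet is a minimal illustrative sketch, not the authors' actual approach: it assumes a simple Markov-chain-style workload model whose transition probabilities would be mined from recorded user sessions, and it samples synthetic sessions that a load driver could replay against the system under test. All request names and probabilities below are hypothetical.

```python
import random

# Hypothetical transition probabilities between application requests,
# as they might be extracted from recorded user sessions.
TRANSITIONS = {
    "home":     [("search", 0.7), ("login", 0.2), ("exit", 0.1)],
    "login":    [("home", 0.9), ("exit", 0.1)],
    "search":   [("details", 0.6), ("search", 0.3), ("exit", 0.1)],
    "details":  [("checkout", 0.4), ("search", 0.4), ("exit", 0.2)],
    "checkout": [("exit", 1.0)],
}

def sample_session(start="home"):
    """Sample one synthetic user session by walking the Markov chain until 'exit'."""
    session, state = [start], start
    while state != "exit":
        states, weights = zip(*TRANSITIONS[state])
        state = random.choices(states, weights=weights, k=1)[0]
        if state != "exit":
            session.append(state)
    return session

if __name__ == "__main__":
    # Generate a handful of synthetic sessions a load driver could replay.
    for _ in range(3):
        print(" -> ".join(sample_session()))
```

In such a setup, manual adjustments (e.g., boosting the weight of a newly introduced request type) would have to be re-applied or evolved whenever the model is regenerated from fresh monitoring data, which is the maintenance overhead the abstract refers to.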