{"title":"基于随机优化模型的云计算服务风险与能耗权衡","authors":"Jue Wang, Siqian Shen","doi":"10.1109/UCC.2012.37","DOIUrl":null,"url":null,"abstract":"Energy efficiency and computational reliability are two key concerns associated with modern computations that involve computational resource sharing and highly uncertain job arrivals from various sources (customers). In 2010, large-scale data center operations consume around 2% of the total energy use in the US. Meanwhile, the development of cloud computing in the IT industry possesses great potential for lowering energy consumption by partitioning and scheduling job requests among multiple computational servers. In this paper, we formulate stochastic integer programming models to minimize energy consumption of cloud computing servers over finite time periods, while maintaining a pre-specified quality of service (QoS) level for satisfying uncertain computational requests. The models dynamically monitor and predict customer requests for each period, and proactively switch servers on/off according to estimated customer requests. QoS levels are maintained by either enforcing zero unsatisfied requests, or imposing a joint chance constraint to bound possible failures in a backlogging model. When uncertain requests follow continuous distributions, we employ the Sampling Average Approximation for generating scenario-based requests. Such an approach transforms the original probabilistic model into deterministic mixed-integer linear programs. 
We further demonstrate computational results of all models by testing instances with different parameter combinations, and investigate how backlogging, unit penalty cost and QoS levels influence computational performances and optimal solutions.","PeriodicalId":122639,"journal":{"name":"2012 IEEE Fifth International Conference on Utility and Cloud Computing","volume":"7 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2012-11-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"18","resultStr":"{\"title\":\"Risk and Energy Consumption Tradeoffs in Cloud Computing Service via Stochastic Optimization Models\",\"authors\":\"Jue Wang, Siqian Shen\",\"doi\":\"10.1109/UCC.2012.37\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Energy efficiency and computational reliability are two key concerns associated with modern computations that involve computational resource sharing and highly uncertain job arrivals from various sources (customers). In 2010, large-scale data center operations consume around 2% of the total energy use in the US. Meanwhile, the development of cloud computing in the IT industry possesses great potential for lowering energy consumption by partitioning and scheduling job requests among multiple computational servers. In this paper, we formulate stochastic integer programming models to minimize energy consumption of cloud computing servers over finite time periods, while maintaining a pre-specified quality of service (QoS) level for satisfying uncertain computational requests. The models dynamically monitor and predict customer requests for each period, and proactively switch servers on/off according to estimated customer requests. QoS levels are maintained by either enforcing zero unsatisfied requests, or imposing a joint chance constraint to bound possible failures in a backlogging model. 
When uncertain requests follow continuous distributions, we employ the Sampling Average Approximation for generating scenario-based requests. Such an approach transforms the original probabilistic model into deterministic mixed-integer linear programs. We further demonstrate computational results of all models by testing instances with different parameter combinations, and investigate how backlogging, unit penalty cost and QoS levels influence computational performances and optimal solutions.\",\"PeriodicalId\":122639,\"journal\":{\"name\":\"2012 IEEE Fifth International Conference on Utility and Cloud Computing\",\"volume\":\"7 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2012-11-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"18\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2012 IEEE Fifth International Conference on Utility and Cloud Computing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/UCC.2012.37\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2012 IEEE Fifth International Conference on Utility and Cloud Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/UCC.2012.37","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Risk and Energy Consumption Tradeoffs in Cloud Computing Service via Stochastic Optimization Models
Energy efficiency and computational reliability are two key concerns in modern computing, which involves shared computational resources and highly uncertain job arrivals from diverse sources (customers). In 2010, large-scale data center operations consumed around 2% of total energy use in the US. Meanwhile, the development of cloud computing in the IT industry offers great potential for lowering energy consumption by partitioning and scheduling job requests among multiple computational servers. In this paper, we formulate stochastic integer programming models that minimize the energy consumption of cloud computing servers over a finite time horizon while maintaining a pre-specified quality-of-service (QoS) level for satisfying uncertain computational requests. The models dynamically monitor and predict customer requests in each period, and proactively switch servers on or off according to the estimated requests. QoS is maintained either by enforcing zero unsatisfied requests or by imposing a joint chance constraint that bounds possible failures in a backlogging model. When uncertain requests follow continuous distributions, we employ the Sample Average Approximation (SAA) to generate scenario-based requests; this approach transforms the original probabilistic model into deterministic mixed-integer linear programs. We further demonstrate computational results for all models on test instances with different parameter combinations, and investigate how backlogging, unit penalty cost, and QoS level influence computational performance and optimal solutions.
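The core SAA idea the abstract describes can be illustrated in a minimal single-period sketch: sample request scenarios from the continuous demand distribution, then choose the fewest active servers whose total capacity is exceeded in at most an ε fraction of scenarios (a scenario-counting surrogate for the chance constraint). Everything below — the function name, the uniform demand model, and the per-server capacity — is an illustrative assumption, not the paper's actual multi-period formulation.

```python
import random

def min_servers_saa(demand_sampler, capacity, eps, n_scenarios=1000, seed=0):
    """Sample Average Approximation for a single-period chance constraint:
    find the fewest servers m such that the estimated probability of
    demand exceeding m * capacity is at most eps."""
    rng = random.Random(seed)
    # Generate scenario-based requests from the continuous distribution.
    scenarios = [demand_sampler(rng) for _ in range(n_scenarios)]
    # The chance constraint allows failure in at most eps of the scenarios.
    allowed_failures = int(eps * n_scenarios)
    m = 0
    while sum(d > m * capacity for d in scenarios) > allowed_failures:
        m += 1
    return m

# Hypothetical workload: demand uniform on [50, 150] requests per period,
# each server handles 20 requests, and we tolerate a 5% shortfall risk.
m = min_servers_saa(lambda rng: rng.uniform(50, 150), capacity=20, eps=0.05)
```

In the paper's full models this counting constraint is encoded with binary scenario variables inside a mixed-integer linear program, coupled across periods with server on/off switching decisions; the sketch above only shows how sampling turns the probabilistic requirement into a deterministic one.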