{"title":"利用缓存技术提高智能传输服务的延迟和带宽","authors":"Bouchaib Assila, A. Kobbane","doi":"10.1109/wincom47513.2019.8942419","DOIUrl":null,"url":null,"abstract":"On-demand services such as traffic management and video streaming are typical Intelligent transport systems (ITS) requiring very low latency and high bandwidth running. Vehicular Ad hoc NETworks (VANETs) represent important opportunities to exploit content that virtual service providers (VSPs) offers. The concept of “Cache as a Service (CaaS)” is a promising technique to minimize the average latency while satisfying the QoE requirements of vehicles. In this paper, we mainly consider the problem of optimizing latency in ITS, exploiting a completely virtualized environment and caching feature. In this perspective, virtual service providers (VSPs) and mobile virtual network operators (MVNOs) are connected in the Cloud through network as a service (NaaS), using distributed infrastructure as a service (IaaS). VSPs provide services according to VANETS movement. The VANET, as service requester will take advantage of emerging caching techniques (CaaS) to accomplish the on-demand low-latency services requiring computing resources and bandwidth. Consequently, we propose a many-to-many matching strategy coupled to CaaS caching capabilities on distributed F-RAN between VANET and virtual service provider (VSP). We exploit the deferred acceptance algorithm to solve this game. To highlight the effectiveness of our approach, we applied it on two typical on-demand services requiring ultra-reliability and low-latency communications (uRLLC): The intelligent transport and the video streaming services. The simulation results demonstrate the effectiveness of our approach in terms of improved latency and bandwidth optimization and especially during periods of traffic congestion.","PeriodicalId":222207,"journal":{"name":"2019 International Conference on Wireless Networks and Mobile Communications (WINCOM)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Improving Latency And Bandwidth For Intelligent Transport Services Exploiting Caching Technology\",\"authors\":\"Bouchaib Assila, A. Kobbane\",\"doi\":\"10.1109/wincom47513.2019.8942419\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"On-demand services such as traffic management and video streaming are typical Intelligent transport systems (ITS) requiring very low latency and high bandwidth running. Vehicular Ad hoc NETworks (VANETs) represent important opportunities to exploit content that virtual service providers (VSPs) offers. The concept of “Cache as a Service (CaaS)” is a promising technique to minimize the average latency while satisfying the QoE requirements of vehicles. In this paper, we mainly consider the problem of optimizing latency in ITS, exploiting a completely virtualized environment and caching feature. In this perspective, virtual service providers (VSPs) and mobile virtual network operators (MVNOs) are connected in the Cloud through network as a service (NaaS), using distributed infrastructure as a service (IaaS). VSPs provide services according to VANETS movement. The VANET, as service requester will take advantage of emerging caching techniques (CaaS) to accomplish the on-demand low-latency services requiring computing resources and bandwidth. 
Consequently, we propose a many-to-many matching strategy coupled to CaaS caching capabilities on distributed F-RAN between VANET and virtual service provider (VSP). We exploit the deferred acceptance algorithm to solve this game. To highlight the effectiveness of our approach, we applied it on two typical on-demand services requiring ultra-reliability and low-latency communications (uRLLC): The intelligent transport and the video streaming services. The simulation results demonstrate the effectiveness of our approach in terms of improved latency and bandwidth optimization and especially during periods of traffic congestion.\",\"PeriodicalId\":222207,\"journal\":{\"name\":\"2019 International Conference on Wireless Networks and Mobile Communications (WINCOM)\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2019 International Conference on Wireless Networks and Mobile Communications (WINCOM)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/wincom47513.2019.8942419\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 International Conference on Wireless Networks and Mobile Communications (WINCOM)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/wincom47513.2019.8942419","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
On-demand services such as traffic management and video streaming are typical Intelligent Transport System (ITS) applications requiring very low latency and high bandwidth. Vehicular Ad hoc NETworks (VANETs) represent important opportunities to exploit the content that virtual service providers (VSPs) offer. The concept of "Cache as a Service" (CaaS) is a promising technique for minimizing the average latency while satisfying the Quality of Experience (QoE) requirements of vehicles. In this paper, we consider the problem of optimizing latency in ITS by exploiting a fully virtualized environment and caching. From this perspective, virtual service providers (VSPs) and mobile virtual network operators (MVNOs) are connected in the Cloud through Network as a Service (NaaS), using distributed Infrastructure as a Service (IaaS). VSPs provide services according to VANET movement, and the VANET, as a service requester, takes advantage of emerging caching techniques (CaaS) to deliver on-demand low-latency services that require computing resources and bandwidth. Consequently, we propose a many-to-many matching strategy, coupled with CaaS caching capabilities on a distributed Fog Radio Access Network (F-RAN), between VANETs and virtual service providers (VSPs). We exploit the deferred acceptance algorithm to solve this matching game. To highlight the effectiveness of our approach, we apply it to two typical on-demand services requiring ultra-reliable low-latency communications (URLLC): intelligent transport and video streaming. Simulation results demonstrate the effectiveness of our approach in terms of improved latency and bandwidth optimization, especially during periods of traffic congestion.
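The VANET-to-VSP assignment described in the abstract is a many-to-many matching game solved with the deferred acceptance algorithm. The Python sketch below illustrates a generic, vehicle-proposing many-to-many deferred acceptance routine under assumed inputs (the veh_prefs, vsp_prefs, veh_quota, and vsp_quota dictionaries are hypothetical stand-ins for the preference lists and caching/serving quotas); it is a minimal illustration of the matching mechanism, not the authors' exact utility or caching model.

from collections import defaultdict

def deferred_acceptance_many_to_many(veh_prefs, vsp_prefs, veh_quota, vsp_quota):
    """Many-to-many deferred acceptance (vehicle-proposing).

    veh_prefs : {vehicle: [vsp, ...]}  preference lists, most preferred first
    vsp_prefs : {vsp: [vehicle, ...]}  preference lists, most preferred first
    veh_quota : {vehicle: int}         max VSP/cache slots a vehicle may use
    vsp_quota : {vsp: int}             max vehicles a VSP/cache node may serve
    Returns   : {vehicle: set of matched VSPs}
    """
    # Rank tables for O(1) preference comparisons on the VSP side.
    vsp_rank = {p: {v: r for r, v in enumerate(lst)} for p, lst in vsp_prefs.items()}

    matches_v = defaultdict(set)           # vehicle -> matched VSPs
    matches_p = defaultdict(set)           # VSP     -> matched vehicles
    next_idx = {v: 0 for v in veh_prefs}   # next VSP each vehicle will propose to

    def can_propose(v):
        return (len(matches_v[v]) < veh_quota[v]
                and next_idx[v] < len(veh_prefs[v]))

    stack = list(veh_prefs)                # vehicles that may still propose
    while stack:
        v = stack.pop()
        if not can_propose(v):
            continue
        p = veh_prefs[v][next_idx[v]]      # v proposes to its next-preferred VSP
        next_idx[v] += 1
        stack.append(v)                    # v may propose again in a later round
        if v not in vsp_rank.get(p, {}):   # p does not find v acceptable
            continue
        matches_p[p].add(v)                # tentatively accept
        matches_v[v].add(p)
        if len(matches_p[p]) > vsp_quota[p]:
            # Over quota: p rejects its least-preferred tentative match.
            worst = max(matches_p[p], key=lambda u: vsp_rank[p][u])
            matches_p[p].discard(worst)
            matches_v[worst].discard(p)
            stack.append(worst)            # the rejected vehicle keeps proposing
    return dict(matches_v)

# Toy example: three requesting vehicles, two VSP cache nodes (hypothetical data).
veh_prefs = {"v1": ["vsp1", "vsp2"], "v2": ["vsp1", "vsp2"], "v3": ["vsp2", "vsp1"]}
vsp_prefs = {"vsp1": ["v1", "v3", "v2"], "vsp2": ["v2", "v1", "v3"]}
print(deferred_acceptance_many_to_many(
    veh_prefs, vsp_prefs,
    veh_quota={"v1": 1, "v2": 2, "v3": 1},
    vsp_quota={"vsp1": 2, "vsp2": 1}))

In this toy run, vehicles propose down their preference lists while each VSP tentatively keeps only its most-preferred requesters up to its serving quota, which is the standard requester-proposing variant of deferred acceptance that the paper builds on.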