Towards edge-caching for image recognition
Utsav Drolia, Katherine Guo, Jiaqi Tan, R. Gandhi, P. Narasimhan
2017 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops), 2017-03-13
DOI: 10.1109/PERCOMW.2017.7917629
Citations: 7
Abstract
With the sensors available on mobile devices, and their improved CPU and storage capabilities, users expect their devices to recognize the surrounding environment and to provide relevant information and/or content automatically and immediately. For such classes of real-time applications, user perception of performance is key. To enable a truly seamless experience for the user, responses to requests need to be provided with minimal user-perceived latency. Current state-of-the-art systems for these applications require offloading requests and data to the cloud. This paper proposes an approach that allows users' devices and their onboard applications to leverage resources closer to home, i.e., resources at the edge of the network. We propose to use edge-servers as specialized caches for image-recognition applications. We develop a detailed formula for the expected latency of such a cache that incorporates the effects of recognition algorithms' computation time and accuracy. We show that, counter-intuitively, large cache sizes can lead to higher latencies. To the best of our knowledge, this is the first work that models edge-servers as caches for compute-intensive recognition applications.
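The paper's exact latency formula is not reproduced here, but the trade-off the abstract describes can be illustrated with a minimal sketch in which the time to match a request against the cached set grows with cache size while the hit ratio saturates, so a miss still pays the local lookup cost plus a cloud round trip. All function names, parameters, and values below are hypothetical assumptions for illustration, not taken from the paper.

import math

# Illustrative sketch only -- not the formula from the paper.
# Model: expected latency = lookup/recognition time over n cached items
#        + (probability of a miss) * round-trip time to the cloud.
def expected_latency_ms(n, t_lookup_per_item=0.5, t_cloud=200.0,
                        max_hit_ratio=0.8, k=0.01):
    # Hypothetical assumptions:
    #   - lookup/recognition cost grows linearly with cache size n
    #   - hit ratio saturates: h(n) = max_hit_ratio * (1 - exp(-k * n))
    #   - a miss still pays the local lookup cost, then goes to the cloud
    hit_ratio = max_hit_ratio * (1.0 - math.exp(-k * n))
    return n * t_lookup_per_item + (1.0 - hit_ratio) * t_cloud

# Expected latency first falls as the hit ratio improves, then rises again
# once the growing lookup cost dominates -- the counter-intuitive effect
# the abstract notes for large cache sizes.
for n in (10, 50, 100, 200, 400):
    print(n, round(expected_latency_ms(n), 1))

Under these assumed parameters the printed values fall from roughly 190 ms at n=10 to about 149 ms at n=100, then climb past 240 ms at n=400, showing why a larger cache is not always faster in this kind of model.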