{"title":"海报:扩展边缘推理的深度神经网络优化†","authors":"Bingqian Lu, Jianyi Yang, Shaolei Ren","doi":"10.1109/SEC50012.2020.00025","DOIUrl":null,"url":null,"abstract":"Deep neural networks (DNNs) have been increasingly deployed on and integrated with edge devices, such as mobile phones, drones, robots and wearables. Compared to cloud-based inference, running DNN inference directly on edge devices (a.k. a. edge inference) has major advantages, including being free from the network connection requirement, saving bandwidths, and better protecting user privacy [1].","PeriodicalId":375577,"journal":{"name":"2020 IEEE/ACM Symposium on Edge Computing (SEC)","volume":"4 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Poster: Scaling Up Deep Neural Network optimization for Edge Inference†\",\"authors\":\"Bingqian Lu, Jianyi Yang, Shaolei Ren\",\"doi\":\"10.1109/SEC50012.2020.00025\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Deep neural networks (DNNs) have been increasingly deployed on and integrated with edge devices, such as mobile phones, drones, robots and wearables. Compared to cloud-based inference, running DNN inference directly on edge devices (a.k. a. edge inference) has major advantages, including being free from the network connection requirement, saving bandwidths, and better protecting user privacy [1].\",\"PeriodicalId\":375577,\"journal\":{\"name\":\"2020 IEEE/ACM Symposium on Edge Computing (SEC)\",\"volume\":\"4 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-11-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2020 IEEE/ACM Symposium on Edge Computing (SEC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/SEC50012.2020.00025\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 IEEE/ACM Symposium on Edge Computing (SEC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SEC50012.2020.00025","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Poster: Scaling Up Deep Neural Network Optimization for Edge Inference†
Deep neural networks (DNNs) have been increasingly deployed on and integrated with edge devices, such as mobile phones, drones, robots, and wearables. Compared to cloud-based inference, running DNN inference directly on edge devices (a.k.a. edge inference) has major advantages, including being free from the network connection requirement, saving bandwidth, and better protecting user privacy [1].
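To make the setting concrete, the following is a minimal, hedged sketch of on-device inference using TensorFlow Lite's Python interpreter; the model file name and input data here are placeholders for illustration, not artifacts from the paper.

```python
import numpy as np
import tensorflow as tf

# Load a compressed model exported for edge deployment.
# "model.tflite" is a hypothetical placeholder path.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy input matching the model's expected shape; in practice this would
# be a locally captured sensor reading (image, audio frame, etc.).
input_data = np.random.rand(*input_details[0]["shape"]).astype(np.float32)

# Inference runs entirely on the device: no data leaves it and no
# network connection is required, which is the advantage noted above.
interpreter.set_tensor(input_details[0]["index"], input_data)
interpreter.invoke()
output = interpreter.get_tensor(output_details[0]["index"])
print(output.shape)
```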