Jinhan Hu, Aashiq Shaikh, A. Bahremand, R. Likamwa
{"title":"描述移动设备上的实时密集点云捕获和流","authors":"Jinhan Hu, Aashiq Shaikh, A. Bahremand, R. Likamwa","doi":"10.1145/3477083.3480155","DOIUrl":null,"url":null,"abstract":"Point clouds are a dense compilation of millions of points that can advance content creation and interaction in various emerging applications such as Augmented Reality (AR). However, point clouds consist of per-point real-world spatial and color information that are too computationally intensive to meet real-time specifications, especially on mobile devices. To stream dense point cloud (PtCl) to mobile devices, existing solutions encode pre-captured point clouds, yet with PtCl capturing treated as a separate offline operation. To discover more insights, we combine PtCl capturing and streaming as an entire pipeline and build a research prototype to study the bottlenecks of its real-time usage on mobile devices, consisting of a depth sensor with high precision and resolution, an edge-computing development board, and a smartphone. In a custom Unity app, we monitor the latency of each operation from the capturing to the rendering, as well as the energy efficiency of the board and the smartphone working at different point cloud resolutions. Results reveal that a toolset helping users efficiently capture, stream, and process color and depth data is the key enabler to real-time PtCl capturing and streaming on mobile devices.","PeriodicalId":206784,"journal":{"name":"Proceedings of the 3rd ACM Workshop on Hot Topics in Video Analytics and Intelligent Edges","volume":"44 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-10-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":"{\"title\":\"Characterizing real-time dense point cloud capture and streaming on mobile devices\",\"authors\":\"Jinhan Hu, Aashiq Shaikh, A. Bahremand, R. 
Likamwa\",\"doi\":\"10.1145/3477083.3480155\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Point clouds are a dense compilation of millions of points that can advance content creation and interaction in various emerging applications such as Augmented Reality (AR). However, point clouds consist of per-point real-world spatial and color information that are too computationally intensive to meet real-time specifications, especially on mobile devices. To stream dense point cloud (PtCl) to mobile devices, existing solutions encode pre-captured point clouds, yet with PtCl capturing treated as a separate offline operation. To discover more insights, we combine PtCl capturing and streaming as an entire pipeline and build a research prototype to study the bottlenecks of its real-time usage on mobile devices, consisting of a depth sensor with high precision and resolution, an edge-computing development board, and a smartphone. In a custom Unity app, we monitor the latency of each operation from the capturing to the rendering, as well as the energy efficiency of the board and the smartphone working at different point cloud resolutions. 
Results reveal that a toolset helping users efficiently capture, stream, and process color and depth data is the key enabler to real-time PtCl capturing and streaming on mobile devices.\",\"PeriodicalId\":206784,\"journal\":{\"name\":\"Proceedings of the 3rd ACM Workshop on Hot Topics in Video Analytics and Intelligent Edges\",\"volume\":\"44 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-10-25\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"5\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 3rd ACM Workshop on Hot Topics in Video Analytics and Intelligent Edges\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3477083.3480155\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 3rd ACM Workshop on Hot Topics in Video Analytics and Intelligent Edges","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3477083.3480155","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Characterizing real-time dense point cloud capture and streaming on mobile devices
Point clouds are dense collections of millions of points that can advance content creation and interaction in various emerging applications such as Augmented Reality (AR). However, point clouds carry per-point real-world spatial and color information that is too computationally intensive to process within real-time constraints, especially on mobile devices. To stream dense point clouds (PtCl) to mobile devices, existing solutions encode pre-captured point clouds, treating PtCl capture as a separate offline operation. To gain deeper insight, we combine PtCl capture and streaming into a single pipeline and build a research prototype, consisting of a high-precision, high-resolution depth sensor, an edge-computing development board, and a smartphone, to study the bottlenecks of real-time usage on mobile devices. In a custom Unity app, we monitor the latency of each operation from capture to rendering, as well as the energy efficiency of the board and the smartphone when operating at different point cloud resolutions. Results reveal that a toolset helping users efficiently capture, stream, and process color and depth data is the key enabler of real-time PtCl capture and streaming on mobile devices.
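The abstract describes monitoring per-operation latency across a capture, stream, and render pipeline at different point cloud resolutions. The paper's own instrumentation lives inside a Unity app and is not shown here; the following is only a minimal, hypothetical Python sketch of the general technique, timing each stage of a toy pipeline separately so per-stage bottlenecks can be compared across resolutions. All function names and the stage bodies are stand-ins, not the authors' implementation.

```python
import time

def capture(resolution):
    # Hypothetical stand-in for grabbing one depth+color frame;
    # `resolution` is the number of points per frame.
    return [(i, i, i, 128) for i in range(resolution)]

def stream(points):
    # Stand-in for encoding and transmitting the frame downstream.
    return repr(points).encode()

def render(payload):
    # Stand-in for decoding and drawing the frame on the device.
    return len(payload)

def profile_pipeline(resolution):
    """Time each stage of a capture -> stream -> render pipeline (ms)."""
    timings = {}
    t0 = time.perf_counter()
    points = capture(resolution)
    timings["capture_ms"] = (time.perf_counter() - t0) * 1000

    t1 = time.perf_counter()
    payload = stream(points)
    timings["stream_ms"] = (time.perf_counter() - t1) * 1000

    t2 = time.perf_counter()
    render(payload)
    timings["render_ms"] = (time.perf_counter() - t2) * 1000
    return timings

# Sweep resolutions to see how each stage's latency scales.
for res in (10_000, 100_000):
    print(res, profile_pipeline(res))
```

In a real system each stage would run on different hardware (sensor, edge board, phone), so timestamps would need to be taken on each device and clocks synchronized, rather than measured in one process as above.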