LILOC: Enabling Precise 3D Localization in Dynamic Indoor Environments using LiDARs

Darshana Rathnayake, Meera Radhakrishnan, Inseok Hwang, Archan Misra
{"title":"LILOC:利用激光雷达在动态室内环境中实现精确的3D定位","authors":"Darshana Rathnayake, Meera Radhakrishnan, Inseok Hwang, Archan Misra","doi":"10.1145/3576842.3582364","DOIUrl":null,"url":null,"abstract":"We present LiLoc, a system for precise 3D localization and tracking of mobile IoT devices (e.g., robots) in indoor environments using multi-perspective LiDAR sensing. The key differentiators in our work are: (a) First, unlike traditional localization approaches, our approach is robust to dynamically changing environmental conditions (e.g., varying crowd levels, object placement/layout changes); (b) Second, unlike prior work on visual and 3D SLAM, LiLoc is not dependent on a pre-built static map of the environment and instead works by utilizing dynamically updated point clouds captured from both infrastructural-mounted LiDARs and LiDARs equipped on individual mobile IoT devices. To achieve fine-grained, near real-time location tracking, it employs complex 3D ‘global’ registration among the two point clouds only intermittently to obtain robust spot location estimates and further augments it with repeated simpler ‘local’ registrations to update the trajectory of IoT device continuously. We demonstrate that LiLoc can (a) support accurate location tracking with location and pose estimation error being <=7.4cm and <=3.2° respectively for 84% of the time and the median error increasing only marginally (8%), for correctly estimated trajectories, when the ambient environment is dynamic, (b) achieve a 36% reduction in median location estimation error compared to an approach that uses only quasi-static global point cloud, and (c) obtain spot location estimates with a latency of only 973 msecs.","PeriodicalId":266438,"journal":{"name":"Proceedings of the 8th ACM/IEEE Conference on Internet of Things Design and Implementation","volume":"11 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-05-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"LILOC: Enabling Precise 3D Localization in Dynamic Indoor Environments using LiDARs\",\"authors\":\"Darshana Rathnayake, Meera Radhakrishnan, Inseok Hwang, Archan Misra\",\"doi\":\"10.1145/3576842.3582364\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We present LiLoc, a system for precise 3D localization and tracking of mobile IoT devices (e.g., robots) in indoor environments using multi-perspective LiDAR sensing. The key differentiators in our work are: (a) First, unlike traditional localization approaches, our approach is robust to dynamically changing environmental conditions (e.g., varying crowd levels, object placement/layout changes); (b) Second, unlike prior work on visual and 3D SLAM, LiLoc is not dependent on a pre-built static map of the environment and instead works by utilizing dynamically updated point clouds captured from both infrastructural-mounted LiDARs and LiDARs equipped on individual mobile IoT devices. To achieve fine-grained, near real-time location tracking, it employs complex 3D ‘global’ registration among the two point clouds only intermittently to obtain robust spot location estimates and further augments it with repeated simpler ‘local’ registrations to update the trajectory of IoT device continuously. 
We demonstrate that LiLoc can (a) support accurate location tracking with location and pose estimation error being <=7.4cm and <=3.2° respectively for 84% of the time and the median error increasing only marginally (8%), for correctly estimated trajectories, when the ambient environment is dynamic, (b) achieve a 36% reduction in median location estimation error compared to an approach that uses only quasi-static global point cloud, and (c) obtain spot location estimates with a latency of only 973 msecs.\",\"PeriodicalId\":266438,\"journal\":{\"name\":\"Proceedings of the 8th ACM/IEEE Conference on Internet of Things Design and Implementation\",\"volume\":\"11 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-05-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 8th ACM/IEEE Conference on Internet of Things Design and Implementation\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3576842.3582364\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 8th ACM/IEEE Conference on Internet of Things Design and Implementation","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3576842.3582364","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1

Abstract

We present LiLoc, a system for precise 3D localization and tracking of mobile IoT devices (e.g., robots) in indoor environments using multi-perspective LiDAR sensing. The key differentiators in our work are: (a) First, unlike traditional localization approaches, our approach is robust to dynamically changing environmental conditions (e.g., varying crowd levels, object placement/layout changes); (b) Second, unlike prior work on visual and 3D SLAM, LiLoc is not dependent on a pre-built static map of the environment and instead works by utilizing dynamically updated point clouds captured from both infrastructure-mounted LiDARs and LiDARs equipped on individual mobile IoT devices. To achieve fine-grained, near real-time location tracking, it employs complex 3D 'global' registration between the two point clouds only intermittently to obtain robust spot location estimates, and augments these with repeated, simpler 'local' registrations to continuously update the IoT device's trajectory. We demonstrate that LiLoc can (a) support accurate location tracking, with location and pose estimation errors of ≤7.4 cm and ≤3.2° respectively for 84% of the time and the median error increasing only marginally (8%) for correctly estimated trajectories when the ambient environment is dynamic, (b) achieve a 36% reduction in median location estimation error compared to an approach that uses only a quasi-static global point cloud, and (c) obtain spot location estimates with a latency of only 973 ms.
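
The two-tier registration scheme described above (an occasional, expensive 'global' alignment of the device's scan against the infrastructure LiDARs' point cloud to obtain a spot location fix, interleaved with frequent, cheaper 'local' alignments between consecutive device scans to keep the trajectory current) can be sketched with off-the-shelf point-cloud registration primitives. The sketch below is not LiLoc's implementation: it uses the open-source Open3D library, and the function names, voxel size, and distance thresholds are assumed purely for illustration.

```python
# Illustrative sketch of intermittent global + frequent local point-cloud
# registration (NOT LiLoc's actual pipeline; all parameters are assumed).
import numpy as np
import open3d as o3d

VOXEL = 0.05  # assumed downsampling resolution in metres


def preprocess(cloud):
    """Downsample a point cloud and compute normals + FPFH features."""
    down = cloud.voxel_down_sample(VOXEL)
    down.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=VOXEL * 2, max_nn=30))
    fpfh = o3d.pipelines.registration.compute_fpfh_feature(
        down, o3d.geometry.KDTreeSearchParamHybrid(radius=VOXEL * 5, max_nn=100))
    return down, fpfh


def global_registration(device_scan, infra_cloud):
    """Expensive 'global' alignment: RANSAC over FPFH feature correspondences.
    Returns a 4x4 pose of the device scan in the infrastructure frame."""
    src, src_fpfh = preprocess(device_scan)
    tgt, tgt_fpfh = preprocess(infra_cloud)
    dist = VOXEL * 1.5
    result = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
        src, tgt, src_fpfh, tgt_fpfh, True, dist,
        o3d.pipelines.registration.TransformationEstimationPointToPoint(False), 3,
        [o3d.pipelines.registration.CorrespondenceCheckerBasedOnDistance(dist)],
        o3d.pipelines.registration.RANSACConvergenceCriteria(100000, 0.999))
    return result.transformation


def local_registration(prev_scan, curr_scan, init=None):
    """Cheap 'local' alignment: point-to-plane ICP between consecutive scans.
    Returns the incremental 4x4 transform from curr_scan to prev_scan."""
    if init is None:
        init = np.eye(4)
    prev_scan.estimate_normals(  # point-to-plane ICP needs target normals
        o3d.geometry.KDTreeSearchParamHybrid(radius=VOXEL * 2, max_nn=30))
    result = o3d.pipelines.registration.registration_icp(
        curr_scan, prev_scan, VOXEL * 3, init,
        o3d.pipelines.registration.TransformationEstimationPointToPlane())
    return result.transformation
```

In the scheme the abstract describes, the global step would run only intermittently (it corresponds to the spot location estimates and their reported latency), while the local step would run on every new scan, chaining its incremental transforms onto the most recent global fix so that the device's pose is tracked continuously between fixes.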