A Fast Motion Estimation Method With Hamming Distance for LiDAR Point Cloud Compression

Yuhao An, Yiting Shao, Ge Li, Wei Gao, Shan Liu
{"title":"一种基于汉明距离的激光雷达点云压缩快速运动估计方法","authors":"Yuhao An, Yiting Shao, Ge Li, Wei Gao, Shan Liu","doi":"10.1109/VCIP56404.2022.10008842","DOIUrl":null,"url":null,"abstract":"With more three-dimensional space information, Light detection and ranging (LiDAR) point clouds, which are promising to play more roles in the future, have an urgent need to be efficiently compressed. There are lots of compression methods based on spatial correlations, whereas few studies consider exploiting temporal correlations. In this paper, we propose a different perspective for the motion estimation. In most previous works, geometric distance between matching points was used as the criterion, which has an expensive computational cost and is not accurate. We first propose the Hamming distance between the octree's nodes, instead of the geometric distance between per point which is a more direct criterion. We have implemented our method in the MPEG (Moving Picture Expert Group) Geometry-based PCC (Point Cloud Compression) inter-exploration (G-PCC Inter-EM). Experimental results show our method can provide the average 3.5 % bitrate savings and 92.5 % encoding speed increase in lossless geometric coding, compared to the G-PCC Inter-EM.","PeriodicalId":269379,"journal":{"name":"2022 IEEE International Conference on Visual Communications and Image Processing (VCIP)","volume":"12 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-12-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"A Fast Motion Estimation Method With Hamming Distance for LiDAR Point Cloud Compression\",\"authors\":\"Yuhao An, Yiting Shao, Ge Li, Wei Gao, Shan Liu\",\"doi\":\"10.1109/VCIP56404.2022.10008842\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"With more three-dimensional space information, Light detection and ranging (LiDAR) point clouds, which are promising to play more roles in the future, have an urgent need to be efficiently compressed. There are lots of compression methods based on spatial correlations, whereas few studies consider exploiting temporal correlations. In this paper, we propose a different perspective for the motion estimation. In most previous works, geometric distance between matching points was used as the criterion, which has an expensive computational cost and is not accurate. We first propose the Hamming distance between the octree's nodes, instead of the geometric distance between per point which is a more direct criterion. We have implemented our method in the MPEG (Moving Picture Expert Group) Geometry-based PCC (Point Cloud Compression) inter-exploration (G-PCC Inter-EM). 
Experimental results show our method can provide the average 3.5 % bitrate savings and 92.5 % encoding speed increase in lossless geometric coding, compared to the G-PCC Inter-EM.\",\"PeriodicalId\":269379,\"journal\":{\"name\":\"2022 IEEE International Conference on Visual Communications and Image Processing (VCIP)\",\"volume\":\"12 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-12-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 IEEE International Conference on Visual Communications and Image Processing (VCIP)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/VCIP56404.2022.10008842\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE International Conference on Visual Communications and Image Processing (VCIP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/VCIP56404.2022.10008842","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1

Abstract

Light detection and ranging (LiDAR) point clouds carry rich three-dimensional spatial information and are expected to play an ever larger role in the future, so they urgently need to be compressed efficiently. Many compression methods exploit spatial correlations, whereas few studies consider exploiting temporal correlations. In this paper, we take a different perspective on motion estimation. Most previous works use the geometric distance between matching points as the matching criterion, which is computationally expensive and not accurate. We instead propose the Hamming distance between octree nodes, rather than the per-point geometric distance, as a more direct criterion. We have implemented our method in the MPEG (Moving Picture Experts Group) Geometry-based Point Cloud Compression inter exploration model (G-PCC Inter-EM). Experimental results show that, compared with G-PCC Inter-EM, our method achieves an average bitrate saving of 3.5% and an encoding speed increase of 92.5% in lossless geometry coding.
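
The abstract states the idea only at a high level. The short Python sketch below is an illustration under assumptions, not the authors' G-PCC Inter-EM code: it shows how a Hamming-distance criterion between octree nodes' 8-bit child-occupancy codes could drive a motion search in place of per-point geometric distances. The names hamming8, occupancy_hamming_cost, and search_best_shift, and the (x, y, z)-keyed node dictionaries, are hypothetical and introduced purely for illustration.

# Minimal sketch, not the paper's implementation: score candidate motions by
# the Hamming distance between 8-bit child-occupancy codes of co-located
# octree nodes, instead of summing per-point geometric distances.

def hamming8(a: int, b: int) -> int:
    """Hamming distance between two 8-bit occupancy codes."""
    return bin((a ^ b) & 0xFF).count("1")

def occupancy_hamming_cost(current_nodes, reference_nodes):
    """Total Hamming distance over co-located nodes.

    Both arguments map a node's integer (x, y, z) position at a fixed
    octree depth to its 8-bit occupancy code; absent nodes count as empty.
    """
    keys = set(current_nodes) | set(reference_nodes)
    return sum(hamming8(current_nodes.get(k, 0), reference_nodes.get(k, 0))
               for k in keys)

def search_best_shift(current_nodes, reference_nodes, candidate_shifts):
    """Return the candidate motion shift minimizing the occupancy Hamming cost."""
    best_shift, best_cost = None, float("inf")
    for dx, dy, dz in candidate_shifts:
        shifted = {(x + dx, y + dy, z + dz): code
                   for (x, y, z), code in reference_nodes.items()}
        cost = occupancy_hamming_cost(current_nodes, shifted)
        if cost < best_cost:
            best_shift, best_cost = (dx, dy, dz), cost
    return best_shift, best_cost

Under these assumptions, each node comparison reduces to a single byte XOR and popcount rather than a nearest-neighbor search over many points, which is a plausible source of the reported encoding-speed gain; the exact octree level, search range, and cost aggregation used in G-PCC Inter-EM are not specified in the abstract.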