PointIt3D: a benchmark dataset and baseline for pointed object detection task

Chun-Tse Lin, Hongxin Zhang, Hao Zheng
{"title":"PointIt3D: a benchmark dataset and baseline for pointed object detection task","authors":"Chun-Tse Lin, Hongxin Zhang, Hao Zheng","doi":"10.1117/12.2645330","DOIUrl":null,"url":null,"abstract":"Pointed object detection is of great importance for human-machine interaction, but attempts to solve this task may run into the difficulties of lack of available large scale datasets since people hardly record 3D scenes with a human pointing at specific objects. In efforts to mitigate this gap, we cultivate the first benchmark dataset for this task: PointIt3D (available at https://pan.baidu.com/share/init?surl=E3u96E7dEXnrR1dDris_1w (access code: jps5)), containing 347 scans now and can be easily scaled up to facilitate future utilizations, which is automatically constructed from existing 3D scenes from ScanNet1 and 3D people models using our novel synthetic algorithm that achieves a high acceptable rate of more than 85% according to three experts’ assessments, which hopefully would pave the way for further studies. We also provide a simple yet effective baseline based on anomaly detection and majority voting pointline generation to solve this task based on our dataset, which achieves accuracy of 55.33%, leaving much room for further improvements. 
Code will be released at https://github.com/XHRlyb/PointIt3D.","PeriodicalId":314555,"journal":{"name":"International Conference on Digital Image Processing","volume":"14 1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-10-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Conference on Digital Image Processing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1117/12.2645330","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Pointed object detection is of great importance for human-machine interaction, but solving this task is hampered by the lack of large-scale datasets, since people rarely record 3D scenes in which a human points at a specific object. To mitigate this gap, we present the first benchmark dataset for this task: PointIt3D (available at https://pan.baidu.com/share/init?surl=E3u96E7dEXnrR1dDris_1w, access code: jps5). It currently contains 347 scans and can easily be scaled up for future use. The dataset is constructed automatically from existing 3D scenes in ScanNet [1] and 3D human models using our novel synthesis algorithm, which achieves an acceptance rate of more than 85% according to assessments by three experts and will hopefully pave the way for further studies. We also provide a simple yet effective baseline for this task, based on anomaly detection and majority-voting pointline generation, which achieves an accuracy of 55.33% on our dataset, leaving much room for improvement. Code will be released at https://github.com/XHRlyb/PointIt3D.
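The abstract does not detail the baseline, but its majority-voting step over pointing-ray estimates can be illustrated with a minimal sketch. All names, the ray representation (origin plus direction), and the use of object centroids are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def closest_object(origin, direction, centroids):
    """Index of the object centroid closest to the pointing ray."""
    direction = direction / np.linalg.norm(direction)
    vecs = centroids - origin                  # vectors from ray origin to each centroid
    proj = vecs @ direction                    # signed distance along the ray
    perp = vecs - np.outer(proj, direction)    # perpendicular offset from the ray
    dists = np.linalg.norm(perp, axis=1)
    dists[proj < 0] = np.inf                   # ignore objects behind the pointer
    return int(np.argmin(dists))

def majority_vote(rays, centroids):
    """Each noisy ray estimate votes for an object; the most-voted object wins."""
    votes = [closest_object(o, d, centroids) for o, d in rays]
    return max(set(votes), key=votes.count)

# Hypothetical usage: three noisy ray estimates, all roughly along +x,
# should agree on the object centered at (2, 0, 0).
centroids = np.array([[2.0, 0, 0], [0, 2.0, 0], [0, 0, 2.0]])
rays = [(np.zeros(3), np.array([1.0, 0.1, 0.0])),
        (np.zeros(3), np.array([1.0, -0.1, 0.05])),
        (np.zeros(3), np.array([0.9, 0.0, 0.1]))]
print(majority_vote(rays, centroids))  # → 0
```

Voting across several ray estimates is one plausible way to make the prediction robust to noise in any single estimate of the pointing direction; how the paper actually generates its pointlines may differ.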