3D Positioning System Based on One-handed Thumb Interactions for 3D Annotation Placement

Souichi Tashiro, Hideaki Uchiyama, D. Thomas, R. Taniguchi
{"title":"3D Positioning System Based on One-handed Thumb Interactions for 3D Annotation Placement","authors":"Souichi Tashiro, Hideaki Uchiyama, D. Thomas, R. Taniguchi","doi":"10.1109/VR.2019.8797979","DOIUrl":null,"url":null,"abstract":"This paper presents a 3D positioning system based on one-handed thumb interactions for simple 3D annotation placement with a smart-phone. To place an annotation at a target point in the real environment, the 3D coordinate of the point is computed by interactively selecting the corresponding points in multiple views by users while performing SLAM. Generally, it is difficult for users to precisely select an intended pixel on the touchscreen. Therefore, we propose to compute the 3D coordinate from multiple observations with a robust estimator to have the tolerance to the inaccurate user inputs. In addition, we developed three pixel selection methods based on one-handed thumb interactions. A pixel is selected at the thumb position at a live view in FingAR, the position of a reticle marker at a live view in SnipAR, or that of a movable reticle marker at a freezed view in FreezAR. In the preliminary evaluation, we investigated the 3D positioning accuracy of each method.","PeriodicalId":315935,"journal":{"name":"2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"16 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-03-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/VR.2019.8797979","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

This paper presents a 3D positioning system based on one-handed thumb interactions for simple 3D annotation placement with a smartphone. To place an annotation at a target point in the real environment, the 3D coordinate of the point is computed by having users interactively select the corresponding points in multiple views while SLAM is running. In general, it is difficult for users to precisely select an intended pixel on a touchscreen. We therefore propose to compute the 3D coordinate from multiple observations with a robust estimator so that the system tolerates inaccurate user inputs. In addition, we developed three pixel selection methods based on one-handed thumb interactions: a pixel is selected at the thumb position in a live view (FingAR), at the position of a reticle marker in a live view (SnipAR), or at the position of a movable reticle marker in a frozen view (FreezAR). In a preliminary evaluation, we investigated the 3D positioning accuracy of each method.
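The abstract describes triangulating the tapped point from several views and filtering imprecise thumb taps with a robust estimator, but does not name the estimator. The following is a minimal sketch under assumed choices: a pinhole camera model, world-to-camera poses from SLAM, midpoint triangulation of back-projected rays, and a RANSAC-style inlier search. Function names and the inlier threshold are illustrative, not the authors' implementation.

```python
# Hypothetical sketch: estimate a 3D annotation point from several noisy
# touchscreen picks made in different SLAM keyframes. The paper only states
# that a robust estimator is used over multiple observations; the RANSAC-style
# loop over ray-midpoint triangulation below is an assumed stand-in.
import numpy as np

def backproject_ray(pixel, K, R, t):
    """Return (origin, direction) of the viewing ray in world coordinates.

    pixel : (u, v) tap position, K : 3x3 intrinsics,
    R, t  : world-to-camera rotation and translation from SLAM.
    """
    uv1 = np.array([pixel[0], pixel[1], 1.0])
    d_cam = np.linalg.inv(K) @ uv1               # ray direction in camera frame
    d_world = R.T @ d_cam                        # rotate into world frame
    origin = -R.T @ t                            # camera center in world frame
    return origin, d_world / np.linalg.norm(d_world)

def point_ray_distance(p, ray):
    """Perpendicular distance from point p to the line through the ray."""
    o, d = ray
    return np.linalg.norm((p - o) - np.dot(p - o, d) * d)

def triangulate_midpoint(rays):
    """Least-squares 3D point closest to a set of rays (origin, unit direction)."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in rays:
        P = np.eye(3) - np.outer(d, d)           # projector onto plane normal to d
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

def robust_triangulate(rays, iters=100, thresh=0.02, seed=0):
    """RANSAC-style rejection of rays corrupted by imprecise thumb taps.

    thresh is an assumed inlier distance in meters, chosen for illustration.
    """
    rng = np.random.default_rng(seed)
    best_inliers = []
    for _ in range(iters):
        pair = rng.choice(len(rays), size=2, replace=False)
        candidate = triangulate_midpoint([rays[i] for i in pair])
        inliers = [r for r in rays if point_ray_distance(candidate, r) < thresh]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    return triangulate_midpoint(best_inliers) if len(best_inliers) >= 2 else None
```

In this sketch, each user tap in a view contributes one back-projected ray, and the final annotation position is the consensus point of the inlier rays; taps that miss the intended surface produce rays that fall outside the distance threshold and are discarded.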