3D Finger Rotation Estimation from Fingerprint Images

Yongjie Duan, Jinyang Yu, Jianjiang Feng, Ke He, Jiwen Lu, Jie Zhou
DOI: 10.1145/3626467
Journal: Proceedings of the ACM on Human-Computer Interaction (JCR Q1, Social Sciences)
Published: 2023-10-31 (Journal Article)
Citations: 0

Abstract

Various touch-based interaction techniques have been developed to make interactions on mobile devices more effective, efficient, and intuitive. Finger orientation, in particular, has attracted considerable attention, since it intuitively adds three degrees of freedom (DOF) beyond the two-dimensional (2D) touch point. The mapping of finger orientation can be classified as either absolute or relative, each suited to different interaction applications. However, prior work has explored only absolute orientation. Relative angles can be computed from two estimated absolute orientations, but higher accuracy is expected when relative rotation is predicted directly from the input images. Consequently, in this paper we propose to estimate complete 3D relative finger angles from two fingerprint images, which carry more information at a higher resolution than capacitive images. For algorithm training and evaluation, we constructed a dataset of fingerprint images with corresponding ground-truth 3D relative finger rotation angles. Experimental results on this dataset show that our method outperforms previous approaches based on absolute finger angle models. Further, extensive experiments were conducted to explore the impact of image resolution, finger type, and rotation range on performance. A user study was also conducted to examine the efficiency and precision of 3D relative finger orientation in a 3D object rotation task.
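The baseline the abstract contrasts against — deriving a relative rotation by composing two absolute orientation estimates — amounts to R_rel = R2 · R1ᵀ. A minimal numpy sketch of that composition follows; the Euler-angle convention and the sample angles are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def rot_matrix(yaw, pitch, roll):
    """Z-Y-X Euler angles (radians) to a 3x3 rotation matrix."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]])
    return Rz @ Ry @ Rx

# Two hypothetical absolute finger orientations (radians).
R1 = rot_matrix(0.10, 0.30, -0.05)
R2 = rot_matrix(0.25, 0.40, 0.10)

# Relative rotation taking pose 1 to pose 2 (R1 is orthogonal, so
# its inverse is its transpose).
R_rel = R2 @ R1.T

# Sanity check: applying R_rel to pose 1 reproduces pose 2.
assert np.allclose(R_rel @ R1, R2)
```

Because each absolute estimate carries its own error, the composed R_rel accumulates both; this is the motivation the abstract gives for regressing the relative rotation directly from the image pair instead.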
Source journal: Proceedings of the ACM on Human-Computer Interaction — Social Sciences (miscellaneous). CiteScore: 5.90; self-citation rate: 0.00%; articles published: 257.