{"title":"深度相机检测手指方向的可行性分析","authors":"Sven Mayer, Michael Mayer, N. Henze","doi":"10.1145/3098279.3122125","DOIUrl":null,"url":null,"abstract":"Over the last decade, a body of research investigated enriching touch actions by using finger orientation as an additional input. Beyond new interaction techniques, we envision new user interface elements to make use of the additional input information. We define the fingers orientation by the pitch, roll, and yaw on the touch surface. Determining the finger orientation is not possible using current state-of-the-art devices. As a first step, we built a system that can determine the finger orientation. We developed a working prototype with a depth camera mounted on a tablet. We conducted a study with 12 participants to record ground truth data for the index, middle, ring and little finger to evaluate the accuracy of our prototype using the PointPose [3] algorithm to estimate the pitch and yaw of the finger. By applying 2D linear correction models, we further show a reduction of RMSE by 45.4% for pitch and 21.83% for yaw.","PeriodicalId":120153,"journal":{"name":"Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2017-09-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"13","resultStr":"{\"title\":\"Feasibility analysis of detecting the finger orientation with depth cameras\",\"authors\":\"Sven Mayer, Michael Mayer, N. Henze\",\"doi\":\"10.1145/3098279.3122125\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Over the last decade, a body of research investigated enriching touch actions by using finger orientation as an additional input. Beyond new interaction techniques, we envision new user interface elements to make use of the additional input information. We define the fingers orientation by the pitch, roll, and yaw on the touch surface. Determining the finger orientation is not possible using current state-of-the-art devices. As a first step, we built a system that can determine the finger orientation. We developed a working prototype with a depth camera mounted on a tablet. We conducted a study with 12 participants to record ground truth data for the index, middle, ring and little finger to evaluate the accuracy of our prototype using the PointPose [3] algorithm to estimate the pitch and yaw of the finger. 
By applying 2D linear correction models, we further show a reduction of RMSE by 45.4% for pitch and 21.83% for yaw.\",\"PeriodicalId\":120153,\"journal\":{\"name\":\"Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-09-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"13\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3098279.3122125\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3098279.3122125","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Feasibility analysis of detecting the finger orientation with depth cameras
Over the last decade, a body of research has investigated enriching touch actions by using finger orientation as an additional input. Beyond new interaction techniques, we envision new user interface elements that make use of this additional input information. We define the finger's orientation by its pitch, roll, and yaw relative to the touch surface. Determining the finger orientation is not possible with current state-of-the-art devices. As a first step, we built a system that can determine the finger orientation. We developed a working prototype with a depth camera mounted on a tablet. We conducted a study with 12 participants to record ground truth data for the index, middle, ring, and little fingers and evaluated the accuracy of our prototype, using the PointPose [3] algorithm to estimate the pitch and yaw of the finger. By applying 2D linear correction models, we further show a reduction of RMSE by 45.4% for pitch and 21.83% for yaw.
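To illustrate the kind of 2D linear correction the abstract refers to, the sketch below fits a linear model with an intercept that maps estimated pitch and yaw to ground-truth angles via ordinary least squares and compares the RMSE before and after correction. This is only a minimal illustration under stated assumptions: the data is synthetic and stands in for PointPose estimates, and the model form and coefficients are not taken from the paper.

# Hedged sketch of a 2D linear correction model fit by least squares.
# Synthetic data only; not the study's recordings or coefficients.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic ground-truth finger angles (degrees) and noisy, biased estimates
# standing in for the depth-camera/PointPose output.
true_pitch = rng.uniform(10, 80, 500)
true_yaw = rng.uniform(-60, 60, 500)
est_pitch = 0.8 * true_pitch + 5.0 + rng.normal(0, 4, 500)
est_yaw = 0.9 * true_yaw - 3.0 + rng.normal(0, 6, 500)

def rmse(pred, target):
    return float(np.sqrt(np.mean((pred - target) ** 2)))

# Design matrix with an intercept and both estimated angles:
# corrected = w0 + w1 * est_pitch + w2 * est_yaw  (a 2D linear model).
X = np.column_stack([np.ones_like(est_pitch), est_pitch, est_yaw])

# One least-squares fit per target angle.
w_pitch, *_ = np.linalg.lstsq(X, true_pitch, rcond=None)
w_yaw, *_ = np.linalg.lstsq(X, true_yaw, rcond=None)

corr_pitch = X @ w_pitch
corr_yaw = X @ w_yaw

print("pitch RMSE:", rmse(est_pitch, true_pitch), "->", rmse(corr_pitch, true_pitch))
print("yaw   RMSE:", rmse(est_yaw, true_yaw), "->", rmse(corr_yaw, true_yaw))

In practice such a model would be fit on the ground-truth recordings from the study (e.g., per finger or pooled across fingers) and then applied to new estimates; the synthetic setup above merely shows why a linear fit removes systematic bias and scale errors in the raw angle estimates.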