Practical chirocentric 3DUI platform for immersive environments
C. Papadopoulos, H. Choi, J. Sinha, Kiwon Yun, A. Kaufman, D. Samaras, B. Laha
2015 IEEE Symposium on 3D User Interfaces (3DUI), 2015-03-23. DOI: 10.1109/3DUI.2015.7131722
Chirocentric 3D user interfaces are sometimes hailed as the “holy grail” of human-computer interaction. However, implementations of these UIs can require cumbersome devices (such as tethered wearable datagloves), be limited in functionality, or obscure the algorithms used for hand pose and gesture recognition. These limitations inhibit the design, deployment, and formal evaluation of such interfaces. To ameliorate this situation, we describe the implementation of a practical chirocentric UI platform targeted at immersive virtual environments with infrared tracking systems. Our main contributions are two machine learning techniques for the recognition of hand gestures (trajectories of the user's hands over time) and hand poses (configurations of the user's fingers), based on marker clouds and rigid-body data. We report on the preliminary use of our system to implement a bimanual 3DUI for a large immersive tiled display. We conclude with plans to use our system as a platform for the design and evaluation of bimanual chirocentric UIs, based on the Framework for Interaction Fidelity Analysis (FIFA).
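The abstract does not specify which machine learning techniques the platform uses for pose recognition, so the following is only an illustrative sketch of the general idea: classifying a hand pose from an infrared marker cloud using a rotation- and translation-invariant feature (sorted pairwise marker distances) and a toy nearest-centroid classifier. All names here (`pose_features`, `NearestCentroidPoseClassifier`) are hypothetical and not from the paper.

```python
import numpy as np

def pose_features(markers):
    """Map an (N, 3) cloud of 3D marker positions to a feature vector of
    sorted pairwise distances, which is invariant to rigid motion of the hand."""
    markers = np.asarray(markers, dtype=float)
    diffs = markers[:, None, :] - markers[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    iu = np.triu_indices(len(markers), k=1)  # upper triangle: each pair once
    return np.sort(dists[iu])

class NearestCentroidPoseClassifier:
    """Toy classifier: assign a feature vector to the pose class whose
    mean (centroid) feature vector is closest in Euclidean distance."""

    def fit(self, X, y):
        self.labels_ = sorted(set(y))
        self.centroids_ = {
            label: np.mean([x for x, lbl in zip(X, y) if lbl == label], axis=0)
            for label in self.labels_
        }
        return self

    def predict(self, x):
        return min(self.labels_,
                   key=lambda lbl: np.linalg.norm(x - self.centroids_[lbl]))

# Hypothetical usage: four fingertip markers, spread out ("open") vs clustered ("fist").
open_pose = [[0, 0, 0], [1, 0, 0], [2, 0, 0], [3, 0, 0]]
fist_pose = [[0, 0, 0], [0.1, 0, 0], [0.2, 0, 0], [0.3, 0, 0]]
clf = NearestCentroidPoseClassifier().fit(
    [pose_features(open_pose), pose_features(fist_pose)], ["open", "fist"])
```

A real system would train on many noisy samples per pose and handle variable marker counts and occlusion; this sketch only shows why a rigid-motion-invariant feature makes marker-cloud pose recognition tractable.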