A sliding window approach to natural hand gesture recognition using a custom data glove
Granit Luzhnica, Jörg Simon, E. Lex, Viktoria Pammer-Schindler
2016 IEEE Symposium on 3D User Interfaces (3DUI). DOI: https://doi.org/10.1109/3DUI.2016.7460035
Abstract: This paper explores the recognition of hand gestures based on a data glove equipped with motion, bending and pressure sensors. We selected 31 natural and interaction-oriented hand gestures that can be adopted for general-purpose control of and communication with computing systems. The data glove is custom-built and contains 13 bend sensors, 7 motion sensors, 5 pressure sensors and a magnetometer. We present the data collection experiment as well as the design, selection and evaluation of a classification algorithm. Because we use a sliding window approach to data processing, our algorithm is suitable for stream processing. Algorithm selection and feature engineering resulted in a combination of linear discriminant analysis and logistic regression, with which we achieve an accuracy of over 98.5% in a continuous data stream scenario. When removing the computationally expensive FFT-based features, we still achieve an accuracy of 98.2%.
{"title":"Ray, camera, action! A technique for collaborative 3D manipulation","authors":"W. Lages","doi":"10.1109/3DUI.2016.7460080","DOIUrl":"https://doi.org/10.1109/3DUI.2016.7460080","url":null,"abstract":"We present a technique to support collaborative 3D manipulation. Our approach is based on two or more users jointly specifying the parameters of each transformation using a point, a ray, and a scalar value. We discuss how this concept can be coupled with a camera system to create a scalable technique that can accommodate both parallel and serial collaboration.","PeriodicalId":175060,"journal":{"name":"2016 IEEE Symposium on 3D User Interfaces (3DUI)","volume":"144 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-03-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116369376","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Effect of HMD latency on human stability during quiescent standing on one foot","authors":"S. Kawamura, R. Kijima","doi":"10.1109/3DUI.2016.7460044","DOIUrl":"https://doi.org/10.1109/3DUI.2016.7460044","url":null,"abstract":"Latency, measured from the user's motion to the display output, causes virtual reality cybersickness and decreases task performance. However, the effect of small delay has not been investigated sufficiently. Therefore the purpose of this study was to reveal the effect of a small latency on the subjects. The subjects were asked to stand on the force plate with one foot so that the length of body sway can be measured with several lags ranging from 1 ms to 66 ms. The experiments showed that 1) the sway increased linearly as the latency got longer 2) the HMD with latency of 1 ms also degraded the sense of balance compared to the naked eye with the same limited field of view (FOV) as in HMD, and 3) the difference between the virtual and real worlds' content had an effect on the result of the experiment. From these results, user's stability can be regarded as the direct index of the quality of VR system for the user.","PeriodicalId":175060,"journal":{"name":"2016 IEEE Symposium on 3D User Interfaces (3DUI)","volume":"54 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121681431","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Toward vibrotactile rendering for irregular 2D tactor arrays","authors":"Nicholas G. Lipari, C. Borst","doi":"10.1109/3DUI.2016.7460068","DOIUrl":"https://doi.org/10.1109/3DUI.2016.7460068","url":null,"abstract":"We motivate further study of vibrotactile rendering schemes for the sensation of arbitrary points in irregular grids or meshes, outline a conceptual approach, and propose a study for assessing and comparing approaches. A conceptual model presents the combination of vibrations from multiple elements (tactors) as a two-stage pairing of tactors into virtual tactors, considering the 2D dimensionality. To support irregular triangle meshes, we suggest parameters to characterize triangle shape and a future study to measure sensations for varying shape. Gathered data will be used to assess and compare perceptual combination models and to develop precise rendering functions for irregular triangle meshes.","PeriodicalId":175060,"journal":{"name":"2016 IEEE Symposium on 3D User Interfaces (3DUI)","volume":"226 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115277218","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Smartwatch-assisted robust 6-DOF hand tracker for object manipulation in HMD-based augmented reality","authors":"Hyung-il Kim, Woontack Woo","doi":"10.1109/3DUI.2016.7460065","DOIUrl":"https://doi.org/10.1109/3DUI.2016.7460065","url":null,"abstract":"We introduce a smartwatch assisted sensor fusion approach to robustly track 6-DOF hand movement in head mounted display (HMD) based augmented reality (AR) environment, which can be used for robust 3D object manipulation. Our method uses a wrist-worn smartwatch with HMD-mounted depth sensor to robustly track 3D position and orientation of user's hand. We introduce HMD-based augmented reality platform with smartwatch, and method to accurately calibrate orientation between smartwatch and HMD. We also implement natural 3D object manipulating system using 6-DOF hand tracker with hand grasping detection. Our proposed system is easy to use, and doesn't require any hand held devices.","PeriodicalId":175060,"journal":{"name":"2016 IEEE Symposium on 3D User Interfaces (3DUI)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126308778","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Collaborative 3D manipulation using mobile phones
J. Grandi, Iago U. Berndt, H. Debarba, L. Nedel, Anderson Maciel
2016 IEEE Symposium on 3D User Interfaces (3DUI). DOI: https://doi.org/10.1109/3DUI.2016.7460079
Abstract: We present a 3D user interface for collaborative manipulation of three-dimensional objects in virtual environments. It maps the inertial sensors, touch screen, and physical buttons of a mobile phone onto well-known gestures that alter the position, rotation, and scale of virtual objects. As these transformations require the control of multiple degrees of freedom (DOFs), collaboration is proposed as a solution to coordinate the modification of each and all of the available DOFs. Users are free to decide their own manipulation roles. All virtual elements are displayed on a single shared screen, which is convenient for gathering multiple users in the same physical space.