Remote Gaze and Gesture Tracking on the Microsoft Kinect: Investigating the Role of Feedback

M. Carter, Joshua Newn, Eduardo Velloso, F. Vetere

Proceedings of the Annual Meeting of the Australian Special Interest Group for Computer Human Interaction, 7 December 2015. DOI: 10.1145/2838739.2838778
In this paper we present the results of a user experience and preference study of combined gaze and gesture input in a lounge-style remote interaction, using a novel system that tracks both gaze and gesture with only a Kinect device at a distance of 2 m from the user. Our results indicate exciting opportunities for gaze-tracking interfaces built on existing technologies, but suggest that findings from studies of highly accurate gaze systems may not transfer to these real-world settings, where gaze tracking is inherently less accurate. Based on these limitations, we contribute a series of design recommendations for gaze and gesture interfaces in this context.