{"title":"面向残疾人的基于注视的移动设备交互","authors":"Anton Wachner, Janick Edinger, Christian Becker","doi":"10.1109/PERCOMW.2018.8480159","DOIUrl":null,"url":null,"abstract":"Manual input is the dominant way to interact with mobile devices in everyday life. Innovations like multi-touch displays and virtual keyboards that allow user input by sliding a finger over the letters make this input more and more convenient. However, these systems assume that users have full control over their hand movements. This assumption excludes millions of people who suffer from loss of extremities, paralyses, or spasticities. In this paper, we motivate the urgent need for input mechanisms that make these technologies available for everybody. We focus on gaze-based interaction as eye movements can be tracked by built-in cameras of unmodified mobile devices without any further hardware requirements. As accurate point of gaze estimation on mobile devices is nontrivial, we propose a middleware that is not based on the absolute position of the pupils but rather takes eye movement patterns into account. This paper has four contributions. First, we motivate the need for alternative inputs methods besides manual input. Second, we discuss existing approaches and reveal their limitations. Third, we propose a middleware that allows gaze-based user interaction with mobile devices. Fourth, we provide a framework that allows developers to easily integrate gaze-based control into mobile applications.","PeriodicalId":190096,"journal":{"name":"2018 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops)","volume":"43 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-03-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Towards Gaze-Based Mobile Device Interaction for the Disabled\",\"authors\":\"Anton Wachner, Janick Edinger, Christian Becker\",\"doi\":\"10.1109/PERCOMW.2018.8480159\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Manual input is the dominant way to interact with mobile devices in everyday life. Innovations like multi-touch displays and virtual keyboards that allow user input by sliding a finger over the letters make this input more and more convenient. However, these systems assume that users have full control over their hand movements. This assumption excludes millions of people who suffer from loss of extremities, paralyses, or spasticities. In this paper, we motivate the urgent need for input mechanisms that make these technologies available for everybody. We focus on gaze-based interaction as eye movements can be tracked by built-in cameras of unmodified mobile devices without any further hardware requirements. As accurate point of gaze estimation on mobile devices is nontrivial, we propose a middleware that is not based on the absolute position of the pupils but rather takes eye movement patterns into account. This paper has four contributions. First, we motivate the need for alternative inputs methods besides manual input. Second, we discuss existing approaches and reveal their limitations. Third, we propose a middleware that allows gaze-based user interaction with mobile devices. 
Fourth, we provide a framework that allows developers to easily integrate gaze-based control into mobile applications.\",\"PeriodicalId\":190096,\"journal\":{\"name\":\"2018 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops)\",\"volume\":\"43 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-03-19\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2018 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/PERCOMW.2018.8480159\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/PERCOMW.2018.8480159","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Manual input is the dominant way to interact with mobile devices in everyday life. Innovations such as multi-touch displays and virtual keyboards that let users type by sliding a finger over the letters make this input increasingly convenient. However, these systems assume that users have full control over their hand movements. This assumption excludes millions of people who suffer from loss of extremities, paralysis, or spasticity. In this paper, we motivate the urgent need for input mechanisms that make these technologies available to everybody. We focus on gaze-based interaction, as eye movements can be tracked by the built-in cameras of unmodified mobile devices without any additional hardware. Because accurate point-of-gaze estimation on mobile devices is nontrivial, we propose a middleware that is not based on the absolute position of the pupils but instead takes eye movement patterns into account. This paper makes four contributions. First, we motivate the need for alternative input methods besides manual input. Second, we discuss existing approaches and reveal their limitations. Third, we propose a middleware that allows gaze-based user interaction with mobile devices. Fourth, we provide a framework that allows developers to easily integrate gaze-based control into mobile applications.
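The central idea stated in the abstract is to interpret relative eye-movement patterns rather than estimate an absolute on-screen point of gaze. The sketch below is a minimal, hypothetical illustration of that idea only; the function name, thresholds, and gesture labels are assumptions for illustration and do not represent the authors' actual middleware or framework API.

```python
# Illustrative sketch (not the paper's API): map relative eye-movement
# patterns to discrete commands instead of computing an absolute gaze point.
from typing import List, Tuple

def classify_gaze_gesture(pupil_offsets: List[Tuple[float, float]],
                          threshold: float = 0.15) -> str:
    """Classify a short sequence of normalized pupil offsets (relative to a
    neutral, straight-ahead baseline) into a coarse directional gesture."""
    if len(pupil_offsets) < 2:
        return "none"

    # Net displacement over the observation window; no screen coordinates
    # or calibration are involved, only relative movement.
    dx = pupil_offsets[-1][0] - pupil_offsets[0][0]
    dy = pupil_offsets[-1][1] - pupil_offsets[0][1]

    if abs(dx) < threshold and abs(dy) < threshold:
        return "dwell"  # eyes roughly still: could be mapped to a "select"
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

# Example: the pupils drift rightward across a few camera frames.
frames = [(0.00, 0.01), (0.08, 0.02), (0.21, 0.03)]
print(classify_gaze_gesture(frames))  # -> "right"
```

Such pattern-based classification tolerates the coarse, noisy pupil positions obtainable from an unmodified front-facing camera, which is why the abstract argues for it over absolute point-of-gaze estimation.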