{"title":"EyePhone: activating mobile phones with your eyes","authors":"E. Miluzzo, Tianyu Wang, A. Campbell","doi":"10.1145/1851322.1851328","DOIUrl":null,"url":null,"abstract":"As smartphones evolve researchers are studying new techniques to ease the human-mobile interaction. We propose EyePhone, a novel \"hand-free\" interfacing system capable of driving mobile applications/functions using only the user's eyes movement and actions (e.g., wink). EyePhone tracks the user's eye movement across the phone's display using the camera mounted on the front of the phone; more specifically, machine learning algorithms are used to: i) track the eye and infer its position on the mobile phone display as a user views a particular application; and ii) detect eye blinks that emulate mouse clicks to activate the target application under view. We present a prototype implementation of EyePhone on a Nokia N810, which is capable of tracking the position of the eye on the display, mapping this positions to an application that is activated by a wink. At no time does the user have to physically touch the phone display.","PeriodicalId":219018,"journal":{"name":"Networking, Systems, and Applications for Mobile Handhelds","volume":"135 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2010-08-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"148","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Networking, Systems, and Applications for Mobile Handhelds","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/1851322.1851328","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 148
Abstract
As smartphones evolve, researchers are studying new techniques to ease human-mobile interaction. We propose EyePhone, a novel "hand-free" interfacing system capable of driving mobile applications/functions using only the user's eye movements and actions (e.g., a wink). EyePhone tracks the user's eye movement across the phone's display using the camera mounted on the front of the phone; more specifically, machine learning algorithms are used to: i) track the eye and infer its position on the mobile phone display as the user views a particular application; and ii) detect eye blinks that emulate mouse clicks to activate the target application under view. We present a prototype implementation of EyePhone on a Nokia N810 that is capable of tracking the position of the eye on the display and mapping this position to an application, which is then activated by a wink. At no time does the user have to physically touch the phone display.
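The abstract describes a two-stage pipeline: a front-camera eye tracker that maps the detected eye position to a region of the display (one region per application), and a blink/wink detector that acts as a mouse click on the region currently under view. The sketch below is only a minimal illustration of that idea under stated assumptions; it is not the authors' EyePhone implementation. It assumes an OpenCV Haar cascade as a stand-in for the paper's learned eye tracker, a hypothetical 3x3 grid of application tiles on the display, and a simple "eye missing for a few frames" heuristic as the wink detector.

```python
# Illustrative sketch only (not the EyePhone implementation): map the detected
# eye position from the front camera to a 3x3 grid of app "tiles" on the
# display, and treat a brief disappearance of the eye (a blink/wink) as a click.
import cv2

# Assumption: a stock OpenCV Haar cascade stands in for the paper's learned
# eye template matcher.
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

GRID_ROWS, GRID_COLS = 3, 3   # display divided into a hypothetical 3x3 app grid
BLINK_FRAMES = 3              # consecutive frames with no eye => treat as a wink


def tile_for(eye_center, frame_w, frame_h):
    """Map an eye center (x, y) in camera coordinates to a display tile."""
    col = min(int(eye_center[0] / frame_w * GRID_COLS), GRID_COLS - 1)
    row = min(int(eye_center[1] / frame_h * GRID_ROWS), GRID_ROWS - 1)
    return row, col


def run(camera_index=0):
    cap = cv2.VideoCapture(camera_index)  # front-facing camera
    missed, last_tile = 0, None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        eyes = eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(eyes) > 0:
            x, y, w, h = eyes[0]
            last_tile = tile_for((x + w / 2, y + h / 2),
                                 frame.shape[1], frame.shape[0])
            missed = 0
        else:
            missed += 1
            if missed == BLINK_FRAMES and last_tile is not None:
                # Hypothetical hook: launch whatever app occupies this tile.
                print(f"wink detected -> activate app at tile {last_tile}")
    cap.release()


if __name__ == "__main__":
    run()
```

The real system runs entirely on the phone and uses learned eye templates rather than a generic cascade; the sketch only conveys the position-to-tile mapping and the click-by-blink idea described in the abstract.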