{"title":"基于投影的智能家居环境用户界面","authors":"Chin-Yang Lin, Yi-Bin Lin","doi":"10.1109/COMPSACW.2013.117","DOIUrl":null,"url":null,"abstract":"Nowadays, technology has made it relatively easy to build a smart home environment. To better interact with the devices and services available in such a smart environment, the user interface plays an important role. In view of this, this paper presents a vision and projection-based user interface that allows the user to remotely control the devices through the movement of fingertips over a projected surface, thereby offering a more natural and intuitive way of user interaction. To realize the proposed bare hand interface, a real-time fingertips detection method and a fingertips tracking method are proposed, where the fingertips detection is accomplished by applying computer vision techniques to the images acquired by a camera. In addition, we propose a gesture recognition model to enable a mapping from the position of fingertips to the high level user actions and events, so that several common gestures (e.g. tap/click and drag) can be correctly detected. To demonstrate the feasibility of the proposed approach, a proof-of-concept experiment, a prototype of the projection-based user interface, is conducted and presented based on a typical smart home scenario.","PeriodicalId":152957,"journal":{"name":"2013 IEEE 37th Annual Computer Software and Applications Conference Workshops","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2013-07-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":"{\"title\":\"Projection-Based User Interface for Smart Home Environments\",\"authors\":\"Chin-Yang Lin, Yi-Bin Lin\",\"doi\":\"10.1109/COMPSACW.2013.117\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Nowadays, technology has made it relatively easy to build a smart home environment. To better interact with the devices and services available in such a smart environment, the user interface plays an important role. In view of this, this paper presents a vision and projection-based user interface that allows the user to remotely control the devices through the movement of fingertips over a projected surface, thereby offering a more natural and intuitive way of user interaction. To realize the proposed bare hand interface, a real-time fingertips detection method and a fingertips tracking method are proposed, where the fingertips detection is accomplished by applying computer vision techniques to the images acquired by a camera. In addition, we propose a gesture recognition model to enable a mapping from the position of fingertips to the high level user actions and events, so that several common gestures (e.g. tap/click and drag) can be correctly detected. 
To demonstrate the feasibility of the proposed approach, a proof-of-concept experiment, a prototype of the projection-based user interface, is conducted and presented based on a typical smart home scenario.\",\"PeriodicalId\":152957,\"journal\":{\"name\":\"2013 IEEE 37th Annual Computer Software and Applications Conference Workshops\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2013-07-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"9\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2013 IEEE 37th Annual Computer Software and Applications Conference Workshops\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/COMPSACW.2013.117\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2013 IEEE 37th Annual Computer Software and Applications Conference Workshops","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/COMPSACW.2013.117","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract: Nowadays, technology has made it relatively easy to build a smart home environment. The user interface plays an important role in how well people can interact with the devices and services available in such an environment. In view of this, this paper presents a vision- and projection-based user interface that allows the user to remotely control devices through the movement of fingertips over a projected surface, offering a more natural and intuitive way of interaction. To realize the proposed bare-hand interface, a real-time fingertip detection method and a fingertip tracking method are proposed, where fingertip detection is accomplished by applying computer vision techniques to the images acquired by a camera. In addition, we propose a gesture recognition model that maps fingertip positions to high-level user actions and events, so that several common gestures (e.g., tap/click and drag) can be correctly detected. To demonstrate the feasibility of the proposed approach, a proof-of-concept prototype of the projection-based user interface is built and evaluated in a typical smart home scenario.
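
The abstract does not disclose the exact detection pipeline, but a minimal sketch of camera-based fingertip detection can be given under common assumptions: skin-colour thresholding in YCrCb space followed by convex-hull analysis of the largest contour. All thresholds below are illustrative assumptions, not values from the paper, and the code assumes the OpenCV 4.x Python API.

```python
# Illustrative sketch only: skin-colour segmentation + convex hull is one common
# way to find fingertip candidates; the paper's actual method is not specified.
import cv2
import numpy as np

def detect_fingertips(frame_bgr):
    """Return a list of (x, y) fingertip candidates found in a BGR camera frame."""
    # Rough skin segmentation in YCrCb space (assumed bounds, lighting-dependent).
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, (0, 135, 85), (255, 180, 135))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    # The largest contour is assumed to be the hand (OpenCV 4.x return signature).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return []
    hand = max(contours, key=cv2.contourArea)

    # Convex-hull points far from the palm centre are kept as fingertip candidates.
    hull = cv2.convexHull(hand)
    moments = cv2.moments(hand)
    if moments["m00"] == 0:
        return []
    cx, cy = moments["m10"] / moments["m00"], moments["m01"] / moments["m00"]

    tips = []
    for x, y in hull[:, 0, :]:
        if np.hypot(x - cx, y - cy) > 0.6 * np.sqrt(cv2.contourArea(hand)):
            tips.append((int(x), int(y)))
    return tips
```

Likewise, the mapping from tracked fingertip positions to tap and drag events can be sketched as a small per-frame state machine. The class, event names, and thresholds below are hypothetical and stand in for the paper's gesture recognition model, which is not detailed in the abstract.

```python
from dataclasses import dataclass

TAP_MAX_FRAMES = 8      # assumed: a touch shorter than this counts as a tap/click
DRAG_MIN_PIXELS = 12.0  # assumed: movement beyond this while touching starts a drag

@dataclass
class GestureEvent:
    kind: str            # "tap", "drag_start", "drag_move", or "drag_end"
    position: tuple

class GestureRecognizer:
    """Map per-frame fingertip observations to high-level tap/drag events."""

    def __init__(self):
        self.down_pos = None
        self.frames_down = 0
        self.dragging = False

    def update(self, fingertip):
        """fingertip is an (x, y) tuple while touching the projected surface, else None."""
        events = []
        if fingertip is not None:
            if self.down_pos is None:                      # touch just started
                self.down_pos, self.frames_down = fingertip, 0
            else:
                self.frames_down += 1
                dx = fingertip[0] - self.down_pos[0]
                dy = fingertip[1] - self.down_pos[1]
                if not self.dragging and (dx * dx + dy * dy) ** 0.5 > DRAG_MIN_PIXELS:
                    self.dragging = True
                    events.append(GestureEvent("drag_start", self.down_pos))
                if self.dragging:
                    events.append(GestureEvent("drag_move", fingertip))
        else:
            if self.down_pos is not None:                  # touch just ended
                if self.dragging:
                    events.append(GestureEvent("drag_end", self.down_pos))
                elif self.frames_down <= TAP_MAX_FRAMES:
                    events.append(GestureEvent("tap", self.down_pos))
            self.down_pos, self.frames_down, self.dragging = None, 0, False
        return events
```

In use, the recognizer would be fed one observation per camera frame (the detected fingertip position, or None when no touch is detected), and the emitted events would be forwarded to the smart home controller, e.g. a "tap" on a projected button toggling the corresponding device.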