{"title":"在普通鼠标使用中添加手势:改进人机交互的新输入方式","authors":"L. Lombardi, M. Porta","doi":"10.1109/ICIAP.2007.20","DOIUrl":null,"url":null,"abstract":"Although the way we interact with computers is substantially the same since twenty years based on keyboard, mouse and window metaphor-machine perception could be usefully exploited to enhance the human-computer communication process. In this paper, we present a vision-based user interface where plain, static hand gestures performed nearby the mouse are interpreted as specific input commands. Our tests demonstrate that this new input modality does not interfere with ordinary mouse use and can speed up task execution, while not requiring too much attention from the user.","PeriodicalId":118466,"journal":{"name":"14th International Conference on Image Analysis and Processing (ICIAP 2007)","volume":"19 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2007-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":"{\"title\":\"Adding Gestures to Ordinary Mouse Use: a New Input Modality for Improved Human-Computer Interaction\",\"authors\":\"L. Lombardi, M. Porta\",\"doi\":\"10.1109/ICIAP.2007.20\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Although the way we interact with computers is substantially the same since twenty years based on keyboard, mouse and window metaphor-machine perception could be usefully exploited to enhance the human-computer communication process. In this paper, we present a vision-based user interface where plain, static hand gestures performed nearby the mouse are interpreted as specific input commands. 
Our tests demonstrate that this new input modality does not interfere with ordinary mouse use and can speed up task execution, while not requiring too much attention from the user.\",\"PeriodicalId\":118466,\"journal\":{\"name\":\"14th International Conference on Image Analysis and Processing (ICIAP 2007)\",\"volume\":\"19 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2007-09-10\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"9\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"14th International Conference on Image Analysis and Processing (ICIAP 2007)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICIAP.2007.20\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"14th International Conference on Image Analysis and Processing (ICIAP 2007)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICIAP.2007.20","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Adding Gestures to Ordinary Mouse Use: a New Input Modality for Improved Human-Computer Interaction
Although the way we interact with computers has remained substantially the same for twenty years, based on the keyboard, the mouse, and the window metaphor, machine perception could be usefully exploited to enhance the human-computer communication process. In this paper, we present a vision-based user interface in which plain, static hand gestures performed near the mouse are interpreted as specific input commands. Our tests demonstrate that this new input modality does not interfere with ordinary mouse use and can speed up task execution, while not requiring excessive attention from the user.
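The abstract describes mapping recognized static gestures to specific input commands alongside normal mouse use. The idea can be sketched as a simple dispatcher; note that the gesture names and command bindings below are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch: dispatching recognized static hand gestures to input
# commands, in the spirit of the interface described in the abstract.
# Gesture labels and command bindings here are assumptions for illustration.

GESTURE_COMMANDS = {
    "open_palm": "copy",
    "fist": "paste",
    "two_fingers": "undo",
}

def dispatch(gesture: str):
    """Map a recognized static gesture to an input command.

    Returns None for unrecognized gestures, so that spurious
    detections never interfere with ordinary mouse use.
    """
    return GESTURE_COMMANDS.get(gesture)
```

Returning None for unknown gestures reflects the abstract's claim that the modality must not disturb ordinary mouse interaction: only confidently recognized gestures trigger a command.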