{"title":"Zero-shot Learning on Gesture Movement for Interactive Dashboard Control","authors":"Wen Lin Yong, Y. Tew, J. Chaw","doi":"10.1109/ISPACS57703.2022.10082836","DOIUrl":null,"url":null,"abstract":"Human-computer interaction (HCI), is always the mainstream in computer technology that concentrates on the communication between humans and computer. Gesture-based HCI, which sounded so modern and zippy at the time, sounds retro now. Although the idea of gesture based HCI is nothing new, this topic is still in vogue. HCI studies continually emphasize the user experience especially when it is implemented in a real-world environment. As known that every individual acts differently and more uncontrollable environmental variables might affect the performance to detect and react to the gesture performed. Even though there are many solutions and datasets proposed in the market, not each of them perfectly fitted to our needs. Hence, to propose a more tailored made gesture detection for own use, the existing zeroshot learning model will be tested on the gesture dataset introduced in this work to fine tune to own needs. The result shows that our proposed I2Hub dataset has higher accuracy compared to EgoGesture dataset (~1.01), but the elapsed time takes longer due to the higher average number of videos in each gesture action.","PeriodicalId":410603,"journal":{"name":"2022 International Symposium on Intelligent Signal Processing and Communication Systems (ISPACS)","volume":"7 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-11-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 International Symposium on Intelligent Signal Processing and Communication Systems (ISPACS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISPACS57703.2022.10082836","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Human-computer interaction (HCI) has long been a mainstream area of computer technology that concentrates on communication between humans and computers. Gesture-based HCI, which once sounded modern and zippy, may sound retro now; yet although the idea is nothing new, the topic remains in vogue. HCI studies continually emphasize the user experience, especially when a system is deployed in a real-world environment. Every individual acts differently, and uncontrollable environmental variables can affect how well a system detects and reacts to the gesture performed. Although many solutions and datasets have been proposed, not all of them fit our needs. Hence, to build gesture detection tailored to our own use, an existing zero-shot learning model is tested on the gesture dataset introduced in this work and fine-tuned to our needs. The results show that our proposed I2Hub dataset achieves higher accuracy than the EgoGesture dataset (~1.01), but the elapsed time is longer due to the higher average number of videos per gesture action.
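To illustrate the kind of zero-shot evaluation the abstract describes, below is a minimal sketch of embedding-based zero-shot classification: a linear map from visual features to a semantic label space, with unseen gesture classes recognized by nearest label embedding. All names, dimensions, and the ridge-regression mapping are illustrative assumptions, not the paper's actual pipeline or the I2Hub/EgoGesture feature extractors.

```python
# Hypothetical zero-shot gesture classification sketch (not the paper's method).
# Assumes precomputed per-clip video features and semantic label embeddings.
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions: 256-d video features, 50-d semantic label embeddings.
n_train, feat_dim, sem_dim = 500, 256, 50
video_features = rng.normal(size=(n_train, feat_dim))          # one row per training clip
train_label_emb = rng.normal(size=(n_train, sem_dim))          # embedding of each clip's label

# Learn a linear map W from visual space to semantic space via ridge regression:
# W = (X^T X + lam I)^{-1} X^T Y
lam = 1.0
W = np.linalg.solve(
    video_features.T @ video_features + lam * np.eye(feat_dim),
    video_features.T @ train_label_emb,
)  # shape: (feat_dim, sem_dim)

# Zero-shot inference: project an unseen clip into semantic space and pick
# the nearest label embedding among classes never seen during training.
unseen_label_emb = rng.normal(size=(10, sem_dim))              # 10 unseen gesture classes
test_clip = rng.normal(size=(feat_dim,))

proj = test_clip @ W
# Cosine similarity between the projected clip and each unseen class embedding.
sims = (unseen_label_emb @ proj) / (
    np.linalg.norm(unseen_label_emb, axis=1) * np.linalg.norm(proj) + 1e-12
)
predicted_class = int(np.argmax(sims))
print("predicted unseen gesture class:", predicted_class)
```

In a real evaluation like the one reported, accuracy would be averaged over held-out gesture classes and elapsed time would scale with the number of videos per gesture action, which is consistent with the trade-off the abstract reports between I2Hub and EgoGesture.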