GAZEploit: Remote Keystroke Inference Attack by Gaze Estimation from Avatar Views in VR/MR Devices
Hanqiu Wang, Zihao Zhan, Haoqi Shan, Siqi Dai, Max Panoff, Shuo Wang
arXiv - CS - Human-Computer Interaction · Published 2024-09-12 · DOI: arxiv-2409.08122 (https://doi.org/arxiv-2409.08122)
Citations: 0
Abstract
The advent and growing popularity of Virtual Reality (VR) and Mixed Reality (MR) solutions have revolutionized the way we interact with digital platforms. Cutting-edge gaze-controlled typing methods, now prevalent in high-end models of these devices, e.g., the Apple Vision Pro, have not only improved the user experience but also mitigated traditional keystroke inference attacks that relied on hand gestures, head movements, and acoustic side channels. However, this advancement has paradoxically given birth to a new, potentially more insidious cyber threat: GAZEploit.

In this paper, we unveil GAZEploit, a novel eye-tracking-based attack specifically designed to exploit this eye-tracking information by leveraging the common use of virtual appearances (avatars) in VR applications. This widespread usage significantly enhances the practicality and feasibility of our attack compared to existing methods. GAZEploit takes advantage of this vulnerability to remotely extract gaze estimations and steal sensitive keystroke information across various typing scenarios, including messages, passwords, URLs, emails, and passcodes. Our research, involving 30 participants, achieved over 80% accuracy in keystroke inference. Alarmingly, our study also identified over 15 top-rated apps in the Apple Store as vulnerable to the GAZEploit attack, emphasizing the urgent need for bolstered security measures for this state-of-the-art VR/MR text entry method.
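
To make the attack idea concrete, here is a minimal, purely illustrative sketch of the kind of inference step the abstract describes: given a stream of 2D gaze estimates (as could be recovered from a victim's avatar eye movements), detect fixations — stable gaze dwells that correspond to key "presses" in gaze typing — and map each fixation to the nearest key on an assumed virtual keyboard layout. The key coordinates, thresholds, and function names below are assumptions for illustration, not the paper's actual implementation.

```python
# Illustrative sketch (NOT the paper's method): dispersion-based (I-DT style)
# fixation detection over 2D gaze samples, followed by nearest-key lookup on
# an assumed normalized keyboard layout.
from dataclasses import dataclass
from typing import List, Tuple

# Assumed normalized key centers for a few keys (illustrative layout only).
KEY_CENTERS = {
    "q": (0.05, 0.2), "w": (0.15, 0.2), "e": (0.25, 0.2), "r": (0.35, 0.2),
    "a": (0.08, 0.5), "s": (0.18, 0.5), "d": (0.28, 0.5),
}

@dataclass
class Fixation:
    x: float      # centroid of the dwell
    y: float
    start: int    # first sample index
    end: int      # one past the last sample index

def _dispersion(points) -> float:
    """Sum of x-range and y-range of a window of gaze points."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(gaze: List[Tuple[float, float]],
                     max_dispersion: float = 0.04,
                     min_samples: int = 5) -> List[Fixation]:
    """Detect low-dispersion dwells: a fixation is >= min_samples points
    whose combined spatial spread stays under max_dispersion."""
    fixations, i = [], 0
    while i + min_samples <= len(gaze):
        j = i + min_samples
        if _dispersion(gaze[i:j]) <= max_dispersion:
            # Grow the window while the dispersion stays low.
            while j < len(gaze) and _dispersion(gaze[i:j + 1]) <= max_dispersion:
                j += 1
            xs = [p[0] for p in gaze[i:j]]
            ys = [p[1] for p in gaze[i:j]]
            fixations.append(Fixation(sum(xs) / len(xs), sum(ys) / len(ys), i, j))
            i = j
        else:
            i += 1
    return fixations

def fixation_to_key(f: Fixation) -> str:
    """Map a fixation centroid to the nearest assumed key center."""
    return min(KEY_CENTERS,
               key=lambda k: (KEY_CENTERS[k][0] - f.x) ** 2
                           + (KEY_CENTERS[k][1] - f.y) ** 2)

# Example: a noisy dwell near "s", a saccade sample, then a dwell near "e".
trace = [(0.18, 0.50)] * 6 + [(0.22, 0.35)] + [(0.25, 0.20)] * 6
print("".join(fixation_to_key(f) for f in detect_fixations(trace)))  # -> "se"
```

The sketch shows why gaze streams leak keystrokes at all: the dwell-then-saccade pattern of gaze typing is easy to segment, so any channel that exposes even approximate gaze direction (such as a rendered avatar's eyes) can be turned into a key sequence.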