GAZEploit: Remote Keystroke Inference Attack by Gaze Estimation from Avatar Views in VR/MR Devices

Hanqiu Wang, Zihao Zhan, Haoqi Shan, Siqi Dai, Max Panoff, Shuo Wang
arXiv:2409.08122 · arXiv - CS - Human-Computer Interaction · Published 2024-09-12
Citations: 0

Abstract

The advent and growing popularity of Virtual Reality (VR) and Mixed Reality (MR) solutions have revolutionized the way we interact with digital platforms. The cutting-edge gaze-controlled typing methods now prevalent in high-end models of these devices, e.g., Apple Vision Pro, have not only improved user experience but also mitigated traditional keystroke inference attacks that relied on hand gestures, head movements, and acoustic side channels. However, this advancement has paradoxically given birth to a new, potentially more insidious cyber threat: GAZEploit. In this paper, we unveil GAZEploit, a novel eye-tracking-based attack specifically designed to exploit this eye-tracking information by leveraging the common use of virtual appearances in VR applications. This widespread usage significantly enhances the practicality and feasibility of our attack compared to existing methods. GAZEploit takes advantage of this vulnerability to remotely extract gaze estimations and steal sensitive keystroke information across various typing scenarios, including messages, passwords, URLs, emails, and passcodes. Our research, involving 30 participants, achieved over 80% accuracy in keystroke inference. Alarmingly, our study also identified over 15 top-rated apps in the Apple Store as vulnerable to the GAZEploit attack, emphasizing the urgent need for bolstered security measures for this state-of-the-art VR/MR text entry method.
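The core idea the abstract describes — turning remotely observed gaze estimates into keystrokes — can be illustrated with a minimal sketch. The keyboard layout, coordinate system, and nearest-neighbor decoding below are illustrative assumptions for exposition, not the paper's actual pipeline (which the abstract does not detail):

```python
import math

# Illustrative QWERTY layout: key -> (x, y) center in a normalized
# [0, 1] x [0, 1] keyboard plane. Assumed geometry, not from the paper.
ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]
KEY_CENTERS = {}
for r, row in enumerate(ROWS):
    for c, key in enumerate(row):
        # Each successive row is offset slightly, as on a physical keyboard.
        KEY_CENTERS[key] = ((c + 0.5 + 0.25 * r) / 10.0, (r + 0.5) / 3.0)

def infer_key(gaze_xy):
    """Classify one gaze fixation as the key whose center is nearest."""
    return min(KEY_CENTERS, key=lambda k: math.dist(gaze_xy, KEY_CENTERS[k]))

# A fixation landing near the center of 'g' is decoded as 'g'.
print(infer_key(KEY_CENTERS["g"]))
```

In an attack of this shape, the adversary would first segment the victim's gaze trace into dwell/fixation events (one per keystroke) and then decode each fixation as above; the hard parts the paper addresses — recovering gaze direction from the avatar's rendered eyes and detecting typing sessions — are outside this sketch.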