Walking Balance Assessment with Eye-tracking and Spatial Data Visualization

Zhu Wang, Anat V. Lubetzky, K. Perlin
{"title":"Walking Balance Assessment with Eye-tracking and Spatial Data Visualization","authors":"Zhu Wang, Anat V. Lubetzky, K. Perlin","doi":"10.1145/3450615.3464533","DOIUrl":null,"url":null,"abstract":"Virtual Reality (VR) based assessment systems can simulate diverse real-life scenarios and help clinicians assess participants’ performance under controlled functional contexts. Our previous work demonstrated an assessment paradigm to provide multi-sensory stimuli and cognitive load, and quantify walking balance with obstacle negotiation by motion capture and pressure sensing. However, we need to fill two gaps to make it more clinically relevant: 1. it required offline complex data processing with external statistical analysis software; 2. it utilized motion tracking but overlooked eye movement. Therefore, we present a novel walking balance assessment system with eye tracking to investigate the role of eye movement in walking balance and spatial data visualization to better interpret and understand the experimental data. The spatial visualization includes instantaneous in-situ VR replay for the gaze, head, and feet; and data plots for the outcome measures. The system fills a need to provide eye tracking and intuitive feedback in VR to experimenters, clinicians, and participants in real-time.","PeriodicalId":439895,"journal":{"name":"ACM SIGGRAPH 2021 Immersive Pavilion","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-08-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACM SIGGRAPH 2021 Immersive Pavilion","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3450615.3464533","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Virtual Reality (VR) based assessment systems can simulate diverse real-life scenarios and help clinicians assess participants' performance in controlled functional contexts. Our previous work demonstrated an assessment paradigm that provides multi-sensory stimuli and cognitive load and quantifies walking balance with obstacle negotiation via motion capture and pressure sensing. However, two gaps must be filled to make it more clinically relevant: (1) it required complex offline data processing with external statistical analysis software, and (2) it utilized motion tracking but overlooked eye movement. Therefore, we present a novel walking balance assessment system that adds eye tracking to investigate the role of eye movement in walking balance, and spatial data visualization to better interpret and understand the experimental data. The spatial visualization includes instantaneous in-situ VR replay of the gaze, head, and feet, as well as data plots of the outcome measures. The system fills a need for real-time eye tracking and intuitive in-VR feedback for experimenters, clinicians, and participants.
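
To make the record-and-replay idea concrete, the sketch below shows one way per-frame gaze, head, and feet poses could be logged during a trial and paced back out for in-situ VR replay. This is a minimal illustration, not the authors' implementation; all names (FrameSample, TrialRecorder, replay) and the specific fields are hypothetical assumptions.

```python
# Hypothetical sketch of per-frame logging and time-paced replay of gaze, head,
# and feet data. Not the system's actual code; names and fields are assumed.
from dataclasses import dataclass, field
from typing import Iterator, List, Tuple
import time

Vec3 = Tuple[float, float, float]

@dataclass
class FrameSample:
    t: float            # seconds since trial start
    gaze_origin: Vec3   # eye-tracker gaze ray origin in world space
    gaze_dir: Vec3      # normalized gaze direction
    head_pos: Vec3      # HMD position
    left_foot: Vec3     # tracked foot positions
    right_foot: Vec3

@dataclass
class TrialRecorder:
    samples: List[FrameSample] = field(default_factory=list)

    def log(self, sample: FrameSample) -> None:
        """Append one frame of tracking data; called once per rendered frame."""
        self.samples.append(sample)

    def replay(self, speed: float = 1.0) -> Iterator[FrameSample]:
        """Yield samples paced by their original timestamps for in-situ replay."""
        if not self.samples:
            return
        start = time.monotonic()
        t0 = self.samples[0].t
        for s in self.samples:
            # Wait until scaled playback time catches up to this sample's timestamp.
            while (time.monotonic() - start) * speed < (s.t - t0):
                time.sleep(0.001)
            yield s
```

During replay, each yielded sample would drive markers or trajectory ribbons for the gaze, head, and feet rendered in the same virtual scene (the "in-situ" aspect), and the same logged samples could feed the data plots for the outcome measures.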