Immersive virtual reality for learning exoskeleton-like virtual walking: a feasibility study.

IF 5.2 | Region 2 (Medicine) | Q1 Engineering, Biomedical
Antonio Rodríguez-Fernández, Alex van den Berg, Salvatore Luca Cucinella, Joan Lobo-Prat, Josep M Font-Llagunes, Laura Marchal-Crespo
{"title":"Immersive virtual reality for learning exoskeleton-like virtual walking: a feasibility study.","authors":"Antonio Rodríguez-Fernández, Alex van den Berg, Salvatore Luca Cucinella, Joan Lobo-Prat, Josep M Font-Llagunes, Laura Marchal-Crespo","doi":"10.1186/s12984-024-01482-y","DOIUrl":null,"url":null,"abstract":"<p><strong>Purpose: </strong>Virtual Reality (VR) has proven to be an effective tool for motor (re)learning. Furthermore, with the current commercialization of low-cost head-mounted displays (HMDs), immersive virtual reality (IVR) has become a viable rehabilitation tool. Nonetheless, it is still an open question how immersive virtual environments should be designed to enhance motor learning, especially to support the learning of complex motor tasks. An example of such a complex task is triggering steps while wearing lower-limb exoskeletons as it requires the learning of several sub-tasks, e.g., shifting the weight from one leg to the other, keeping the trunk upright, and initiating steps. This study aims to find the necessary elements in VR to promote motor learning of complex virtual gait tasks.</p><p><strong>Methods: </strong>In this study, we developed an HMD-IVR-based system for training to control wearable lower-limb exoskeletons for people with sensorimotor disorders. The system simulates a virtual walking task of an avatar resembling the sub-tasks needed to trigger steps with an exoskeleton. We ran an experiment with forty healthy participants to investigate the effects of first- (1PP) vs. third-person perspective (3PP) and the provision (or not) of concurrent visual feedback of participants' movements on the walking performance - namely number of steps, trunk inclination, and stride length -, as well as the effects on embodiment, usability, cybersickness, and perceived workload.</p><p><strong>Results: </strong>We found that all participants learned to execute the virtual walking task. However, no clear interaction of perspective and visual feedback improved the learning of all sub-tasks concurrently. Instead, the key seems to lie in selecting the appropriate perspective and visual feedback for each sub-task. Notably, participants embodied the avatar across all training modalities with low cybersickness levels. Still, participants' cognitive load remained high, leading to marginally acceptable usability scores.</p><p><strong>Conclusions: </strong>Our findings suggest that to maximize learning, users should train sub-tasks sequentially using the most suitable combination of person's perspective and visual feedback for each sub-task. This research offers valuable insights for future developments in IVR to support individuals with sensorimotor disorders in improving the learning of walking with wearable exoskeletons.</p>","PeriodicalId":16384,"journal":{"name":"Journal of NeuroEngineering and Rehabilitation","volume":null,"pages":null},"PeriodicalIF":5.2000,"publicationDate":"2024-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of NeuroEngineering and Rehabilitation","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1186/s12984-024-01482-y","RegionNum":2,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, BIOMEDICAL","Score":null,"Total":0}
Citations: 0

Abstract

Purpose: Virtual Reality (VR) has proven to be an effective tool for motor (re)learning. Furthermore, with the current commercialization of low-cost head-mounted displays (HMDs), immersive virtual reality (IVR) has become a viable rehabilitation tool. Nonetheless, it is still an open question how immersive virtual environments should be designed to enhance motor learning, especially to support the learning of complex motor tasks. An example of such a complex task is triggering steps while wearing a lower-limb exoskeleton, as it requires learning several sub-tasks, e.g., shifting the weight from one leg to the other, keeping the trunk upright, and initiating steps. This study aims to find the necessary elements in VR to promote motor learning of complex virtual gait tasks.

Methods: In this study, we developed an HMD-IVR-based system for training people with sensorimotor disorders to control wearable lower-limb exoskeletons. The system simulates a virtual walking task in which an avatar reproduces the sub-tasks needed to trigger steps with an exoskeleton. We ran an experiment with forty healthy participants to investigate the effects of first-person (1PP) vs. third-person perspective (3PP) and the provision (or not) of concurrent visual feedback of participants' movements on walking performance (namely, number of steps, trunk inclination, and stride length), as well as the effects on embodiment, usability, cybersickness, and perceived workload.
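
As an illustration of how such walking performance could be quantified, the sketch below computes the three metrics named above (number of steps, trunk inclination, and stride length) from tracked positions. It is a minimal sketch, not the authors' implementation: the tracker layout (head, pelvis, and one tracker per foot), the y-up/z-forward coordinate convention, and the 2 cm foot-contact threshold are all assumptions made for illustration.

```python
# Hypothetical sketch (not taken from the paper): derive the three walking
# performance metrics from tracked 3D positions sampled over one trial.
# Assumptions: position arrays of shape (T, 3), y is up, z is the walking direction.
import numpy as np

def trunk_inclination_deg(head_pos, pelvis_pos):
    """Mean inclination of the pelvis-to-head segment from vertical, in degrees."""
    trunk = head_pos - pelvis_pos                      # (T, 3) pelvis -> head vectors
    vertical = np.array([0.0, 1.0, 0.0])               # assumed y-up convention
    cos_angle = (trunk @ vertical) / np.linalg.norm(trunk, axis=1)
    return float(np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0))).mean())

def foot_strikes(foot_height, floor_height=0.0, contact_tol=0.02):
    """Sample indices where a foot enters the (assumed) 2 cm contact band above the floor."""
    in_contact = foot_height < floor_height + contact_tol
    return np.flatnonzero(in_contact[1:] & ~in_contact[:-1]) + 1

def gait_metrics(left_foot, right_foot, head, pelvis):
    """Return (number of steps, mean trunk inclination [deg], mean stride length [m])."""
    left_idx = foot_strikes(left_foot[:, 1])
    right_idx = foot_strikes(right_foot[:, 1])
    n_steps = len(left_idx) + len(right_idx)
    # Stride length: forward distance between consecutive strikes of the same foot,
    # pooled over both feet.
    strides = np.concatenate([
        np.abs(np.diff(left_foot[left_idx, 2])),
        np.abs(np.diff(right_foot[right_idx, 2])),
    ])
    stride_length = float(strides.mean()) if strides.size else 0.0
    return n_steps, trunk_inclination_deg(head, pelvis), stride_length
```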

Results: We found that all participants learned to execute the virtual walking task. However, no clear interaction of perspective and visual feedback improved the learning of all sub-tasks concurrently. Instead, the key seems to lie in selecting the appropriate perspective and visual feedback for each sub-task. Notably, participants embodied the avatar across all training modalities with low cybersickness levels. Still, participants' cognitive load remained high, leading to marginally acceptable usability scores.

Conclusions: Our findings suggest that to maximize learning, users should train sub-tasks sequentially using the most suitable combination of perspective and visual feedback for each sub-task. This research offers valuable insights for future developments in IVR to support individuals with sensorimotor disorders in learning to walk with wearable exoskeletons.

Source journal
Journal of NeuroEngineering and Rehabilitation (Engineering, Biomedical)
CiteScore: 9.60
Self-citation rate: 3.90%
Articles per year: 122
Review time: 24 months
Journal description: Journal of NeuroEngineering and Rehabilitation considers manuscripts on all aspects of research that result from cross-fertilization of the fields of neuroscience, biomedical engineering, and physical medicine & rehabilitation.