{"title":"Adaptive navigation assistance based on eye movement features in virtual reality","authors":"Song Zhao, Shiwei Cheng","doi":"10.1016/j.vrih.2022.07.003","DOIUrl":null,"url":null,"abstract":"<div><h3>Background</h3><p>Navigation assistance is very important for users when roaming in virtual reality scenes, however, the traditional navigation method requires users to manually request a map for viewing, which leads to low immersion and poor user experience.</p></div><div><h3>Methods</h3><p>To address this issue, first, we collected data when users need navigation assistance in a virtual reality environment, including various eye movement features such as gaze fixation, pupil size, and gaze angle, etc. After that, we used the Boostingbased XGBoost algorithm to train a prediction model, and finally used it to predict whether users need navigation assistance in a roaming task.</p></div><div><h3>Results</h3><p>After evaluating the performance of the model, the accuracy, precision, recall, and F1-score of our model reached about 95%. In addition, by applying the model to a virtual reality scene, an adaptive navigation assistance system based on the user’s real-time eye movement data was implemented.</p></div><div><h3>Conclusions</h3><p>Compared with traditional navigation assistance methods, our new adaptive navigation assistance could enable the user to be more immersive and effective during roaming in VR environment.</p></div>","PeriodicalId":33538,"journal":{"name":"Virtual Reality Intelligent Hardware","volume":"5 3","pages":"Pages 232-248"},"PeriodicalIF":0.0000,"publicationDate":"2023-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Virtual Reality Intelligent Hardware","FirstCategoryId":"1093","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S209657962200064X","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"Computer Science","Score":null,"Total":0}
Abstract
Background
Navigation assistance is important for users roaming in virtual reality (VR) scenes. However, traditional navigation methods require users to manually request a map for viewing, which reduces immersion and degrades the user experience.
Methods
To address this issue, we first collected data from users who needed navigation assistance in a virtual reality environment, including eye movement features such as gaze fixation, pupil size, and gaze angle. We then used the Boosting-based XGBoost algorithm to train a prediction model and applied it to predict whether users need navigation assistance during a roaming task.
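The abstract does not give implementation details; the following is a minimal sketch, not the authors' code, of how such a binary classifier might be trained with XGBoost. The feature columns (fixation_duration, pupil_size, gaze_angle), the label column, and the CSV file name are hypothetical placeholders.

```python
# Minimal sketch (not the authors' code): training a binary classifier on
# eye movement features with XGBoost. Column names and the CSV layout are
# hypothetical placeholders.
import pandas as pd
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Hypothetical dataset: one row per time window of eye-tracking data,
# labelled 1 if the user requested navigation assistance in that window.
data = pd.read_csv("eye_movement_features.csv")
X = data[["fixation_duration", "pupil_size", "gaze_angle"]]
y = data["needs_assistance"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

model = XGBClassifier(
    n_estimators=200,     # number of boosting rounds
    max_depth=4,          # depth of each boosted tree
    learning_rate=0.1,
    eval_metric="logloss",
)
model.fit(X_train, y_train)
```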
Results
In the evaluation of the model's performance, its accuracy, precision, recall, and F1-score all reached approximately 95%. In addition, by applying the model in a virtual reality scene, we implemented an adaptive navigation assistance system driven by the user's real-time eye movement data.
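For reference, a sketch of computing the four reported metrics on held-out predictions, continuing the hypothetical variables from the training sketch above, might look like this:

```python
# Minimal sketch: evaluating the trained classifier with the four metrics
# reported in the abstract. Variables continue the hypothetical example above.
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

y_pred = model.predict(X_test)
print("accuracy :", accuracy_score(y_test, y_pred))
print("precision:", precision_score(y_test, y_pred))
print("recall   :", recall_score(y_test, y_pred))
print("F1-score :", f1_score(y_test, y_pred))
```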
Conclusions
Compared with traditional navigation assistance methods, our adaptive navigation assistance allows users to remain more immersed and roam more effectively in a VR environment.