Author: Niklas Stein
DOI: 10.1109/VRW52623.2021.00246
Published in: 2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), March 2021
Analyzing Visual Perception and Predicting Locomotion using Virtual Reality and Eye Tracking
Locomotion and vision are closely linked. When users explore virtual environments by walking, they rely on stable visible landmarks to plan and execute their next movement. In my research I am developing novel methods to predict the locomotion paths of human subjects over the immediate future, i.e. the next few seconds. I aim to connect different types of behavioral data (eye, hand, foot and head tracking) and test their reliability and validity for predicting walking behavior in virtual reality. Such a prediction would be very valuable for natural interaction, for example in redirected walking schemes.

My approach begins with an evaluation of the quality of data gathered with current tracking methods. Informative experimental conditions need to be developed to find meaningful patterns in natural walking. Next, raw tracking data from the different modalities need to be connected with each other and aggregated in a useful way. Thereafter, candidate predictors need to be developed and compared to existing prediction algorithms (e.g. [2], [6], [12]). As a final goal, all valid predictors shall be combined into a prediction algorithm that returns the most likely future path when exploring virtual environments.
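To make the proposed pipeline concrete, the sketch below shows one simple way such a short-horizon path predictor could look: extrapolating recent head positions at constant velocity and blending the heading with the current gaze direction. This is an illustrative baseline only, not the method from the abstract; the function name, the blending weight, and the constant-velocity assumption are all hypothetical choices made here for demonstration.

```python
import numpy as np

def predict_path(head_pos, gaze_dir, dt, horizon=2.0, gaze_weight=0.3):
    """Hypothetical baseline predictor (illustration, not the author's method).

    Extrapolates future walking positions from recent head positions
    (constant velocity), blended with the current horizontal gaze
    direction, which often anticipates the locomotion goal.

    head_pos : (N, 2) recent head positions on the ground plane
    gaze_dir : (2,) unit vector, current horizontal gaze direction
    dt       : sampling interval of head_pos, in seconds
    horizon  : how far ahead to predict, in seconds
    Returns an (M, 2) array of predicted positions at future sample times.
    """
    head_pos = np.asarray(head_pos, dtype=float)
    gaze_dir = np.asarray(gaze_dir, dtype=float)

    # Average velocity over the recent window of head samples.
    vel = (head_pos[-1] - head_pos[0]) / ((len(head_pos) - 1) * dt)
    speed = np.linalg.norm(vel)

    if speed > 0:
        heading = vel / speed
        # Blend the walking heading with the gaze direction, renormalize.
        blended = (1 - gaze_weight) * heading + gaze_weight * gaze_dir
        blended /= np.linalg.norm(blended)
    else:
        # Standing still: fall back on gaze as the only directional cue.
        blended = gaze_dir

    # Predicted positions at future sample times dt, 2*dt, ..., horizon.
    times = np.arange(dt, horizon + dt / 2, dt)
    return head_pos[-1] + speed * times[:, None] * blended
```

In a full system, predictors of this kind would be compared against the existing prediction algorithms cited above, using the reliability and validity measures developed in the earlier steps.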