Zhao-Yuan Wan, Yi-Xing Liu, Xiaochen Zhang, Ruoli Wang
{"title":"一种用于同步注视和运动分析的集成眼睛跟踪和运动捕捉系统。","authors":"Zhao-Yuan Wan, Yi-Xing Liu, Xiaochen Zhang, Ruoli Wang","doi":"10.1109/ICORR58425.2023.10304692","DOIUrl":null,"url":null,"abstract":"<p><p>Integrating mobile eye-tracking and motion capture emerges as a promising approach in studying visual-motor coordination, due to its capability of expressing gaze data within the same laboratory-centered coordinate system as body movement data. In this paper, we proposed an integrated eye-tracking and motion capture system, which can record and analyze temporally and spatially synchronized gaze and motion data during dynamic movement. The accuracy of gaze measurement were evaluated on five participants while they were instructed to view fixed vision targets at different distances while standing still or walking towards the targets. Similar accuracy could be achieved in both static and dynamic conditions. To demonstrate the usability of the integrated system, several walking tasks were performed in three different pathways. Results revealed that participants tended to focus their gaze on the upcoming path, especially on the downward path, possibly for better navigation and planning. In a more complex pathway, coupled with more gaze time on the pathway, participants were also found having the longest step time and shortest step length, which led to the lowest walking speed. It was believed that the integration of eye-tracking and motion capture is a feasible and promising methodology quantifying visual-motor coordination in locomotion.</p>","PeriodicalId":73276,"journal":{"name":"IEEE ... International Conference on Rehabilitation Robotics : [proceedings]","volume":"2023 ","pages":"1-6"},"PeriodicalIF":0.0000,"publicationDate":"2023-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"An Integrated Eye-Tracking and Motion Capture System in Synchronized Gaze and Movement Analysis.\",\"authors\":\"Zhao-Yuan Wan, Yi-Xing Liu, Xiaochen Zhang, Ruoli Wang\",\"doi\":\"10.1109/ICORR58425.2023.10304692\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Integrating mobile eye-tracking and motion capture emerges as a promising approach in studying visual-motor coordination, due to its capability of expressing gaze data within the same laboratory-centered coordinate system as body movement data. In this paper, we proposed an integrated eye-tracking and motion capture system, which can record and analyze temporally and spatially synchronized gaze and motion data during dynamic movement. The accuracy of gaze measurement were evaluated on five participants while they were instructed to view fixed vision targets at different distances while standing still or walking towards the targets. Similar accuracy could be achieved in both static and dynamic conditions. To demonstrate the usability of the integrated system, several walking tasks were performed in three different pathways. Results revealed that participants tended to focus their gaze on the upcoming path, especially on the downward path, possibly for better navigation and planning. In a more complex pathway, coupled with more gaze time on the pathway, participants were also found having the longest step time and shortest step length, which led to the lowest walking speed. 
It was believed that the integration of eye-tracking and motion capture is a feasible and promising methodology quantifying visual-motor coordination in locomotion.</p>\",\"PeriodicalId\":73276,\"journal\":{\"name\":\"IEEE ... International Conference on Rehabilitation Robotics : [proceedings]\",\"volume\":\"2023 \",\"pages\":\"1-6\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-09-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE ... International Conference on Rehabilitation Robotics : [proceedings]\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICORR58425.2023.10304692\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE ... International Conference on Rehabilitation Robotics : [proceedings]","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICORR58425.2023.10304692","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
An Integrated Eye-Tracking and Motion Capture System in Synchronized Gaze and Movement Analysis.
Integrating mobile eye-tracking and motion capture emerges as a promising approach for studying visual-motor coordination, because it can express gaze data in the same laboratory-centered coordinate system as body movement data. In this paper, we propose an integrated eye-tracking and motion capture system that can record and analyze temporally and spatially synchronized gaze and motion data during dynamic movement. The accuracy of gaze measurement was evaluated with five participants who were instructed to view fixed visual targets at different distances while standing still or walking towards the targets. Similar accuracy was achieved in both static and dynamic conditions. To demonstrate the usability of the integrated system, several walking tasks were performed on three different pathways. Results revealed that participants tended to focus their gaze on the upcoming path, especially on the downward path, possibly for better navigation and planning. On the more complex pathway, participants spent more gaze time on the path and also showed the longest step time and shortest step length, which led to the lowest walking speed. We believe that the integration of eye-tracking and motion capture is a feasible and promising methodology for quantifying visual-motor coordination in locomotion.
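The abstract does not detail the implementation, but the core operation of such an integration can be sketched: resample the gaze stream onto the motion-capture timeline, then rotate and translate each gaze vector from the head-mounted tracker's frame into the laboratory frame using the head pose tracked by motion capture. The Python/NumPy sketch below is an illustration under assumed data layouts, not the authors' code; the function names, the shared-clock assumption, the eye-offset parameter, and the ground-plane fixation estimate are all hypothetical.

```python
# A minimal sketch (not the authors' implementation) of expressing a
# head-mounted eye tracker's gaze direction in a laboratory-centered
# coordinate frame, using a head pose obtained from motion capture.
import numpy as np


def resample_to_mocap(gaze_t, gaze_dirs, mocap_t):
    """Linearly interpolate gaze samples onto the motion-capture timestamps.

    gaze_t    : (N,) gaze timestamps in seconds (shared clock assumed)
    gaze_dirs : (N, 3) unit gaze vectors in the eye-tracker (head) frame
    mocap_t   : (M,) motion-capture timestamps in seconds
    returns   : (M, 3) gaze vectors aligned with the mocap samples
    """
    aligned = np.column_stack(
        [np.interp(mocap_t, gaze_t, gaze_dirs[:, k]) for k in range(3)]
    )
    # Re-normalise after interpolation so the vectors stay unit length.
    return aligned / np.linalg.norm(aligned, axis=1, keepdims=True)


def gaze_to_lab(R_head, p_head, gaze_dir_head, eye_offset_head):
    """Express one gaze ray in the laboratory frame.

    R_head          : (3, 3) rotation of the head segment in the lab frame
    p_head          : (3,) head segment origin in the lab frame
    gaze_dir_head   : (3,) unit gaze direction in the head/eye-tracker frame
    eye_offset_head : (3,) eye position relative to the head origin (head frame)
    returns         : (origin, direction) of the gaze ray in lab coordinates
    """
    origin_lab = p_head + R_head @ eye_offset_head
    direction_lab = R_head @ gaze_dir_head
    return origin_lab, direction_lab / np.linalg.norm(direction_lab)


def fixation_on_ground(origin_lab, direction_lab, ground_z=0.0):
    """Intersect the gaze ray with a horizontal ground plane (z = ground_z)."""
    if abs(direction_lab[2]) < 1e-6:
        return None  # gaze nearly parallel to the ground, no intersection
    s = (ground_z - origin_lab[2]) / direction_lab[2]
    return origin_lab + s * direction_lab if s > 0 else None
```

Intersecting the lab-frame gaze ray with the ground plane, as in the last helper, is one plausible way to estimate where on the walking path a participant is looking, which is the kind of quantity needed to compute measures such as gaze time on the pathway.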