Human Motion Prediction Based on Visual Tracking
Juncheng Zou, Weihua Yin, E. X. Wang, Jiancheng Wang, Yan-Feng Lu
2019 4th International Conference on Robotics and Automation Engineering (ICRAE), November 2019
DOI: 10.1109/ICRAE48301.2019.9043816
Abstract
Following and moving according to human motion is an important task for mobile robots. To ensure compliant motion planning and execution, mobile robots cannot rely on real-time visual information alone. During tracking, a robot must first recognize and track the target, then plan and execute its motion. In practical applications the environment is complex, with illumination changes, shadows, and occlusion, so traditional visual tracking algorithms often drift from or lose the target. Therefore, to achieve fast human motion tracking, it is necessary to predict human motion through video prediction. In this paper, we propose a video-prediction-based human motion tracking algorithm for mobile robots. (1) A multi-layer generative adversarial recurrent network is trained on an offline video dataset to learn to predict human motion. (2) The pre-trained model is used to predict the state of the tracking target in the video. (3) The video prediction model is integrated into the human-tracking algorithm of a mobile robot human-following system. Experiments show that the proposed algorithm can track human motion with reasonable speed and precision.
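As a rough illustration of the prediction component described above, the sketch below shows a minimal recurrent generative adversarial network for next-frame prediction in PyTorch: a convolutional encoder feeds an LSTM over past frames, a decoder produces the predicted next frame, and a frame discriminator supplies the adversarial signal. The paper does not publish architecture or training details, so the layer sizes, the 64x64 frame resolution, the single-LSTM recurrence, and the added L1 reconstruction term are assumptions made for illustration only, not the authors' exact model.

```python
# Hedged sketch of a recurrent video-prediction GAN (assumed architecture,
# not the paper's published model). Frames are assumed to be 3x64x64.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Generator(nn.Module):
    """Encodes each past frame, propagates an LSTM state, decodes the next frame."""

    def __init__(self, hidden=256):
        super().__init__()
        self.encoder = nn.Sequential(                  # 3x64x64 -> 64x8x8
            nn.Conv2d(3, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
        )
        self.rnn = nn.LSTM(64 * 8 * 8, hidden, batch_first=True)
        self.decoder = nn.Sequential(                  # hidden -> 3x64x64
            nn.Linear(hidden, 64 * 8 * 8), nn.ReLU(),
            nn.Unflatten(1, (64, 8, 8)),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, frames):                         # frames: (B, T, 3, 64, 64)
        b, t = frames.shape[:2]
        feats = self.encoder(frames.flatten(0, 1)).flatten(1)  # (B*T, 4096)
        _, (h, _) = self.rnn(feats.view(b, t, -1))             # last hidden state
        return self.decoder(h[-1])                              # predicted next frame


class Discriminator(nn.Module):
    """Scores a single frame as observed (real) or predicted (fake)."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Flatten(), nn.Linear(64 * 16 * 16, 1),
        )

    def forward(self, frame):
        return self.net(frame)


def train_step(gen, disc, g_opt, d_opt, past, future):
    """One adversarial update on a batch of (past frames, true next frame)."""
    bce = nn.BCEWithLogitsLoss()
    fake = gen(past)

    # Discriminator: distinguish the true next frame from the predicted one.
    d_opt.zero_grad()
    real_logits = disc(future)
    fake_logits = disc(fake.detach())
    d_loss = bce(real_logits, torch.ones_like(real_logits)) + \
             bce(fake_logits, torch.zeros_like(fake_logits))
    d_loss.backward()
    d_opt.step()

    # Generator: fool the discriminator while staying close to the true frame.
    g_opt.zero_grad()
    adv_logits = disc(fake)
    g_loss = bce(adv_logits, torch.ones_like(adv_logits)) + F.l1_loss(fake, future)
    g_loss.backward()
    g_opt.step()
    return d_loss.item(), g_loss.item()
```

At inference time, a tracker could feed the last few observed frames through the trained generator to obtain a predicted frame and localize the target in it, compensating for sensing and planning latency; the details of that integration in the paper's human-following system are not specified in the abstract.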