Xuan Zhao, Sakmongkon Chumkamon, Shuangda Duan, Juan Rojas, Jia Pan
{"title":"基于LSTM-RNN的人机协同运动生成","authors":"Xuan Zhao, Sakmongkon Chumkamon, Shuangda Duan, Juan Rojas, Jia Pan","doi":"10.1109/HUMANOIDS.2018.8625068","DOIUrl":null,"url":null,"abstract":"We propose a deep learning based method for fast and responsive human-robot handovers that generate robot motion according to human motion observations. Our method learns an offline human-robot interaction model through a Recurrent Neural Network with Long Short-Term Memory units (LSTM-RNN). The robot uses the learned network to respond appropriately to novel online human motions. Our method is tested both on pre-recorded data and real-world human-robot handover experiments. Our method achieves robot motion accuracies that outperform the baseline. In addition, our method demonstrates a strong ability to adapt to changes in velocity of human motions.","PeriodicalId":433345,"journal":{"name":"2018 IEEE-RAS 18th International Conference on Humanoid Robots (Humanoids)","volume":"28 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"16","resultStr":"{\"title\":\"Collaborative Human-Robot Motion Generation Using LSTM-RNN\",\"authors\":\"Xuan Zhao, Sakmongkon Chumkamon, Shuangda Duan, Juan Rojas, Jia Pan\",\"doi\":\"10.1109/HUMANOIDS.2018.8625068\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We propose a deep learning based method for fast and responsive human-robot handovers that generate robot motion according to human motion observations. Our method learns an offline human-robot interaction model through a Recurrent Neural Network with Long Short-Term Memory units (LSTM-RNN). The robot uses the learned network to respond appropriately to novel online human motions. Our method is tested both on pre-recorded data and real-world human-robot handover experiments. Our method achieves robot motion accuracies that outperform the baseline. In addition, our method demonstrates a strong ability to adapt to changes in velocity of human motions.\",\"PeriodicalId\":433345,\"journal\":{\"name\":\"2018 IEEE-RAS 18th International Conference on Humanoid Robots (Humanoids)\",\"volume\":\"28 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-11-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"16\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2018 IEEE-RAS 18th International Conference on Humanoid Robots (Humanoids)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/HUMANOIDS.2018.8625068\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 IEEE-RAS 18th International Conference on Humanoid Robots (Humanoids)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/HUMANOIDS.2018.8625068","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Collaborative Human-Robot Motion Generation Using LSTM-RNN
We propose a deep-learning-based method for fast and responsive human-robot handovers that generates robot motion from observations of human motion. Our method learns an offline human-robot interaction model using a Recurrent Neural Network with Long Short-Term Memory units (LSTM-RNN). The robot uses the learned network to respond appropriately to novel human motions online. We test our method both on pre-recorded data and in real-world human-robot handover experiments, where it achieves robot motion accuracies that outperform the baseline. In addition, our method demonstrates a strong ability to adapt to changes in the velocity of human motions.
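As a rough illustration of the kind of model the abstract describes, the sketch below trains an LSTM-RNN that maps a sequence of human motion observations to robot motion targets. All names, dimensions, and hyperparameters here (HandoverLSTM, human_dim, hidden_dim, and so on) are illustrative assumptions, not details taken from the paper; the authors' actual architecture, features, and training setup may differ.

```python
# Minimal sketch of an LSTM-RNN motion model, assuming a regression setup:
# a sequence of human motion features in, a sequence of robot motion
# targets out. Dimensions and names are hypothetical.
import torch
import torch.nn as nn

class HandoverLSTM(nn.Module):
    def __init__(self, human_dim=7, robot_dim=7, hidden_dim=64, num_layers=2):
        super().__init__()
        # LSTM consumes human motion observations (e.g., hand pose
        # features) one timestep at a time.
        self.lstm = nn.LSTM(human_dim, hidden_dim, num_layers, batch_first=True)
        # Linear head regresses a robot motion target per timestep.
        self.head = nn.Linear(hidden_dim, robot_dim)

    def forward(self, human_seq, state=None):
        # human_seq: (batch, time, human_dim)
        out, state = self.lstm(human_seq, state)
        return self.head(out), state  # (batch, time, robot_dim)

# Offline training on pre-recorded human/robot motion pairs
# (random placeholder tensors stand in for real recordings).
model = HandoverLSTM()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

human_batch = torch.randn(8, 50, 7)  # placeholder recorded human motion
robot_batch = torch.randn(8, 50, 7)  # placeholder paired robot motion

optimizer.zero_grad()
pred, _ = model(human_batch)
loss = loss_fn(pred, robot_batch)
loss.backward()
optimizer.step()
```

At run time, the LSTM hidden state can be carried across successive observation windows, which is what lets a recurrent model of this kind respond online to novel human motions rather than requiring a complete trajectory up front.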