{"title":"具有环境感知和人机交互能力的多智能体移动机器人系统","authors":"M. Tornow, A. Al-Hamadi, Vinzenz Borrmann","doi":"10.1109/ICSIPA.2013.6708013","DOIUrl":null,"url":null,"abstract":"A multi-agent robot system can speed up exploration or search and rescue operations in dangerous environments by working as a distributed sensor network. Each robot (e.g. Eddi Robot) equipped with a combined 2D/3D sensor (MS Kinect) and additional sensors needs to efficiently exchange its collected data with the other group members for task planning. For environment perception a 2D/3D panorama is generated from a sequence of images which were obtained while the robot was rotating. Furthermore the 2D/3D sensor data is used for a Human-Machine Interaction based on hand postures and gestures. The hand posture classification is realized by an Artificial Neural Network (ANN) which is processing a feature vector composed of Cosine-Descriptors (COD), Hu-moments and geometric features extracted of the hand shape. The System achieves an overall classification rate of more than 93%. It is used within the hand posture and gesture based human machine interface to control the robot team.","PeriodicalId":440373,"journal":{"name":"2013 IEEE International Conference on Signal and Image Processing Applications","volume":"20 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2013-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":"{\"title\":\"A multi-agent mobile robot system with environment perception and HMI capabilities\",\"authors\":\"M. Tornow, A. Al-Hamadi, Vinzenz Borrmann\",\"doi\":\"10.1109/ICSIPA.2013.6708013\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"A multi-agent robot system can speed up exploration or search and rescue operations in dangerous environments by working as a distributed sensor network. Each robot (e.g. Eddi Robot) equipped with a combined 2D/3D sensor (MS Kinect) and additional sensors needs to efficiently exchange its collected data with the other group members for task planning. For environment perception a 2D/3D panorama is generated from a sequence of images which were obtained while the robot was rotating. Furthermore the 2D/3D sensor data is used for a Human-Machine Interaction based on hand postures and gestures. The hand posture classification is realized by an Artificial Neural Network (ANN) which is processing a feature vector composed of Cosine-Descriptors (COD), Hu-moments and geometric features extracted of the hand shape. The System achieves an overall classification rate of more than 93%. 
It is used within the hand posture and gesture based human machine interface to control the robot team.\",\"PeriodicalId\":440373,\"journal\":{\"name\":\"2013 IEEE International Conference on Signal and Image Processing Applications\",\"volume\":\"20 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2013-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"6\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2013 IEEE International Conference on Signal and Image Processing Applications\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICSIPA.2013.6708013\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2013 IEEE International Conference on Signal and Image Processing Applications","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICSIPA.2013.6708013","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A multi-agent robot system can speed up exploration or search-and-rescue operations in dangerous environments by working as a distributed sensor network. Each robot (e.g. the Eddi robot), equipped with a combined 2D/3D sensor (MS Kinect) and additional sensors, needs to exchange its collected data efficiently with the other group members for task planning. For environment perception, a 2D/3D panorama is generated from a sequence of images captured while the robot rotates. The 2D/3D sensor data is also used for human-machine interaction based on hand postures and gestures. Hand posture classification is performed by an artificial neural network (ANN) that processes a feature vector composed of cosine descriptors (COD), Hu moments, and geometric features extracted from the hand shape. The system achieves an overall classification rate of more than 93% and is used within the hand-posture- and gesture-based human-machine interface to control the robot team.
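The abstract does not specify how the cosine descriptors, Hu moments, and geometric features are computed, nor the ANN architecture. The sketch below is one plausible reading of such a feature vector using OpenCV, assuming a binary hand-segmentation mask as input; the function name hand_feature_vector, the interpretation of the cosine descriptors as DCT coefficients of the contour's centroid-distance signature, and the particular geometric features (solidity, aspect ratio, extent) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a hand-shape feature vector (cosine descriptors,
# Hu moments, geometric features), assuming a binary segmentation mask.
# Descriptor definitions are assumptions; the paper's exact method may differ.

import numpy as np
import cv2


def hand_feature_vector(mask, n_cod=16):
    """Build a feature vector from a uint8 binary hand mask (hand pixels = 255)."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    contour = max(contours, key=cv2.contourArea)          # largest blob = hand
    pts = contour[:, 0, :].astype(np.float64)

    # Hu moments, log-scaled to compress their dynamic range
    hu = cv2.HuMoments(cv2.moments(contour)).flatten()
    hu = -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

    # "Cosine descriptors" (assumption): DCT of the centroid-distance
    # signature of the contour, resampled to a fixed even length
    centroid = pts.mean(axis=0)
    dist = np.linalg.norm(pts - centroid, axis=1)
    dist = np.interp(np.linspace(0, len(dist) - 1, 128),
                     np.arange(len(dist)), dist)
    dist = dist / (dist.max() + 1e-12)                    # scale invariance
    cod = cv2.dct(dist.reshape(-1, 1)).flatten()[1:n_cod + 1]

    # Simple geometric features: solidity, aspect ratio, extent
    area = cv2.contourArea(contour)
    hull_area = cv2.contourArea(cv2.convexHull(contour))
    x, y, w, h = cv2.boundingRect(contour)
    geom = np.array([area / (hull_area + 1e-12),
                     w / float(h),
                     area / float(w * h)])

    return np.concatenate([cod, hu, geom]).astype(np.float32)
```

A small feed-forward classifier (for example scikit-learn's MLPClassifier) could then be trained on such vectors to separate the posture classes; whether that matches the ANN used in the paper is not stated in the abstract.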