G. Asuni, G. Teti, C. Laschi, Eugenio Guglielmelli, Paolo Dario
{"title":"神经机器人操作平台的仿生感觉-运动神经模型","authors":"G. Asuni, G. Teti, C. Laschi, Eugenio Guglielmelli, Paolo Dario","doi":"10.1109/ICAR.2005.1507471","DOIUrl":null,"url":null,"abstract":"This paper presents a neural model for visuo-motor coordination of a redundant robotic manipulator in reaching tasks. The model was developed for, and experimentally validated on, a neurobotic platform for manipulation. The proposed approach is based on a biologically-inspired model, which replicates the human brain capability of creating associations between motor and sensory data, by learning. The model is implemented here by self-organizing neural maps. During learning, the system creates relations between the motor data associated to endogenous movements performed by the robotic arm and the sensory consequences of such motor actions, i.e. the final position of the end effector. The learnt relations are stored in the neural map structure and are then used, after learning, for generating motor commands aimed at reaching a given point in 3D space. The approach proposed here allows to solve the inverse kinematics and joint redundancy problems for different robotic arms, with good accuracy and robustness. In order to validate this, the same implementation has been tested on a PUMA robot, too. Experimental trials confirmed the system capability to control the end effector position and also to manage the redundancy of the robotic manipulator in reaching the 3D target point even with additional constraints, such as one or more clamped joints, tools of variable lengths, or no visual feedback, without additional learning phases","PeriodicalId":428475,"journal":{"name":"ICAR '05. Proceedings., 12th International Conference on Advanced Robotics, 2005.","volume":"73 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2005-07-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"10","resultStr":"{\"title\":\"A bio-inspired sensory-motor neural model for a neuro-robotic manipulation platform\",\"authors\":\"G. Asuni, G. Teti, C. Laschi, Eugenio Guglielmelli, Paolo Dario\",\"doi\":\"10.1109/ICAR.2005.1507471\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper presents a neural model for visuo-motor coordination of a redundant robotic manipulator in reaching tasks. The model was developed for, and experimentally validated on, a neurobotic platform for manipulation. The proposed approach is based on a biologically-inspired model, which replicates the human brain capability of creating associations between motor and sensory data, by learning. The model is implemented here by self-organizing neural maps. During learning, the system creates relations between the motor data associated to endogenous movements performed by the robotic arm and the sensory consequences of such motor actions, i.e. the final position of the end effector. The learnt relations are stored in the neural map structure and are then used, after learning, for generating motor commands aimed at reaching a given point in 3D space. The approach proposed here allows to solve the inverse kinematics and joint redundancy problems for different robotic arms, with good accuracy and robustness. In order to validate this, the same implementation has been tested on a PUMA robot, too. 
Experimental trials confirmed the system capability to control the end effector position and also to manage the redundancy of the robotic manipulator in reaching the 3D target point even with additional constraints, such as one or more clamped joints, tools of variable lengths, or no visual feedback, without additional learning phases\",\"PeriodicalId\":428475,\"journal\":{\"name\":\"ICAR '05. Proceedings., 12th International Conference on Advanced Robotics, 2005.\",\"volume\":\"73 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2005-07-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"10\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ICAR '05. Proceedings., 12th International Conference on Advanced Robotics, 2005.\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICAR.2005.1507471\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ICAR '05. Proceedings., 12th International Conference on Advanced Robotics, 2005.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICAR.2005.1507471","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A bio-inspired sensory-motor neural model for a neuro-robotic manipulation platform
This paper presents a neural model for visuo-motor coordination of a redundant robotic manipulator in reaching tasks. The model was developed for, and experimentally validated on, a neuro-robotic platform for manipulation. The proposed approach is based on a biologically inspired model that replicates, through learning, the human brain's capability of creating associations between motor and sensory data. The model is implemented here by self-organizing neural maps. During learning, the system creates relations between the motor data associated with endogenous movements performed by the robotic arm and the sensory consequences of those motor actions, i.e. the final position of the end effector. The learnt relations are stored in the neural map structure and are then used, after learning, to generate motor commands aimed at reaching a given point in 3D space. The proposed approach solves the inverse kinematics and joint redundancy problems for different robotic arms with good accuracy and robustness. To validate this, the same implementation was also tested on a PUMA robot. Experimental trials confirmed the system's capability to control the end-effector position and to manage the redundancy of the robotic manipulator in reaching the 3D target point, even with additional constraints such as one or more clamped joints, tools of variable lengths, or no visual feedback, without additional learning phases.
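To illustrate the general idea described in the abstract, the following is a minimal sketch (not the authors' implementation) of a Kohonen-style self-organizing map that learns associations between random joint configurations ("endogenous movements") and the end-effector positions they produce, and then answers reaching queries by recalling the motor vector stored at the best-matching sensory node. The planar 3-link arm, grid size, and all learning parameters are illustrative assumptions, not values from the paper.

```python
# Sketch only: SOM-based sensorimotor association for reaching.
# Assumes a planar 3-link arm with made-up link lengths; the real platform
# in the paper is a redundant 3D manipulator (also tested on a PUMA robot).
import numpy as np

rng = np.random.default_rng(0)
LINKS = np.array([0.4, 0.3, 0.2])  # assumed link lengths (m)

def forward_kinematics(q):
    """End-effector (x, y) of a planar 3-link arm for joint angles q."""
    angles = np.cumsum(q)
    return np.array([np.sum(LINKS * np.cos(angles)),
                     np.sum(LINKS * np.sin(angles))])

# SOM lattice: every node stores a sensory prototype (end-effector position)
# and the motor vector (joint angles) that produced it.
GRID = 20
sens = rng.uniform(-0.9, 0.9, size=(GRID, GRID, 2))      # sensory weights
mot = rng.uniform(-np.pi, np.pi, size=(GRID, GRID, 3))   # motor weights
coords = np.stack(np.meshgrid(np.arange(GRID), np.arange(GRID),
                              indexing="ij"), axis=-1)

def train(n_steps=20000, lr0=0.5, sigma0=GRID / 2):
    """Motor babbling: random movements train both maps jointly."""
    for t in range(n_steps):
        q = rng.uniform(-np.pi, np.pi, size=3)   # endogenous movement
        p = forward_kinematics(q)                # its sensory consequence
        d = np.linalg.norm(sens - p, axis=-1)
        winner = np.unravel_index(np.argmin(d), d.shape)
        frac = t / n_steps
        lr = lr0 * (0.01 / lr0) ** frac          # decaying learning rate
        sigma = sigma0 * (1.0 / sigma0) ** frac  # shrinking neighbourhood
        h = np.exp(-np.sum((coords - np.array(winner)) ** 2, axis=-1)
                   / (2 * sigma ** 2))
        sens += (lr * h)[..., None] * (p - sens)  # update sensory prototypes
        mot += (lr * h)[..., None] * (q - mot)    # update associated motor data

def reach(target_xy):
    """Return the joint angles stored at the node closest to the target."""
    d = np.linalg.norm(sens - np.asarray(target_xy), axis=-1)
    winner = np.unravel_index(np.argmin(d), d.shape)
    return mot[winner]

if __name__ == "__main__":
    train()
    target = np.array([0.5, 0.3])
    q = reach(target)
    print("target:", target, "reached:", forward_kinematics(q))
```

The sketch captures the core principle: the map is built purely from observed motor-sensory pairs, so a target position can be turned into a motor command by lookup, without ever writing down or inverting the kinematic equations. It does not reproduce the paper's handling of redundancy constraints such as clamped joints or variable-length tools.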