Embodied tactile perception and learning
Huaping Liu, Di Guo, F. Sun, Wuqiang Yang, S. Furber, Teng Sun
Brain Science Advances, June 2020. DOI: 10.26599/BSA.2020.9050012
Abstract
Various living creatures exhibit embodied intelligence, which arises from the collaborative interaction of brain, body, and environment. The behavior of embodied intelligence is generated by continuous, dynamic interaction between an agent and its environment through information perception and physical manipulation. The physical interaction between a robot and its environment is thus the basis for realizing embodied perception and learning. Tactile information plays a critical role in this physical interaction: it can be used to ensure safety, stability, and compliance, and it provides unique information that is difficult to capture with other perception modalities. However, owing to the limitations of existing sensors and of current perception and learning methods, robotic tactile research lags significantly behind other sensing modalities, such as vision and hearing, which seriously restricts the development of robotic embodied intelligence. This paper presents the current challenges in robotic embodied tactile intelligence and reviews the relevant theory and methods. Tactile perception and learning methods for embodied intelligence can be designed on the basis of newly developed large-scale tactile array sensing devices, with the aim of achieving breakthroughs in neuromorphic computing technology for tactile intelligence.