Simultaneous In-Hand Shape and Temperature Recognition Using Flexible Multilayered Sensor Arrays for Sense-Based Robot Manipulation

Seong-Min Im, Byeong-Sun Park, Jaehwan Jang, Sungeun Hong, Changjoo Nam, Young Tack Lee, Min-gu Kim

Advanced Sensor Research, vol. 4, no. 7, published 2025-05-12
DOI: 10.1002/adsr.70004 — https://onlinelibrary.wiley.com/doi/10.1002/adsr.70004
Open-access PDF: https://onlinelibrary.wiley.com/doi/epdf/10.1002/adsr.70004
Citations: 0
Abstract
Artificial tactile systems play a pivotal role in advancing human-machine interaction technology by enabling precise physical interaction with objects and environments. Tactile information, such as pressure and temperature, allows robots to manipulate objects accurately and interact safely with humans. To facilitate this, a robotic skin integrating flexible pressure and temperature sensor arrays has been developed. The capacitive pressure sensor, inspired by human skin and utilizing a micro-dome structure, demonstrates fast, stable, and sensitive performance under applied pressure. In addition, the resistive temperature sensor, based on reduced graphene oxide, exhibits highly sensitive responses to temperature changes, characterized by rapid and linear behavior. These sensors are vertically integrated into a multilayered system capable of simultaneously detecting real-time pressure and temperature distributions. This integrated sensor system, when incorporated into a robotic gripper, enables accurate identification of object shapes and surface temperatures during manipulation tasks. By pairing the sensor system with a camera that captures macroscopic visual information, including areas not directly visible, robots achieve enhanced manipulation capabilities through the synergy of visual context and detailed tactile input. This development represents a fundamental technology for multimodal tactile recognition and highlights its potential applications in artificial intelligence-driven visual-tactile fusion technologies.
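To illustrate the readout pipeline the abstract describes — a capacitive pressure layer stacked over a linear resistive (rGO) temperature layer, fused into a contact shape and a surface temperature — here is a minimal sketch. All calibration constants (baseline capacitance, pressure gain, reference resistance, temperature coefficient) are hypothetical placeholders, not values from the paper; the linear R(T) model merely reflects the "rapid and linear" behavior the abstract claims.

```python
import numpy as np

# Hypothetical calibration constants (illustrative only, not from the paper).
C0 = 1.0              # baseline taxel capacitance, pF
PRESSURE_GAIN = 0.5   # assumed capacitance change per unit pressure, pF/kPa
R0 = 10_000.0         # assumed rGO resistance at the reference temperature, ohm
T_REF = 25.0          # reference temperature, deg C
TCR = -0.002          # assumed linear temperature coefficient of resistance, 1/degC

def pressure_map(capacitance, c0=C0, gain=PRESSURE_GAIN):
    """Convert a capacitance frame (pF) from the top layer to pressure (kPa)."""
    return np.maximum(capacitance - c0, 0.0) / gain

def temperature_map(resistance, r0=R0, t_ref=T_REF, tcr=TCR):
    """Invert the assumed linear model R(T) = R0 * (1 + TCR * (T - T_ref))."""
    return t_ref + (resistance / r0 - 1.0) / tcr

def contact_shape(pressure, threshold=1.0):
    """Binary contact footprint: taxels pressed above a threshold (kPa)."""
    return pressure > threshold

# Example 4x4 frames from the two stacked layers: a 2x2 region of a
# 40 degC object pressed against the array.
cap = np.full((4, 4), C0)
cap[1:3, 1:3] += 2.0                                     # pressed taxels
res = np.full((4, 4), R0 * (1 + TCR * (40.0 - T_REF)))   # warm surface

p = pressure_map(cap)
mask = contact_shape(p)                  # recovered contact shape
t_surface = temperature_map(res)[mask].mean()  # temperature at the contact
```

The point of the sketch is the fusion step: the pressure layer supplies the spatial mask (shape), and that mask selects which temperature-layer taxels actually touch the object, so the two modalities are read out from the same contact event simultaneously.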