Reconstructing Soft Robotic Touch via In-Finger Vision

Ning Guo, Xudong Han, Shuqiao Zhong, Zhiyuan Zhou, Jian Lin, Fang Wan, Chaoyang Song

Advanced Intelligent Systems, vol. 6, no. 10, published 2024-10-15. DOI: 10.1002/aisy.202470045. Open-access PDF: https://onlinelibrary.wiley.com/doi/epdf/10.1002/aisy.202470045
The research by Fang Wan, Chaoyang Song, and co-workers (see article number 2400022) introduces a vision-based approach for learning proprioceptive interactions using Soft Robotic Metamaterials (SRMs). By reconstructing shape and touch during physical engagement, the authors achieve real-time, precise estimation of soft-finger mesh deformation in virtual environments. This innovation improves adaptability in 3D interactions and suggests promising applications in human–robot collaboration and touch-based digital-twin interactions, bridging the gap between the physical and virtual worlds via multi-modal soft touch.
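To make the idea concrete, the sketch below illustrates the input/output structure of such a vision-to-deformation pipeline: an in-finger camera frame is mapped to per-vertex displacements of a soft-finger surface mesh. This is a minimal illustration only; the image resolution, mesh size, and the fixed random linear map are all hypothetical stand-ins for the learned model described in the article.

```python
import numpy as np

# Hypothetical dimensions: a low-resolution in-finger camera image and a
# coarse soft-finger surface mesh (not taken from the article).
IMG_H, IMG_W = 32, 32          # grayscale camera frame
N_VERTICES = 50                # mesh vertices to track

rng = np.random.default_rng(0)

# Stand-in for a learned regressor: in the article the mapping from in-finger
# images to mesh deformation is learned from data; here a fixed random linear
# map only illustrates the shapes flowing through the pipeline.
W = rng.normal(scale=1e-3, size=(N_VERTICES * 3, IMG_H * IMG_W))

def estimate_deformation(frame: np.ndarray, rest_mesh: np.ndarray) -> np.ndarray:
    """Predict deformed vertex positions from one in-finger camera frame.

    frame:     (IMG_H, IMG_W) grayscale image of the finger's interior.
    rest_mesh: (N_VERTICES, 3) vertex positions of the undeformed finger.
    Returns the estimated deformed vertex positions, same shape as rest_mesh.
    """
    offsets = (W @ frame.ravel()).reshape(N_VERTICES, 3)
    return rest_mesh + offsets

# Example usage with synthetic data.
rest = rng.uniform(size=(N_VERTICES, 3))   # rest-state mesh vertices
frame = rng.uniform(size=(IMG_H, IMG_W))   # one synthetic camera frame
deformed = estimate_deformation(frame, rest)
print(deformed.shape)  # (50, 3)
```

In the real system the linear map would be replaced by a trained network, and the predicted mesh would drive a digital-twin rendering of the finger in real time.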