IEEE Transactions on Medical Robotics and Bionics: Latest Publications

A Lightweight Powered Elbow Exoskeleton for Manual Handling Tasks
IF 3.4
IEEE Transactions on Medical Robotics and Bionics | Pub Date: 2024-09-20 | DOI: 10.1109/TMRB.2024.3464690
Daniel Colley;Collin D. Bowersock;Zachary F. Lerner
{"title":"A Lightweight Powered Elbow Exoskeleton for Manual Handling Tasks","authors":"Daniel Colley;Collin D. Bowersock;Zachary F. Lerner","doi":"10.1109/TMRB.2024.3464690","DOIUrl":"https://doi.org/10.1109/TMRB.2024.3464690","url":null,"abstract":"This study introduces a novel lightweight elbow joint exoskeleton designed to enhance the safety and efficiency of industrial workers engaged in manual handling tasks. Our design leveraged a Bowden cable transmission system and a practical control strategy utilizing instrumented gloves to deliver reactive bi-directional support for dynamic box lifting and pressing activities. The primary focus of this work was to (1) to present an engineering validation analysis and (2) assess the exoskeleton’s impact on reducing muscle activity, increasing endurance, and maintaining overall user comfort during upper-extremity lifting or carrying tasks. We observed significant and consistent reductions in muscle activity and an increase in endurance (e.g., 2.4x more repetitions) during box lifting tasks, without compromising user comfort. These findings provide promising evidence of the exoskeleton’s effectiveness and represent a crucial first step working towards demonstrating efficacy in real-world workplace environments.","PeriodicalId":73318,"journal":{"name":"IEEE transactions on medical robotics and bionics","volume":"6 4","pages":"1627-1636"},"PeriodicalIF":3.4,"publicationDate":"2024-09-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142600297","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
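The abstract above describes a glove-triggered, reactive bi-directional assistance scheme but does not spell out the control law. The sketch below is a minimal, hypothetical version of such a law; the function name, gains, saturation limit, and the normalized glove signal are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch of a reactive assistance law for a cable-driven elbow
# exoskeleton, loosely following the abstract above. Gains, limits, and the
# glove signal definition are assumptions, not the paper's controller.

def assist_torque(grip_signal: float, elbow_velocity: float,
                  k_lift: float = 4.0, k_lower: float = 2.0,
                  torque_limit: float = 8.0) -> float:
    """Return an elbow assistance torque in N*m.

    grip_signal    -- normalized load estimate from the instrumented glove [0, 1]
    elbow_velocity -- elbow angular velocity in rad/s (+ flexion, - extension)
    """
    # Bi-directional support: assist flexion while lifting, brake while lowering.
    if elbow_velocity >= 0.0:
        torque = k_lift * grip_signal
    else:
        torque = -k_lower * grip_signal
    # Saturate to respect actuator and safety limits.
    return max(-torque_limit, min(torque_limit, torque))

print(assist_torque(0.6, 1.2))   # lifting a load -> positive (flexion) assistance
print(assist_torque(0.6, -0.8))  # lowering -> negative (eccentric) support
```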
SlicerROS2: A Research and Development Module for Image-Guided Robotic Interventions
IF 3.4
IEEE Transactions on Medical Robotics and Bionics | Pub Date: 2024-09-20 | DOI: 10.1109/TMRB.2024.3464683
Laura Connolly;Aravind S. Kumar;Kapi Ketan Mehta;Lidia Al-Zogbi;Peter Kazanzides;Parvin Mousavi;Gabor Fichtinger;Axel Krieger;Junichi Tokuda;Russell H. Taylor;Simon Leonard;Anton Deguet
{"title":"SlicerROS2: A Research and Development Module for Image-Guided Robotic Interventions","authors":"Laura Connolly;Aravind S. Kumar;Kapi Ketan Mehta;Lidia Al-Zogbi;Peter Kazanzides;Parvin Mousavi;Gabor Fichtinger;Axel Krieger;Junichi Tokuda;Russell H. Taylor;Simon Leonard;Anton Deguet","doi":"10.1109/TMRB.2024.3464683","DOIUrl":"https://doi.org/10.1109/TMRB.2024.3464683","url":null,"abstract":"Image-guided robotic interventions involve the use of medical imaging in tandem with robotics. SlicerROS2 is a software module that combines 3D Slicer and robot operating system (ROS) in pursuit of a standard integration approach for medical robotics research. The first release of SlicerROS2 demonstrated the feasibility of using the C++ API from 3D Slicer and ROS to load and visualize robots in real time. Since this initial release, we’ve rewritten and redesigned the module to offer greater modularity, access to low-level features, access to 3D Slicer’s Python API, and better data transfer protocols. In this paper, we introduce this new design as well as four applications that leverage the core functionalities of SlicerROS2 in realistic image-guided robotics scenarios.","PeriodicalId":73318,"journal":{"name":"IEEE transactions on medical robotics and bionics","volume":"6 4","pages":"1334-1344"},"PeriodicalIF":3.4,"publicationDate":"2024-09-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142600327","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
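SlicerROS2 bridges 3D Slicer and ROS; the sketch below shows only the generic ROS 2 (rclpy) pattern such a bridge builds on, namely subscribing to a robot's /joint_states so a visualization layer can mirror its motion. This is plain rclpy, not the SlicerROS2 API, and it assumes a ROS 2 installation with a robot publishing joint states.

```python
# Minimal rclpy sketch of the kind of ROS 2 topic exchange a visualization
# module such as SlicerROS2 builds on: listening to a robot's joint states.
# This is NOT the SlicerROS2 API; topic name and usage are generic ROS 2.

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import JointState


class JointStateListener(Node):
    def __init__(self):
        super().__init__("joint_state_listener")
        self.create_subscription(JointState, "/joint_states", self.on_msg, 10)

    def on_msg(self, msg: JointState):
        # A visualization layer would forward these values to its 3D scene.
        self.get_logger().info(f"{list(msg.name)} -> {list(msg.position)}")


def main():
    rclpy.init()
    rclpy.spin(JointStateListener())
    rclpy.shutdown()


if __name__ == "__main__":
    main()
```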
Fast OCT-Based Needle Tracking for Retinal Microsurgery Using Dynamic Spiral Scanning
IF 3.4
IEEE Transactions on Medical Robotics and Bionics | Pub Date: 2024-09-20 | DOI: 10.1109/TMRB.2024.3464693
Pengwei Xu;Mouloud Ourak;Gianni Borghesan;Emmanuel Vander Poorten
{"title":"Fast OCT-Based Needle Tracking for Retinal Microsurgery Using Dynamic Spiral Scanning","authors":"Pengwei Xu;Mouloud Ourak;Gianni Borghesan;Emmanuel Vander Poorten","doi":"10.1109/TMRB.2024.3464693","DOIUrl":"https://doi.org/10.1109/TMRB.2024.3464693","url":null,"abstract":"Retinal microsurgery is crucial for treating various ocular diseases, but challenging due to the structure size, physiological tremor and limited depth perception. This study aims to develop an innovative real-time needle tracking system that utilizes only a small amount of Optical Coherence Tomography (OCT) A-scans. We introduce a spiral scanning pattern, that is dynamically updated to efficiently capture the needle tip and the retina area with 2000 A-scans. An imaging pipeline is proposed that initiates with an initial Region of Interest (ROI) identification, followed by image segmentation, 3D reconstruction, and needle pose estimation. The ROI is dynamically adjusted to keep the needle tip centrally within the spiral scan, facilitating tracking at clinically relevant speeds. Preliminary testing on phantom eye models demonstrated that our system can maintain an average tracking error of 0.04 mm in spatial coordinates and an error of 0.06 mm in estimating the distance between the needle tip and the retina. These results suggest the system’s potential to enhance surgical outcomes by providing surgeons with improved depth perception and precise, real-time feedback. By efficiently utilizing spirally sampled OCT data, this system sets the groundwork for future integrations of real-time 4D imaging and physiological motion detection capabilities.","PeriodicalId":73318,"journal":{"name":"IEEE transactions on medical robotics and bionics","volume":"6 4","pages":"1502-1511"},"PeriodicalIF":3.4,"publicationDate":"2024-09-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142600318","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
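To illustrate the dynamically re-centered spiral sampling described in the abstract above, the sketch below generates 2000 lateral A-scan positions around the latest needle-tip estimate. An Archimedean (linear-radius) spiral, the maximum radius, and the number of turns are assumptions; the abstract specifies only the A-scan count, not the exact scan law.

```python
# Sketch of a dynamically re-centered spiral A-scan pattern. The spiral law,
# radius, and turn count are assumptions; only the 2000-A-scan budget comes
# from the abstract.

import numpy as np

def spiral_scan(center_xy, n_ascans=2000, max_radius_mm=1.0, turns=10):
    """Return an (n_ascans, 2) array of lateral scan positions in mm."""
    t = np.linspace(0.0, 1.0, n_ascans)
    theta = 2.0 * np.pi * turns * t
    r = max_radius_mm * t              # radius grows linearly with parameter t
    x = center_xy[0] + r * np.cos(theta)
    y = center_xy[1] + r * np.sin(theta)
    return np.stack([x, y], axis=1)

# Re-center the next spiral on the latest needle-tip estimate (tracking loop).
tip_estimate = np.array([0.25, -0.10])          # mm, from the previous frame
pattern = spiral_scan(tip_estimate)
print(pattern.shape)                            # (2000, 2)
```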
Advancements and Challenges in the Development of Robotic Lower Limb Prostheses: A Systematic Review
IF 3.4
IEEE Transactions on Medical Robotics and Bionics | Pub Date: 2024-09-19 | DOI: 10.1109/TMRB.2024.3464126
Ilaria Fagioli;Alessandro Mazzarini;Chiara Livolsi;Emanuele Gruppioni;Nicola Vitiello;Simona Crea;Emilio Trigili
{"title":"Advancements and Challenges in the Development of Robotic Lower Limb Prostheses: A Systematic Review","authors":"Ilaria Fagioli;Alessandro Mazzarini;Chiara Livolsi;Emanuele Gruppioni;Nicola Vitiello;Simona Crea;Emilio Trigili","doi":"10.1109/TMRB.2024.3464126","DOIUrl":"https://doi.org/10.1109/TMRB.2024.3464126","url":null,"abstract":"Lower limb prosthetics, essential for restoring mobility in individuals with limb loss, have witnessed significant advancements in recent years. This systematic review reports the recent research advancements in the field of semi-active and active lower limb prostheses. The review focuses on the mechatronic features of the devices, the sensing and control strategies, and the performance verification with end-users. A total of 53 prosthetic prototypes were identified and analyzed, including 16 knee-ankle prostheses, 18 knee prostheses, and 19 ankle prostheses. The review highlights some of the open challenges in the field of prosthetic research.","PeriodicalId":73318,"journal":{"name":"IEEE transactions on medical robotics and bionics","volume":"6 4","pages":"1409-1422"},"PeriodicalIF":3.4,"publicationDate":"2024-09-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10684266","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142600326","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
A Scorpion-Inspired 5-DOF Miniature Remote Actuation Robotic Endoscope for Minimally Invasive Surgery
IF 3.4
IEEE Transactions on Medical Robotics and Bionics | Pub Date: 2024-09-19 | DOI: 10.1109/TMRB.2024.3464114
Jixiu Li;Truman Cheng;Wai Shing Chan;Zixiao Chen;Yehui Li;Calvin Sze Hang Ng;Philip Wai Yan Chiu;Zheng Li
{"title":"A Scorpion-Inspired 5-DOF Miniature Remote Actuation Robotic Endoscope for Minimally Invasive Surgery","authors":"Jixiu Li;Truman Cheng;Wai Shing Chan;Zixiao Chen;Yehui Li;Calvin Sze Hang Ng;Philip Wai Yan Chiu;Zheng Li","doi":"10.1109/TMRB.2024.3464114","DOIUrl":"https://doi.org/10.1109/TMRB.2024.3464114","url":null,"abstract":"Remote Actuation Mechanisms (RAMs) play a vital role in minimally invasive surgery (MIS) by providing motion capabilities within limited spaces. This paper first focused on analyzing commonly employed RAMs to understand their strengths and limitations. Then, drawing inspiration from bionics and the biological structure of scorpions, we proposed a novel approach by integrating three RAMs-a magnet pair, a torque coil, and a soft bellow-to create a 5-degree-of-freedom (5-DOF) miniature remote actuation robot. In the design phase, we established the robot’s parameters using the magnetic dipole model and related constraints. A functional prototype of the robot, along with an external controller and user interface, was fabricated and assembled. Experimental investigations demonstrated motion performance across the 5 DOF, validating the robot’s feasibility. To assess the practicality of the system, the interaction interface was evaluated under controlled laboratory conditions and through a cadaver test. In conclusion, our innovative approach combines multiple RAMs into a 5-DOF remote actuation robot. Comprehensive tests validated its motion capabilities and highlighted its potential to advance MIS procedures.","PeriodicalId":73318,"journal":{"name":"IEEE transactions on medical robotics and bionics","volume":"6 4","pages":"1748-1759"},"PeriodicalIF":3.4,"publicationDate":"2024-09-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142600312","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
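The design phase described above relies on the magnetic dipole model. The sketch below evaluates the textbook point-dipole field expression B(r) = (mu0 / 4*pi) * (3 (m·r̂) r̂ - m) / |r|^3; the magnet moment and offset are illustrative values, not the paper's design parameters.

```python
# Point-dipole field sketch of the kind used when sizing magnet pairs for
# remote actuation. The moment and separation below are illustrative only.

import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability, T*m/A

def dipole_field(m, r):
    """Magnetic flux density (T) of a point dipole with moment m (A*m^2) at offset r (m)."""
    r = np.asarray(r, dtype=float)
    m = np.asarray(m, dtype=float)
    r_norm = np.linalg.norm(r)
    r_hat = r / r_norm
    return MU0 / (4.0 * np.pi) * (3.0 * np.dot(m, r_hat) * r_hat - m) / r_norm**3

# Field 20 mm along the axis of a small magnet with a 0.1 A*m^2 moment.
print(dipole_field([0.0, 0.0, 0.1], [0.0, 0.0, 0.02]))
```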
Hands Collaboration Evaluation for Surgical Skills Assessment: An Information Theoretical Approach
IF 3.4
IEEE Transactions on Medical Robotics and Bionics | Pub Date: 2024-09-19 | DOI: 10.1109/TMRB.2024.3464110
Abed Soleymani;Mahdi Tavakoli;Farzad Aghazadeh;Yafei Ou;Hossein Rouhani;Bin Zheng;Xingyu Li
{"title":"Hands Collaboration Evaluation for Surgical Skills Assessment: An Information Theoretical Approach","authors":"Abed Soleymani;Mahdi Tavakoli;Farzad Aghazadeh;Yafei Ou;Hossein Rouhani;Bin Zheng;Xingyu Li","doi":"10.1109/TMRB.2024.3464110","DOIUrl":"https://doi.org/10.1109/TMRB.2024.3464110","url":null,"abstract":"Bimanual tasks, where the brain must simultaneously control and plan the movements of both hands, such as needle passing and tissue cutting, commonly exist in surgeries, e.g., robot-assisted minimally invasive surgery. In this study, we present a novel approach for quantifying the quality of hands coordination and correspondence in bimanual tasks by utilizing information theory concepts to build a mathematical framework for measuring the collaboration strength between the two hands. The introduced method makes no assumption about the dynamics and couplings within the robotic platform, executive task, or human motor control. We implemented the proposed approach on MEELS and JIGSAWS datasets, corresponding to conventional minimally invasive surgery (MIS) and robot-assisted MIS, respectively. We analyzed the advantages of hands collaboration features in the skills assessment and style recognition of robotic surgery tasks. Furthermore, we demonstrated that incorporating intuitive domain knowledge of bimanual tasks potentially paves the way for other complex applications, including, but not limited to, autonomous surgery with a high level of model explainability and interpretability. Finally, we presented preliminary results to argue that incorporating hands collaboration features in deep learning-based classifiers reduces uncertainty, improves accuracy, and enhances the out-of-distribution robustness of the final model.","PeriodicalId":73318,"journal":{"name":"IEEE transactions on medical robotics and bionics","volume":"6 4","pages":"1490-1501"},"PeriodicalIF":3.4,"publicationDate":"2024-09-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142600319","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
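As one concrete instance of the information-theoretic collaboration measures mentioned in the abstract above, the sketch below estimates mutual information between left- and right-hand speed signals with a joint histogram. The binning, the synthetic signals, and the histogram estimator itself are assumptions; the paper's actual estimator and features may differ.

```python
# Histogram-based mutual information between two hand-motion signals, as a
# simple stand-in for the paper's collaboration-strength measure.

import numpy as np

def mutual_information(x, y, bins=32):
    """Estimate I(X;Y) in bits from two 1-D signals using a joint histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
left = rng.normal(size=5000)
right = 0.7 * left + 0.3 * rng.normal(size=5000)   # partially coordinated hands
print(mutual_information(left, right))
```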
A Reinforcement Learning Approach for Real-Time Articulated Surgical Instrument 3-D Pose Reconstruction
IF 3.4
IEEE Transactions on Medical Robotics and Bionics | Pub Date: 2024-09-19 | DOI: 10.1109/TMRB.2024.3464089
Ke Fan;Ziyang Chen;Qiaoling Liu;Giancarlo Ferrigno;Elena De Momi
{"title":"A Reinforcement Learning Approach for Real-Time Articulated Surgical Instrument 3-D Pose Reconstruction","authors":"Ke Fan;Ziyang Chen;Qiaoling Liu;Giancarlo Ferrigno;Elena De Momi","doi":"10.1109/TMRB.2024.3464089","DOIUrl":"https://doi.org/10.1109/TMRB.2024.3464089","url":null,"abstract":"3D pose reconstruction of surgical instruments from images stands as a critical component in environment perception within robotic minimally invasive surgery (RMIS). The current deep learning methods rely on complex networks to enhance accuracy, making real-time implementation difficult. Moreover, diverging from a singular rigid body, surgical instruments exhibit an articulation structure, making the annotation of 3D poses more challenging. In this paper, we present a novel approach to formulate the 3D pose reconstruction of articulated surgical instruments as a Markov Decision Process (MDP). A Reinforcement Learning (RL) agent employs 2D image labels to control a virtual articulated skeleton to reproduce the 3D pose of the real surgical instrument. Firstly, a convolutional neural network is used to estimate the 2D pixel positions of joint nodes of the surgical instrument skeleton. Subsequently, the agent controls the 3D virtual articulated skeleton to align its joint nodes’ projections on the image plane with those in the real image. Validation of our proposed method is conducted using a semi-synthetic dataset with precise 3D pose labels and two real datasets, demonstrating the accuracy and efficacy of our approach. The results indicate the potential of our method in achieving real-time 3D pose reconstruction for articulated surgical instruments in the context of RMIS, addressing the challenges posed by low-texture surfaces and articulated structures.","PeriodicalId":73318,"journal":{"name":"IEEE transactions on medical robotics and bionics","volume":"6 4","pages":"1458-1467"},"PeriodicalIF":3.4,"publicationDate":"2024-09-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142600437","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
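The MDP described above rewards the agent for aligning the virtual skeleton's projected joint nodes with the 2D detections. Below is a minimal sketch of such an alignment signal, assuming a pinhole camera and a negative mean reprojection-error reward; the intrinsics, joint positions, and reward shaping are illustrative, not taken from the paper.

```python
# Sketch of an alignment signal an RL agent could maximize: negative mean
# reprojection error between the virtual skeleton's projected joints and the
# 2D joints detected in the image. Camera and values are illustrative.

import numpy as np

def project(points_3d, K):
    """Project Nx3 camera-frame points with intrinsics K (3x3) to Nx2 pixels."""
    p = (K @ points_3d.T).T
    return p[:, :2] / p[:, 2:3]

def alignment_reward(joints_3d, joints_2d_detected, K):
    """Higher is better: negative mean pixel error between projections and detections."""
    err = np.linalg.norm(project(joints_3d, K) - joints_2d_detected, axis=1)
    return -float(err.mean())

K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
skeleton = np.array([[0.00, 0.00, 0.10], [0.01, 0.00, 0.11], [0.02, 0.01, 0.12]])
detections = project(skeleton, K) + 1.5   # detections offset by a 1.5 px error
print(alignment_reward(skeleton, detections, K))
```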
Development of a High-Precision and Large-Range FBG-Based Sensor Inspired by a Crank-Slider Mechanism for Wearable Measurement of Human Knee Joint Angles
IF 3.4
IEEE Transactions on Medical Robotics and Bionics | Pub Date: 2024-09-19 | DOI: 10.1109/TMRB.2024.3464096
Kaifeng Wang;Aofei Tian;Yupeng Hao;Chengzhi Hu;Chaoyang Shi
{"title":"Development of a High-Precision and Large-Range FBG-Based Sensor Inspired by a Crank-Slider Mechanism for Wearable Measurement of Human Knee Joint Angles","authors":"Kaifeng Wang;Aofei Tian;Yupeng Hao;Chengzhi Hu;Chaoyang Shi","doi":"10.1109/TMRB.2024.3464096","DOIUrl":"https://doi.org/10.1109/TMRB.2024.3464096","url":null,"abstract":"This article proposes a fiber Bragg grating (FBG) based angle sensor with an extensive measurement range and high precision for human knee joint measurement. The proposed sensor mainly comprises an angle-linear displacement conversion cam, a crank-slider mechanism-inspired conversion flexure, an optical fiber embedded with an FBG element, and a sensor package. The cam transforms the wide-range knee angle input into vertical linear displacement output. The conversion flexure further converts such vertical displacement into a reduced horizontal displacement/stretching applied to the optical fiber with a motion scale ratio of 6:1. The flexure design features a symmetrical structure to improve stability and depress hysteresis. The fiber is suspended on the flexure’s output beams with a two-point pasting configuration. Both theory analysis and finite element method (FEM)-based simulations revealed the linear relationship between the input angle and the fiber strain. Static and dynamic experiments have verified the performance of the proposed sensor, demonstrating a sensitivity of 62.03 pm/° with a small linearity error of 1.36% within [0, 140°]. The root mean square errors (RMSE) were 0.72° and 0.84° for angle velocities of 80°/s and 350°/s, respectively. Wearable experiments during sitting and walking have been performed to validate the effectiveness of the proposed sensor.","PeriodicalId":73318,"journal":{"name":"IEEE transactions on medical robotics and bionics","volume":"6 4","pages":"1688-1698"},"PeriodicalIF":3.4,"publicationDate":"2024-09-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142600324","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
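Given the reported sensitivity of 62.03 pm/° and the linear response over [0, 140°], converting a measured Bragg-wavelength shift to a knee angle is a one-line calculation, sketched below; how the zero-angle offset is handled here is an assumption, not detailed in the abstract.

```python
# Worked sketch of the reported calibration: 62.03 pm of wavelength shift per
# degree of knee flexion, linear over [0, 140] degrees.

SENSITIVITY_PM_PER_DEG = 62.03

def knee_angle_deg(wavelength_shift_pm: float) -> float:
    """Convert an FBG wavelength shift (pm, relative to the 0-degree pose) into degrees."""
    return wavelength_shift_pm / SENSITIVITY_PM_PER_DEG

print(knee_angle_deg(3101.5))   # about 50 degrees of flexion
```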
Robot-Assisted Reduction of the Ankle Joint via Multi-Body 3D–2D Image Registration
IF 3.4
IEEE Transactions on Medical Robotics and Bionics | Pub Date: 2024-09-19 | DOI: 10.1109/TMRB.2024.3464095
R. C. Vijayan;N. M. Sheth;J. Wei;K. Venkataraman;D. Ghanem;B. Shafiq;J. H. Siewerdsen;W. Zbijewski;G. Li;K. Cleary;A. Uneri
{"title":"Robot-Assisted Reduction of the Ankle Joint via Multi-Body 3D–2D Image Registration","authors":"R. C. Vijayan;N. M. Sheth;J. Wei;K. Venkataraman;D. Ghanem;B. Shafiq;J. H. Siewerdsen;W. Zbijewski;G. Li;K. Cleary;A. Uneri","doi":"10.1109/TMRB.2024.3464095","DOIUrl":"https://doi.org/10.1109/TMRB.2024.3464095","url":null,"abstract":"Robot-assisted orthopaedic joint reduction offers enhanced precision and control across multiple axes of motion, enabling precise realignment according to predefined plans. However, the high levels of forces encountered may induce unintended anatomical motion and flex mechanical components. To address this, this work presents an approach that uses 2D fluoroscopic imaging to verify and readjust the 3D reduction path by tracking deviations from the planned trajectory. The proposed method involves a 3D-2D registration algorithm using a pair of fluoroscopic images, along with prior models of each body in the radiographic scene. This objective is formulated to couple and constrain multiple object poses (fibula, tibia, talus, and robot end effector), and incorporate novel methods for automatic view and hyperparameter selection to improve robustness. The algorithms were refined through cadaver studies and evaluated in a preclinical trial, employing a robotic system to manipulate a dislocated fibula. Studies with cadaveric specimens highlighted the joint-specific formulation’s high registration accuracy (\u0000<inline-formula> <tex-math>$Delta _{x} {=} 0.3~pm ~1$ </tex-math></inline-formula>\u0000.5 mm), further improved with the use of automatic view and hyperparameter selection (\u0000<inline-formula> <tex-math>$Delta _{x} {=} 0.2~pm ~0$ </tex-math></inline-formula>\u0000.8 mm). Preclinical studies demonstrated a high deviation between the intended and the actual path of the robotic system, which was accurately captured (\u0000<inline-formula> <tex-math>$Delta _{x}$ </tex-math></inline-formula>\u0000 1 mm) using the proposed techniques. The solution offers to close the loop on image-based guidance of robot-assisted joint reduction by tracking the robot and bones to dynamically correct the course. The approach uses standard clinical images and is expected to lower radiation exposure by providing 3D information and allowing the staff to stay clear of the x-ray beam.","PeriodicalId":73318,"journal":{"name":"IEEE transactions on medical robotics and bionics","volume":"6 4","pages":"1591-1602"},"PeriodicalIF":3.4,"publicationDate":"2024-09-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142600316","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
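The registration above couples several bodies (fibula, tibia, talus, and robot end effector) in a single objective evaluated against the fluoroscopic views. The sketch below conveys only that coupling idea, using a feature-point reprojection cost and a linearized pose update; the paper's actual similarity metric and solver are not given in the abstract, so every function and value here is a simplified stand-in.

```python
# Schematic sketch of a coupled multi-body 3D-2D objective: one pose per body,
# one summed score over all bodies, so poses are estimated jointly. Point-based
# reprojection error is a stand-in for the paper's image-based metric.

import numpy as np

def se3_apply(pose, pts):
    """Apply a small-angle pose (rx, ry, rz, tx, ty, tz) to Nx3 points (linearized rotation)."""
    rx, ry, rz, tx, ty, tz = pose
    R = np.array([[1.0, -rz, ry], [rz, 1.0, -rx], [-ry, rx, 1.0]])
    return pts @ R.T + np.array([tx, ty, tz])

def project(pts, K):
    """Pinhole projection of Nx3 camera-frame points to Nx2 pixels."""
    p = (K @ pts.T).T
    return p[:, :2] / p[:, 2:3]

def coupled_cost(poses, bodies, detections, K):
    """Sum of per-body reprojection errors; minimizing it couples all poses."""
    return sum(
        np.linalg.norm(project(se3_apply(pose, pts), K) - det, axis=1).mean()
        for pose, pts, det in zip(poses, bodies, detections)
    )

K = np.array([[1000.0, 0.0, 256.0], [0.0, 1000.0, 256.0], [0.0, 0.0, 1.0]])
fibula = np.array([[0.00, 0.01, 0.30], [0.01, 0.02, 0.31]])
talus = np.array([[-0.01, 0.00, 0.29], [0.00, -0.01, 0.30]])
truth = [project(fibula, K), project(talus, K)]
print(coupled_cost([np.zeros(6), np.zeros(6)], [fibula, talus], truth, K))  # 0.0 at truth
```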
Single-Motor Ultraflexible Robotic (SMUFR) Humanoid Hand
IF 3.4
IEEE Transactions on Medical Robotics and Bionics | Pub Date: 2024-09-19 | DOI: 10.1109/TMRB.2024.3464107
Quan Xiong;Dannuo Li;Xuanyi Zhou;Wenci Xin;Chao Wang;Jonathan William Ambrose;Raye Chen-Hua Yeow
{"title":"Single-Motor Ultraflexible Robotic (SMUFR) Humanoid Hand","authors":"Quan Xiong;Dannuo Li;Xuanyi Zhou;Wenci Xin;Chao Wang;Jonathan William Ambrose;Raye Chen-Hua Yeow","doi":"10.1109/TMRB.2024.3464107","DOIUrl":"https://doi.org/10.1109/TMRB.2024.3464107","url":null,"abstract":"Humanoid robotic hands have significant potential in easing human burden and augmenting human labor. This paper introduces the SMUFR hand, a compliant and dexterous robotic humanoid hand powered by tendon-driven mechanisms, and features flexible beam-based bending joints serving as rotary joints with bidirectional bending compliance that ensure safety during human-robot interaction. Despite its light weight of only 363 g without remote transmission and actuation components, the SMUFR hand can grasp and support loads of up to 4.2 kg in various orientations, manipulate objects of different sizes and shapes, and even operate underwater. Of particular note is the SMUFR hand’s lightweight and compact one-to-more actuation system, comprising six rotary pneumatic clutches (RPC) for six active Degrees of Freedom (DoFs), all powered by a single motor. Each RPC, weighing 75 g, can exert up to 23 N force on the tendon. This innovative transmission system distributes the power of a single motor across five fingers and holds potential for configuring additional RPCs. We also integrated all the components on a compact wearable vest for potential mobile humanoid robotic applications. Additionally, a mathematical model was developed to predict tendon force and joint bending using the constant curvature deformation hypothesis. Experimental validation demonstrates the durability of both the RPC and the beam-based fingers of the SMUFR hand, which are capable of enduring up to 22,000 and 30,000 cycles, respectively.","PeriodicalId":73318,"journal":{"name":"IEEE transactions on medical robotics and bionics","volume":"6 4","pages":"1666-1677"},"PeriodicalIF":3.4,"publicationDate":"2024-09-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142600322","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
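The modeling above uses the constant curvature deformation hypothesis to relate tendon motion to joint bending. Under that hypothesis, a tendon routed at a fixed offset d from a joint's neutral axis shortens by d * theta for a bend angle theta, which the sketch below applies with an illustrative offset rather than the SMUFR hand's actual geometry.

```python
# Constant-curvature sketch relating tendon displacement to joint bending for a
# tendon routed at a fixed offset from a flexible beam's neutral axis. The
# offset and displacement below are illustrative, not the paper's values.

import math

def bend_angle_rad(tendon_displacement_m: float, tendon_offset_m: float) -> float:
    """Under constant curvature, tendon shortening d * theta gives theta = dl / d."""
    return tendon_displacement_m / tendon_offset_m

dl = 0.004       # 4 mm of tendon pull
offset = 0.005   # tendon routed 5 mm from the neutral axis
print(math.degrees(bend_angle_rad(dl, offset)))   # ~45.8 degrees of bending
```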