{"title":"A Distraction Knee-Brace and a Robotic Testbed for Tibiofemoral Load Reduction During Squatting","authors":"Léa Boillereaux;Simon Le Floc’h;Franck Jourdan;Gille Camp;Arnaud Tanguy;Abderrahmane Kheddar","doi":"10.1109/TMRB.2025.3550664","DOIUrl":"https://doi.org/10.1109/TMRB.2025.3550664","url":null,"abstract":"We design and evaluate a new knee distraction unloader brace. The proposed device conforms to the nonlinear behavior of the tibiofemoral contact force during squat motions by means of patient-custom cams. Using pneumatic cylinders as springs, the unloading assistance provided by the brace is tailored to the patient’s pathology and adjusted during the rehabilitation process. To assess the performance of our orthosis, various tests are conducted to evaluate its efficiency in terms of tibiofemoral contact load reduction. For this purpose, a robotic test bench, equipped with a robotic arm, emulates upper leg motion under applied forces (hybrid force-motion control). A pseudo-leg is attached to the robot end-effector, and the orthosis is mounted onto it. The test bench is instrumented with two six-degree-of-freedom force-torque sensors. Using these force sensors as ground truth, tibiofemoral contact force measurements are obtained with and without our orthosis and compared. A pair of cams is fabricated based on data from a patient whose information is retrieved from the Orthoload database. 
Experimental results demonstrate a contact force reduction of up to 100% within the force range corresponding to the robot’s maximum capacity.","PeriodicalId":73318,"journal":{"name":"IEEE transactions on medical robotics and bionics","volume":"7 2","pages":"621-632"},"PeriodicalIF":3.4,"publicationDate":"2025-03-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144084781","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Development and Validation of a Robot-Assisted Retraction System for Orthopedic Surgery","authors":"Xiaolong Zhu;Yuzhen Jiang;Rui He;Changsheng Li;Xingguang Duan","doi":"10.1109/TMRB.2025.3550648","DOIUrl":"https://doi.org/10.1109/TMRB.2025.3550648","url":null,"abstract":"Tissue retraction, one of the basic steps in orthopedic surgery, enables the smooth entry of surgical instruments into the surgical area and provides surgeons with a clear surgical view. This work introduces the first robot-assisted retraction system (RARS) specifically designed for orthopedic surgery. The RARS allows surgeons to set a safe retraction force and roughly set the posture of the retraction device at the beginning. To ensure that the RARS automatically completes the task in a safe manner, we propose a safety control framework that employs iterative enhanced control based on an interaction model to handle the retraction interaction problem and avoids interference with the operation of surgeons through null-space optimization. First, we performed a performance test of the RARS on a phantom model, and the results showed that the maximum tracking error of the retraction force was 0.51 N, demonstrating satisfactory tracking performance. Second, we validated the effectiveness of null-space optimization by observing the evolution of joint positions. 
Finally, we conducted in-vivo animal experiments, and the results showed that the proposed RARS outperformed traditional manual retraction in safe retraction force tracking accuracy while causing less tissue damage.","PeriodicalId":73318,"journal":{"name":"IEEE transactions on medical robotics and bionics","volume":"7 2","pages":"492-501"},"PeriodicalIF":3.4,"publicationDate":"2025-03-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144084735","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Development of a Hybrid Measurement System for Surgical Instrument Motion of Laparoscopic Surgery","authors":"Koki Ebina;Takashige Abe;Lingbo Yan;Kiyohiko Hotta;Chihiro Kamijo;Madoka Higuchi;Masafumi Kon;Hiroshi Kikuchi;Haruka Miyata;Ryuji Matsumoto;Takahiro Osawa;Sachiyo Murai;Yo Kurashima;Toshiaki Shichinohe;Masahiko Watanabe;Shunsuke Komizunai;Teppei Tsujita;Kazuya Sase;Xiaoshuai Chen;Taku Senoo;Nobuo Shinohara;Atsushi Konno","doi":"10.1109/TMRB.2025.3550666","DOIUrl":"https://doi.org/10.1109/TMRB.2025.3550666","url":null,"abstract":"Laparoscopic surgery has become a common surgical technique owing to its minimal invasiveness. However, it demands advanced techniques from surgeons, and several studies have evaluated surgical skills through motion measurement to improve proficiency. Existing measurement systems, however, have a low tolerance for occlusion and are difficult to use in operating rooms with many obstacles. Therefore, a hybrid measurement system was developed for laparoscopic surgery. This system consists of an inertial measurement unit (IMU), a distance sensor, and an optical motion capture (MoCap). When MoCap data are unavailable, surgical instrument motion is calculated using the IMU and distance sensor data, and when they are available, the IMU drift is corrected using MoCap data. The MoCap markers were arranged individually, thus facilitating the measurement of multiple instruments simultaneously. A validation experiment in wet-lab training confirmed that the error was smaller than that measured using MoCap alone, and the subjects reported that the disturbance caused by the sensors during the procedure was very small. A measurement experiment was conducted during cadaver surgical training, and 15 cases of nephrectomy were successfully recorded. 
This system facilitated highly accurate measurements during practical surgical training and surgical skills analysis.","PeriodicalId":73318,"journal":{"name":"IEEE transactions on medical robotics and bionics","volume":"7 2","pages":"550-561"},"PeriodicalIF":3.4,"publicationDate":"2025-03-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144084738","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Irrelevant Locomotion Intention Detection for Myoelectric Assistive Lower Limb Robot Control","authors":"Xiaoyu Song;Jiaqing Liu;Heng Pan;Haotian Rao;Can Wang;Xinyu Wu","doi":"10.1109/TMRB.2025.3550736","DOIUrl":"https://doi.org/10.1109/TMRB.2025.3550736","url":null,"abstract":"In this study, we propose a robust myoelectric intention recognition framework to recognize human locomotion mode and detect irrelevant locomotion intention. The framework is integrated into the control system of the lower limb exoskeleton robot for experimental validation. Most conventional electromyography (EMG) intention detection methods aim to accurately detect the target motion intentions but ignore the possible effects of irrelevant intentions. In traditional intention recognition strategies, most researchers did not include irrelevant motion intentions in the training data. Therefore, when an irrelevant motion intention is input to such a classification model, it is still recognized as one of the target motion intentions. This leads to incorrect recognition results, causing the robot to perform wrong movements and posing a safety risk to the wearer. To detect and reject irrelevant motion intentions, we first used the dual-purpose autoencoder-guided temporal convolution network (DA-TCN) to obtain discriminative features of the surface EMG signal. Autoencoders (AE)/Variational Autoencoders (VAE) are then trained for each of the seven deep features of the target motion intention. Irrelevant motion intentions are then detected according to the value of their reconstruction error. The recall rate of this method for the detection of irrelevant motion intentions exceeds 99%, and the accuracy rate exceeds 99%. At the same time, we replaced the TCN with an LSTM model and compared the performance of the two after adding irrelevant motion discrimination. 
We collected data on seven target and three irrelevant motion intentions from seven participants for testing and completed an online experimental validation. The motion recognition accuracy for all participants remained above 86%.","PeriodicalId":73318,"journal":{"name":"IEEE transactions on medical robotics and bionics","volume":"7 2","pages":"655-665"},"PeriodicalIF":3.4,"publicationDate":"2025-03-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144084801","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Toward Lung Ultrasound Automation: Fully Autonomous Robotic Longitudinal and Transverse Scans Along Intercostal Spaces","authors":"Long Lei;Yingbai Hu;Zixing Jiang;Juzheng Miao;Xiao Luo;Yu Zhang;Qiong Wang;Shujun Wang;Zheng Li;Pheng-Ann Heng","doi":"10.1109/TMRB.2025.3550663","DOIUrl":"https://doi.org/10.1109/TMRB.2025.3550663","url":null,"abstract":"Lung ultrasound scanning is essential for diagnosing lung diseases. The scan effectiveness critically depends on both longitudinal and transverse scans through intercostal spaces to reduce rib shadowing interference, as well as on maintaining the probe perpendicular to the pleura for pathological artifact generation. Achieving this level of scan quality often depends heavily on the experience of doctors. Robotic ultrasound scanning shows promise, but currently lacks a direct path planning method for intercostal scanning, and probe orientation planning does not consider imaging differences between lungs and solid organs. In this paper, we aim to fully automate two fundamental operations in lung ultrasound scanning: longitudinal and transverse scans. We propose pioneering path planning methods along intercostal spaces and innovative solutions for adaptive probe posture adjustment using real-time pleural line feedback, specifically addressing the unique characteristics of lung ultrasound scanning. This ensures the acquisition of high-quality, diagnostically meaningful ultrasound images. In addition, we develop a robotic lung ultrasound system to validate the proposed methods. Extensive experimental results on two volunteers and a chest phantom confirm the efficacy of our methods and demonstrate the system’s feasibility in automated lung ultrasound examinations. 
Our work lays a solid foundation for automated robotic complete lung scanning.","PeriodicalId":73318,"journal":{"name":"IEEE transactions on medical robotics and bionics","volume":"7 2","pages":"768-781"},"PeriodicalIF":3.4,"publicationDate":"2025-03-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143949177","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Computed Tomography and Ultrasound-Guided Robotic Assistance in Percutaneous Puncture in Abdominal Phantom and Porcine Liver Models","authors":"Yao Liu;Yong Wang;Jing Xiao;Xu He;Cheng Wang;Jianjun Zhu;Pengju Lv;Huixia Cai;Lige Qiu;Yizhun Zhu;Yong Li;Ligong Lu","doi":"10.1109/TMRB.2025.3550644","DOIUrl":"https://doi.org/10.1109/TMRB.2025.3550644","url":null,"abstract":"Percutaneous puncture is a pivotal technique in diagnosing and treating hepatic lesions; however, traditional manual puncture methods rely heavily on the expertise of medical practitioners. This study aimed to evaluate the feasibility, safety, and efficacy of percutaneous needle placement through the innovative utilization of computed tomography-ultrasound fusion-guided robotic assistance in abdominal phantom and porcine liver models. The abdominal phantom and eight Bama miniature pigs were selected as experimental subjects. Two puncture methods (handheld and robot-assisted puncture) were administered to each simulated tumor with an 18-gauge biopsy needle. All pigs exhibited stable conditions without any complications following the puncture. The Euclidean distance between the needle tip and the predetermined target point of robot-assisted puncture was <inline-formula> <tex-math>$3.30~\pm ~1.48$ </tex-math></inline-formula> mm in the pig model and <inline-formula> <tex-math>$2.15~\pm ~0.82$ </tex-math></inline-formula> mm in the phantom model. The planning time required for the physician to perform robot-assisted needle insertion was <inline-formula> <tex-math>$8.25~\pm ~2.59$ </tex-math></inline-formula> min. 
Robot-assisted needle insertion in percutaneous puncture was accurate and safe in the pig liver model, highlighting its feasibility and potential clinical application.","PeriodicalId":73318,"journal":{"name":"IEEE transactions on medical robotics and bionics","volume":"7 2","pages":"542-549"},"PeriodicalIF":3.4,"publicationDate":"2025-03-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144084777","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Guest Editorial BioRob2024","authors":"Leonardo Cappello;Daniele Guarnera","doi":"10.1109/TMRB.2025.3532156","DOIUrl":"https://doi.org/10.1109/TMRB.2025.3532156","url":null,"abstract":"","PeriodicalId":73318,"journal":{"name":"IEEE transactions on medical robotics and bionics","volume":"7 1","pages":"3-5"},"PeriodicalIF":3.4,"publicationDate":"2025-02-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10908099","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143529882","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"IEEE Transactions on Medical Robotics and Bionics Information for Authors","authors":"","doi":"10.1109/TMRB.2025.3539974","DOIUrl":"https://doi.org/10.1109/TMRB.2025.3539974","url":null,"abstract":"","PeriodicalId":73318,"journal":{"name":"IEEE transactions on medical robotics and bionics","volume":"7 1","pages":"C4-C4"},"PeriodicalIF":3.4,"publicationDate":"2025-02-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10908100","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143521356","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"IEEE Transactions on Medical Robotics and Bionics Publication Information","authors":"","doi":"10.1109/TMRB.2025.3539970","DOIUrl":"https://doi.org/10.1109/TMRB.2025.3539970","url":null,"abstract":"","PeriodicalId":73318,"journal":{"name":"IEEE transactions on medical robotics and bionics","volume":"7 1","pages":"C2-C2"},"PeriodicalIF":3.4,"publicationDate":"2025-02-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10908102","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143529889","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"IEEE Transactions on Medical Robotics and Bionics Society Information","authors":"","doi":"10.1109/TMRB.2025.3539972","DOIUrl":"https://doi.org/10.1109/TMRB.2025.3539972","url":null,"abstract":"","PeriodicalId":73318,"journal":{"name":"IEEE transactions on medical robotics and bionics","volume":"7 1","pages":"C3-C3"},"PeriodicalIF":3.4,"publicationDate":"2025-02-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10908103","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143521462","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}