Latest Articles: International Journal of Medical Robotics and Computer Assisted Surgery

Transoral robotic surgery in the diagnosis and treatment of primary unknown head and neck squamous cell carcinoma: A preliminary single centre experience
IF 2.3 | CAS Tier 3, Medicine
Yinghui Zhi, Yabing Zhang, Bin Zhang
DOI: 10.1002/rcs.2652 | Published 2024-06-21 | Citations: 0

Background: Squamous cell carcinoma of unknown primary (CUP) in the head and neck is difficult to diagnose and treat. This report outlines 11 cases of CUP treated with transoral robotic surgery (TORS), aimed at investigating the diagnostic efficiency for the primary tumour and the effectiveness of radical resection with TORS.
Methods: Eleven cases of CUP among 68 oropharyngeal cancer patients treated by TORS were analysed retrospectively.
Results: All 11 cases received TORS with cervical lymph node dissection. Primary tumours were found in 8 cases (72.7%): 4 in the palatine tonsil and 4 in the base of the tongue. The average diameter of the primary tumour was 1.65 cm. All patients resumed eating by mouth within 24 h, with no tracheotomy, no pharyngeal fistula, and no postoperative death. The 3-year disease-free survival rate was 91%.
Conclusions: TORS can improve the diagnostic efficiency for the primary tumour in CUP and achieve good oncological and functional results.

3D evaluation model of facial aesthetics based on multi-input 3D convolution neural networks for orthognathic surgery
IF 2.5 | CAS Tier 3, Medicine
Qingchuan Ma, Etsuko Kobayashi, Siao Jin, Ken Masamune, Hideyuki Suenaga
DOI: 10.1002/rcs.2651 | Published 2024-06-14 | Citations: 0 | Open access PDF: https://onlinelibrary.wiley.com/doi/epdf/10.1002/rcs.2651

Background: Quantitative evaluation of facial aesthetics is an important but time-consuming procedure in orthognathic surgery, while existing 2D beauty-scoring models are mainly used for entertainment and have little clinical impact.
Methods: A deep-learning-based 3D evaluation model, DeepBeauty3D, was designed and trained on 133 patients' CT images. A customised image preprocessing module extracts the skeleton, soft tissue, and personal physical information from raw DICOM data, and the predicting network module employs a 3-input-2-output convolutional neural network (CNN) to receive these data and output aesthetic scores automatically.
Results: Experiments showed that the model predicted the skeleton and soft tissue scores with 0.231 ± 0.218 (4.62%) and 0.100 ± 0.344 (2.00%) accuracy in 11.203 ± 2.824 s from raw CT images.
Conclusion: This study provides an end-to-end solution, based on a 3D CNN and real clinical data, for quantitatively evaluating facial aesthetics by considering three anatomical factors simultaneously, showing promising potential for reducing workload and bridging the surgeon-patient gap in aesthetic perspective.

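The 3-input-2-output structure described above can be sketched in miniature. The sketch below is purely illustrative and assumes nothing from the paper beyond the branch/fusion layout: the real DeepBeauty3D operates on 3D CT volumes with 3D convolutions, whereas here each input is a small flat feature vector, the branch is a single dense layer standing in for a CNN stack, and all sizes and weights are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

def branch(x, w, b):
    """One encoder branch: dense layer + ReLU (stand-in for a 3D CNN stack)."""
    return np.maximum(0.0, x @ w + b)

# Hypothetical feature sizes: skeleton and soft-tissue volumes plus personal
# physical information, each flattened to a vector purely for illustration.
d_skel, d_soft, d_phys, d_hid = 32, 32, 4, 16

w_skel, b_skel = rng.normal(size=(d_skel, d_hid)), np.zeros(d_hid)
w_soft, b_soft = rng.normal(size=(d_soft, d_hid)), np.zeros(d_hid)
w_phys, b_phys = rng.normal(size=(d_phys, d_hid)), np.zeros(d_hid)
w_out, b_out = rng.normal(size=(3 * d_hid, 2)), np.zeros(2)  # two output heads

def predict(x_skel, x_soft, x_phys):
    """3-input-2-output forward pass: encode each modality, fuse by
    concatenation, and emit [skeleton score, soft-tissue score]."""
    h = np.concatenate([
        branch(x_skel, w_skel, b_skel),
        branch(x_soft, w_soft, b_soft),
        branch(x_phys, w_phys, b_phys),
    ])
    return h @ w_out + b_out

scores = predict(rng.normal(size=d_skel),
                 rng.normal(size=d_soft),
                 rng.normal(size=d_phys))
print(scores.shape)  # one score per head
```

The design point the paper highlights, fusing three anatomical factors before the scoring heads, is what the concatenation step models.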
Use of a fluoroscopy-based robotic-assisted total hip arthroplasty system resulted in greater improvements in hip-specific outcome measures at one-year compared to a CT-based robotic-assisted system
IF 2.5 | CAS Tier 3, Medicine
Christian B. Ong, Graham B. J. Buchan, Christian J. Hecht II, David Liu, Joshua Petterwood, Atul F. Kamath
DOI: 10.1002/rcs.2650 | Published 2024-06-10 | Citations: 0

Background: The purpose of this study was to compare one-year patient-reported outcome measures between a novel fluoroscopy-based robotic-assisted total hip arthroplasty (FL-RTHA) system and an existing computed tomography-based robotic-assisted (CT-RTHA) system.
Methods: A review of 85 consecutive FL-RTHA and 125 consecutive CT-RTHA cases was conducted. Outcomes included one-year post-operative Veterans RAND-12 (VR-12) Physical (PCS)/Mental (MCS), Hip Disability and Osteoarthritis Outcome Score (HOOS) Pain/Physical Function (PS)/Joint Replacement, and University of California Los Angeles (UCLA) Activity scores.
Results: The FL-RTHA cohort had lower pre-operative VR-12 PCS, HOOS Pain, HOOS-PS, HOOS-JR, and UCLA Activity scores than the CT-RTHA cohort, and reported greater improvements in HOOS-PS scores (−41.54 vs. −36.55; p = 0.028). Both cohorts experienced similar rates of major post-operative complications and had similar radiographic outcomes.
Conclusions: Use of the fluoroscopy-based robotic system resulted in greater improvements in HOOS-PS at one year relative to the CT-based robotic technique.

Augmented-reality-based surgical navigation for endoscope retrograde cholangiopancreatography: A phantom study
IF 2.5 | CAS Tier 3, Medicine
Zhipeng Lin, Zhuoyue Yang, Ranyang Li, Shangyu Sun, Bin Yan, Yongming Yang, Hao Liu, Junjun Pan
DOI: 10.1002/rcs.2649 | Published 2024-06-07 | Citations: 0

Background: Endoscopic retrograde cholangiopancreatography is a standard surgical treatment for gallbladder and pancreatic diseases, but the procedure is high-risk and demands considerable surgical experience and skill.
Methods: (1) A simultaneous localisation and mapping technique reconstructs the surgical environment. (2) The preoperative 3D model is transformed into the intraoperative video environment to implement multi-modal fusion. (3) A virtual-to-real projection framework based on hand-eye alignment uses position data from electromagnetic sensors to project the 3D model onto the imaging plane of the camera.
Results: The AR-assisted navigation system can accurately guide physicians, with registration error restricted to under 5 mm and a projection error of 5.76 ± 2.13, and the intubation procedure runs at 30 frames per second.
Conclusions: Coupled with clinical validation and user studies, both the quantitative and qualitative results indicate that the navigation system has the potential to be highly useful in clinical practice.

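The virtual-to-real projection step in (3) amounts to a standard rigid transform plus pinhole projection. A minimal sketch, assuming a known pose (in practice obtained from hand-eye alignment combined with the electromagnetic sensor readings) and invented intrinsics:

```python
import numpy as np

def project_points(points_3d, R, t, K):
    """Project 3D model points onto the camera image plane.
    R, t: rigid pose taking model coordinates into the camera frame.
    K: 3x3 pinhole intrinsic matrix."""
    cam = (R @ points_3d.T).T + t      # model frame -> camera frame
    uvw = (K @ cam.T).T                # camera frame -> homogeneous pixels
    return uvw[:, :2] / uvw[:, 2:3]    # perspective divide

# Illustrative intrinsics and pose (not from the paper).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                          # identity rotation for the sketch
t = np.array([0.0, 0.0, 100.0])        # model 100 mm in front of the camera

pts = np.array([[0.0, 0.0, 0.0],
                [10.0, 0.0, 0.0]])
pix = project_points(pts, R, t, K)
print(pix)  # origin lands at the principal point (320, 240)
```

Projection error as reported above would then be a pixel-space distance between such projected model points and their observed image locations.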
Force/position tracking control of fracture reduction robot based on nonlinear disturbance observer and neural network
IF 2.5 | CAS Tier 3, Medicine
Jintao Lei, Zhuangzhuang Wang
DOI: 10.1002/rcs.2639 | Published 2024-06-07 | Citations: 0

Background: For a fracture reduction robot, position tracking accuracy and compliance are affected by dynamic loads from muscle stretching, uncertainties in the robot dynamics model, and various internal and external disturbances.
Methods: A control method integrating a radial basis function neural network (RBFNN) with a nonlinear disturbance observer is proposed to enhance position tracking accuracy. Additionally, admittance control is employed for force tracking to enhance the robot's compliance and thereby improve safety.
Results: Experiments on a long bone fracture model with simulated muscle forces demonstrate that the position tracking error is less than ±0.2 mm, the angular displacement error is less than ±0.3°, and the maximum force tracking error is 26.28 N, which meets surgical requirements.
Conclusions: The control method shows promising outcomes in enhancing the safety and accuracy of robot-assisted long bone fracture reduction.

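The RBFNN in such controllers approximates the unknown part of the robot dynamics as a weighted sum of Gaussian basis functions. A minimal offline sketch, with an invented target function and least-squares weights standing in for the online adaptive law a controller would actually use:

```python
import numpy as np

def rbf_features(x, centers, width):
    """Gaussian radial basis activations of a 1-D input x."""
    d2 = np.sum((centers - x) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * width ** 2))

# Approximate an unknown scalar dynamics term, here f(x) = sin(x) purely
# for illustration, with 15 Gaussian basis functions.
centers = np.linspace(-np.pi, np.pi, 15).reshape(-1, 1)
width = 0.5

xs = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
Phi = np.array([rbf_features(x, centers, width) for x in xs])
w, *_ = np.linalg.lstsq(Phi, np.sin(xs).ravel(), rcond=None)

x_test = np.array([1.0])
approx = rbf_features(x_test, centers, width) @ w
err = abs(approx - np.sin(1.0))
print(err)  # small on the training interval
```

In the paper's setting the approximated term feeds the control law, with the nonlinear disturbance observer compensating what the network misses.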
Radical prostatectomy using the Hinotori robot-assisted surgical system: Docking-free design may contribute to reduction in postoperative pain
IF 2.5 | CAS Tier 3, Medicine
Yutaro Sasaki, Yoshito Kusuhara, Takuro Oyama, Mitsuki Nishiyama, Saki Kobayashi, Kei Daizumoto, Ryotaro Tomida, Yoshiteru Ueno, Tomoya Fukawa, Kunihisa Yamaguchi, Yasuyo Yamamoto, Masayuki Takahashi, Hiroomi Kanayama, Junya Furukawa
DOI: 10.1002/rcs.2648 | Published 2024-06-02 | Citations: 0

Background: The docking-free design of the Japanese Hinotori surgical robotic system allows the robotic arm to avoid trocar grasping, thereby minimising excessive abdominal wall stress. The aim of this study was to evaluate the safety and efficacy of robot-assisted radical prostatectomy (RARP) using the Hinotori system and to explore the potential contribution of its docking-free design to postoperative pain reduction.
Methods: This study reviewed the clinical records of 94 patients who underwent RARP: 48 in the Hinotori group and 46 in the da Vinci Xi group.
Results: The Hinotori group had significantly longer operative and console times (p = 0.030 and p = 0.029, respectively). Perioperative complications and oncologic outcomes did not differ between the two groups. On postoperative day 4, the rate of decline from the maximum visual analogue scale score was marginally significant in the Hinotori group (p = 0.062).
Conclusions: The docking-free design may contribute to reducing postoperative pain.

A haptic guidance system for simulated catheter navigation with different kinaesthetic feedback profiles
IF 2.5 | CAS Tier 3, Medicine
Taha Abbasi-Hashemi, Farrokh Janabi-Sharifi, Asim N. Cheema, Kourosh Zareinia
DOI: 10.1002/rcs.2638 | Published 2024-05-31 | Citations: 0

Background: This paper proposes a haptic guidance system to improve catheter navigation within a simulated environment.
Methods: Three force profiles were constructed to evaluate the system: collision prevention, centreline navigation, and a novel force profile based on reinforcement learning (RL). All force profiles were evaluated along the path from the left common iliac to the right atrium.
Results: Providing haptic feedback improved surgical safety compared with visual-only feedback. If staying inside the vasculature is the priority, RL provides the safest option. The performance of each force profile also varies across anatomical regions.
Conclusion: These findings are significant, as they hold the potential to improve how and when haptic feedback is applied in cardiovascular intervention.

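A centreline-navigation force profile of the kind listed above is commonly realised as a spring pulling the catheter tip toward the nearest centreline point. The sketch below is a generic version of that idea, not the paper's implementation; the gain, deadband, and geometry are all invented.

```python
import numpy as np

def centreline_force(tip, centreline, k=0.5, deadband=1.0):
    """Kinaesthetic guidance force on the catheter tip (mm, N are nominal):
    spring pull toward the nearest centreline point, zero inside a deadband
    so the operator feels nothing while well centred."""
    diffs = centreline - tip
    i = np.argmin(np.linalg.norm(diffs, axis=1))   # nearest centreline sample
    err = diffs[i]
    dist = np.linalg.norm(err)
    if dist <= deadband:
        return np.zeros(3)
    return k * (dist - deadband) * err / dist      # spring beyond the deadband

# A straight vessel centreline along the x-axis, sampled every 1 mm.
line = np.stack([np.linspace(0.0, 50.0, 51),
                 np.zeros(51), np.zeros(51)], axis=1)

f = centreline_force(np.array([10.0, 3.0, 0.0]), line)
print(f)  # pulls the tip back toward the axis
```

The collision-prevention profile would instead push away from the vessel wall, and the RL profile replaces such hand-designed laws with a learned policy.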
A back propagation neural network based respiratory motion modelling method
IF 2.5 | CAS Tier 3, Medicine
Shan Jiang, Bowen Li, Zhiyong Yang, Yuhua Li, Zeyang Zhou
DOI: 10.1002/rcs.2647 | Published 2024-05-28 | Citations: 0

Background: This study presents a backpropagation neural network-based respiratory motion modelling method (BP-RMM) for precisely tracking arbitrary points within lung tissue throughout free respiration, encompassing deep inspiration and expiration phases.
Methods: Internal and external respiratory data from four-dimensional computed tomography (4DCT) are processed using various artificial intelligence algorithms. Data augmentation through polynomial interpolation is employed to enhance dataset robustness. A BP neural network is then constructed to comprehensively track lung tissue movement.
Results: BP-RMM demonstrates promising accuracy. In cases from the public 4DCT dataset, the average target registration error (TRE) between authentic deep respiration phases and those forecast by BP-RMM for 75 marked points is 1.819 mm. Notably, the TRE for normal respiration phases is significantly lower, with a minimum error of 0.511 mm.
Conclusions: The proposed method is validated for high accuracy and robustness, establishing it as a promising tool for surgical navigation within the lung.

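The core of a BP (backpropagation) motion model is a small feed-forward network trained to map an external surrogate signal to internal displacement. A toy sketch with synthetic data standing in for the 4DCT-derived pairs; the network size, learning rate, and sinusoidal motion are all illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic training pairs: normalised external surrogate phase in [-1, 1]
# mapped to internal point displacement (mm). Purely illustrative.
x = np.linspace(-1.0, 1.0, 64).reshape(-1, 1)
y = 5.0 * np.sin(np.pi * x)

# One hidden tanh layer, trained by plain backpropagation on squared error.
w1, b1 = rng.normal(scale=0.5, size=(1, 8)), np.zeros(8)
w2, b2 = rng.normal(scale=0.5, size=(8, 1)), np.zeros(1)

def forward(x):
    h = np.tanh(x @ w1 + b1)
    return h, h @ w2 + b2

def rmse():
    return float(np.sqrt(np.mean((forward(x)[1] - y) ** 2)))

rmse_before = rmse()
lr = 0.05
for _ in range(3000):
    h, pred = forward(x)
    err = (pred - y) / len(x)            # mean-squared-error gradient at output
    gh = (err @ w2.T) * (1.0 - h ** 2)   # backpropagate through tanh
    w2 -= lr * (h.T @ err); b2 -= lr * err.sum(axis=0)
    w1 -= lr * (x.T @ gh);  b1 -= lr * gh.sum(axis=0)
rmse_after = rmse()

print(rmse_before, rmse_after)  # training reduces the fitting error
```

TRE as reported above is then simply the mean Euclidean distance between predicted and ground-truth marker positions.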
ERegPose: An explicit regression based 6D pose estimation for snake-like wrist-type surgical instruments
IF 2.5 | CAS Tier 3, Medicine
Jinhua Li, Zhengyang Ma, Xinan Sun, He Su
DOI: 10.1002/rcs.2640 | Published 2024-05-24 | Citations: 0

Background: Accurately estimating the 6D pose of snake-like wrist-type surgical instruments is challenging due to their complex kinematics and flexible design.
Methods: We propose ERegPose, a comprehensive strategy for precise 6D pose estimation. It consists of two components: ERegPoseNet, an original deep neural network designed for explicit regression of the instrument's 6D pose, and an annotated in-house dataset of simulated surgical operations. To capture rotational features, a Single Shot multibox Detector (SSD)-like detector generates bounding boxes of the instrument tip.
Results: ERegPoseNet achieves an error of 1.056 mm in 3D translation, 0.073 rad in 3D rotation, and an average distance (ADD) metric of 3.974 mm, indicating the overall spatial transformation error. The necessity of the SSD-like detector and the L1 loss is validated through experiments.
Conclusions: ERegPose outperforms existing approaches, providing accurate 6D pose estimation for snake-like wrist-type surgical instruments, and holds great promise for practical application in various surgical tasks.

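The ADD metric quoted in the results is a standard 6D-pose measure: transform the instrument's model points by both the ground-truth and the estimated pose and average the point-wise distances. A minimal sketch with an invented three-point model:

```python
import numpy as np

def add_metric(model_pts, R_gt, t_gt, R_pred, t_pred):
    """Average Distance (ADD): mean Euclidean distance between model points
    under the ground-truth pose and under the estimated pose."""
    gt = (R_gt @ model_pts.T).T + t_gt
    pred = (R_pred @ model_pts.T).T + t_pred
    return float(np.mean(np.linalg.norm(gt - pred, axis=1)))

pts = np.array([[0.0, 0.0, 0.0],
                [1.0, 0.0, 0.0],
                [0.0, 1.0, 0.0]])
R = np.eye(3)

# A pure 2 mm translation error yields ADD = 2 mm regardless of the model.
add_val = add_metric(pts, R, np.zeros(3), R, np.array([2.0, 0.0, 0.0]))
print(add_val)
```

With rotational error as well, ADD penalises points far from the rotation axis more, which is why it is reported alongside separate translation and rotation errors.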
A new ring fixator system for automated bone fixation
IF 2.5 | CAS Tier 3, Medicine
Ahmet Aydın, M. Kerem Ün
DOI: 10.1002/rcs.2637 | Published 2024-05-23 | Citations: 0 | Open access PDF: https://onlinelibrary.wiley.com/doi/epdf/10.1002/rcs.2637

Background: In orthopaedics, external fixators are commonly employed to treat extremity fractures and deformities. Computer-assisted systems offer a promising and less error-prone alternative to manual fixation by using software to plan treatment from radiological and clinical data. Nevertheless, existing computer-assisted systems have limitations and constraints.
Methods: This work is the culmination of a project to develop a new automated fixation system and corresponding software that minimise human intervention and the associated errors; the developed system incorporates enhanced functionality and has fewer constraints than existing systems.
Results: The automated fixation system and its graphical user interface (GUI) demonstrate promising results in terms of accuracy, efficiency, and reliability.
Conclusion: The developed fixation system and its accompanying GUI represent an improvement in computer-assisted fixation systems. Future research may focus on further refining the system and conducting clinical trials.