{"title":"Transoral robotic surgery in the diagnosis and treatment of primary unknown head and neck squamous cell carcinoma: A preliminary single centre experience","authors":"Yinghui Zhi, Yabing Zhang, Bin Zhang","doi":"10.1002/rcs.2652","DOIUrl":"https://doi.org/10.1002/rcs.2652","url":null,"abstract":"<div>\u0000 \u0000 \u0000 <section>\u0000 \u0000 <h3> Background</h3>\u0000 \u0000 <p>Squamous cell carcinoma of unknown primary (CUP) in the head and neck is difficult to diagnose and treat. This report outlines 11 cases of CUP treated with transoral robotic surgery (TORS), aimed at investigating the diagnostic efficiency of primary tumour and radical resection effectiveness of TORS.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Methods</h3>\u0000 \u0000 <p>11 cases of CUP among 68 oropharyngeal cancer patients treated by TORS were analysed retrospectively.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Results</h3>\u0000 \u0000 <p>All the 11 cases received TORS with cervical lymph node dissection. Primary tumours were found in 8 cases (72.7%), 4 cases in the palatine tonsil and 4 cases in the base of the tongue. The average diameter of the primary tumour was 1.65 cm. All patients resumed eating by mouth within 24 h, no tracheotomy, no pharyngeal fistula and no postoperative death. 
The 3-year disease-free survival rate was 91%.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Conclusions</h3>\u0000 \u0000 <p>TORS can improve the diagnostic efficiency of primary tumour of CUP and achieve good oncology and functional results.</p>\u0000 </section>\u0000 </div>","PeriodicalId":50311,"journal":{"name":"International Journal of Medical Robotics and Computer Assisted Surgery","volume":null,"pages":null},"PeriodicalIF":2.3,"publicationDate":"2024-06-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141439709","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"3D evaluation model of facial aesthetics based on multi-input 3D convolution neural networks for orthognathic surgery","authors":"Qingchuan Ma, Etsuko Kobayashi, Siao Jin, Ken Masamune, Hideyuki Suenaga","doi":"10.1002/rcs.2651","DOIUrl":"10.1002/rcs.2651","url":null,"abstract":"<div>\u0000 \u0000 \u0000 <section>\u0000 \u0000 <h3> Background</h3>\u0000 \u0000 <p>Quantitative evaluation of facial aesthetics is an important but also time-consuming procedure in orthognathic surgery, while existing 2D beauty-scoring models are mainly used for entertainment with less clinical impact.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Methods</h3>\u0000 \u0000 <p>A deep-learning-based 3D evaluation model DeepBeauty3D was designed and trained using 133 patients' CT images. The customised image preprocessing module extracted the skeleton, soft tissue, and personal physical information from raw DICOM data, and the predicting network module employed 3-input-2-output convolution neural networks (CNN) to receive the aforementioned data and output aesthetic scores automatically.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Results</h3>\u0000 \u0000 <p>Experiment results showed that this model predicted the skeleton and soft tissue score with 0.231 ± 0.218 (4.62%) and 0.100 ± 0.344 (2.00%) accuracy in 11.203 ± 2.824 s from raw CT images.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Conclusion</h3>\u0000 \u0000 <p>This study provided an end-to-end solution using real clinical data based on 3D CNN to quantitatively evaluate facial aesthetics by considering three anatomical factors simultaneously, showing promising potential in reducing workload and bridging the surgeon-patient aesthetics perspective gap.</p>\u0000 </section>\u0000 </div>","PeriodicalId":50311,"journal":{"name":"International Journal of Medical Robotics and Computer Assisted 
Surgery","volume":null,"pages":null},"PeriodicalIF":2.5,"publicationDate":"2024-06-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1002/rcs.2651","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141319199","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Use of a fluoroscopy-based robotic-assisted total hip arthroplasty system resulted in greater improvements in hip-specific outcome measures at one-year compared to a CT-based robotic-assisted system","authors":"Christian B. Ong, Graham B. J. Buchan, Christian J. Hecht II, David Liu, Joshua Petterwood, Atul F. Kamath","doi":"10.1002/rcs.2650","DOIUrl":"10.1002/rcs.2650","url":null,"abstract":"<div>\u0000 \u0000 \u0000 <section>\u0000 \u0000 <h3> Background</h3>\u0000 \u0000 <p>The purpose of this study was to compare one-year patient reported outcome measures between a novel fluoroscopy-based robotic-assisted (FL-RTHA) system and an existing computerised tomography-based robotic assisted (CT-RTHA) system.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Methods</h3>\u0000 \u0000 <p>A review of 85 consecutive FL-RTHA and 125 consecutive CT-RTHA was conducted. Outcomes included one-year post-operative Veterans RAND-12 (VR-12) Physical (PCS)/Mental (MCS), Hip Disability and Osteoarthritis Outcome (HOOS) Pain/Physical Function (PS)/Joint replacement, and University of California Los Angeles (UCLA) Activity scores.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Results</h3>\u0000 \u0000 <p>The FL-RTHA cohort had lower pre-operative VR-12 PCS, HOOS Pain, HOOS-PS, HOOS-JR, and UCLA Activity scores compared with patients in the CT-RTHA cohort. The FL-RTHA cohort reported greater improvements in HOOS-PS scores (−41.54 vs. −36.55; <i>p</i> = 0.028) than the CT-RTHA cohort. 
Both cohorts experienced similar rates of major post-operative complications, and had similar radiographic outcomes.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Conclusions</h3>\u0000 \u0000 <p>Use of the fluoroscopy-based robotic system resulted in greater improvements in HOOS-PS in one-year relative to the CT-based robotic technique.</p>\u0000 </section>\u0000 </div>","PeriodicalId":50311,"journal":{"name":"International Journal of Medical Robotics and Computer Assisted Surgery","volume":null,"pages":null},"PeriodicalIF":2.5,"publicationDate":"2024-06-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141297539","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Augmented-reality-based surgical navigation for endoscope retrograde cholangiopancreatography: A phantom study","authors":"Zhipeng Lin, Zhuoyue Yang, Ranyang Li, Shangyu Sun, Bin Yan, Yongming Yang, Hao Liu, Junjun Pan","doi":"10.1002/rcs.2649","DOIUrl":"10.1002/rcs.2649","url":null,"abstract":"<div>\u0000 \u0000 \u0000 <section>\u0000 \u0000 <h3> Background</h3>\u0000 \u0000 <p>Endoscope retrograde cholangiopancreatography is a standard surgical treatment for gallbladder and pancreatic diseases. However, surgeons is at high risk and require sufficient surgical experience and skills.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Methods</h3>\u0000 \u0000 <p>(1) The simultaneous localisation and mapping technique to reconstruct the surgical environment. (2) The preoperative 3D model is transformed into the intraoperative video environment to implement the multi-modal fusion. (3) A framework for virtual-to-real projection based on hand-eye alignment. For the purpose of projecting the 3D model onto the imaging plane of the camera, it uses position data from electromagnetic sensors.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Results</h3>\u0000 \u0000 <p>Our AR-assisted navigation system can accurately guide physicians, which means a distance of registration error to be restricted to under 5 mm and a projection error of 5.76 ± 2.13, and the intubation procedure is done at 30 frames per second.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Conclusions</h3>\u0000 \u0000 <p>Coupled with clinical validation and user studies, both the quantitative and qualitative results indicate that our navigation system has the potential to be highly useful in clinical practice.</p>\u0000 </section>\u0000 </div>","PeriodicalId":50311,"journal":{"name":"International Journal of Medical Robotics and Computer Assisted Surgery","volume":null,"pages":null},"PeriodicalIF":2.5,"publicationDate":"2024-06-07","publicationTypes":"Journal 
Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141285535","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Force/position tracking control of fracture reduction robot based on nonlinear disturbance observer and neural network","authors":"Jintao Lei, Zhuangzhuang Wang","doi":"10.1002/rcs.2639","DOIUrl":"10.1002/rcs.2639","url":null,"abstract":"<div>\u0000 \u0000 \u0000 <section>\u0000 \u0000 <h3> Background</h3>\u0000 \u0000 <p>For the fracture reduction robot, the position tracking accuracy and compliance are affected by dynamic loads from muscle stretching, uncertainties in robot dynamics models, and various internal and external disturbances.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Methods</h3>\u0000 \u0000 <p>A control method that integrates a Radial Basis Function Neural Network (RBFNN) with Nonlinear Disturbance Observer is proposed to enhance position tracking accuracy. Additionally, an admittance control is employed for force tracking to enhance the robot's compliance, thereby improving the safety.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Results</h3>\u0000 \u0000 <p>Experiments are conducted on a long bone fracture model with simulated muscle forces and the results demonstrate that the position tracking error is less than ±0.2 mm, the angular displacement error is less than ±0.3°, and the maximum force tracking error is 26.28 N. 
This result can meet surgery requirements.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Conclusions</h3>\u0000 \u0000 <p>The control method shows promising outcomes in enhancing the safety and accuracy of long bone fracture reduction with robotic assistance.</p>\u0000 </section>\u0000 </div>","PeriodicalId":50311,"journal":{"name":"International Journal of Medical Robotics and Computer Assisted Surgery","volume":null,"pages":null},"PeriodicalIF":2.5,"publicationDate":"2024-06-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141285546","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Radical prostatectomy using the Hinotori robot-assisted surgical system: Docking-free design may contribute to reduction in postoperative pain","authors":"Yutaro Sasaki, Yoshito Kusuhara, Takuro Oyama, Mitsuki Nishiyama, Saki Kobayashi, Kei Daizumoto, Ryotaro Tomida, Yoshiteru Ueno, Tomoya Fukawa, Kunihisa Yamaguchi, Yasuyo Yamamoto, Masayuki Takahashi, Hiroomi Kanayama, Junya Furukawa","doi":"10.1002/rcs.2648","DOIUrl":"10.1002/rcs.2648","url":null,"abstract":"<div>\u0000 \u0000 \u0000 <section>\u0000 \u0000 <h3> Background</h3>\u0000 \u0000 <p>The docking-free design of the Japanese Hinotori surgical robotic system allows the robotic arm to avoid trocar grasping, thereby minimising excessive abdominal wall stress. The aim of this study was to evaluate the safety and efficacy of robotic-assisted radical prostatectomy (RARP) using the Hinotori system and to explore the potential contribution of its docking-free design to postoperative pain reduction.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Methods</h3>\u0000 \u0000 <p>This study reviewed the clinical records of 94 patients who underwent RARP: 48 patients in the Hinotori group and 46 in the da Vinci Xi group.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Results</h3>\u0000 \u0000 <p>Hinotori group had significantly longer operative and console times (<i>p</i> = 0.030 and <i>p</i> = 0.029, respectively). Perioperative complications and oncologic outcomes did not differ between the two groups. 
On postoperative day 4, the rate of decline from the maximum visual analogue scale score was marginally significant in the Hinotori group (<i>p</i> = 0.062).</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Conclusions</h3>\u0000 \u0000 <p>The docking-free design may contribute to reducing postoperative pain.</p>\u0000 </section>\u0000 </div>","PeriodicalId":50311,"journal":{"name":"International Journal of Medical Robotics and Computer Assisted Surgery","volume":null,"pages":null},"PeriodicalIF":2.5,"publicationDate":"2024-06-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141187144","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A haptic guidance system for simulated catheter navigation with different kinaesthetic feedback profiles","authors":"Taha Abbasi-Hashemi, Farrokh Janabi-Sharifi, Asim N. Cheema, Kourosh Zareinia","doi":"10.1002/rcs.2638","DOIUrl":"10.1002/rcs.2638","url":null,"abstract":"<div>\u0000 \u0000 \u0000 <section>\u0000 \u0000 <h3> Background</h3>\u0000 \u0000 <p>This paper proposes a haptic guidance system to improve catheter navigation within a simulated environment.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Methods</h3>\u0000 \u0000 <p>Three force profiles were constructed to evaluate the system: collision prevention; centreline navigation; and a novel force profile of reinforcement learning (RL). All force profiles were evaluated from the left common iliac to the right atrium.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Results</h3>\u0000 \u0000 <p>Our findings show that providing haptic feedback improved surgical safety compared to visual-only feedback. If staying inside the vasculature is the priority, RL provides the safest option. 
It is also shown that the performance of each force profile varies in different anatomical regions.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Conclusion</h3>\u0000 \u0000 <p>The implications of these findings are significant, as they hold the potential to improve how and when haptic feedback is applied for cardiovascular intervention.</p>\u0000 </section>\u0000 </div>","PeriodicalId":50311,"journal":{"name":"International Journal of Medical Robotics and Computer Assisted Surgery","volume":null,"pages":null},"PeriodicalIF":2.5,"publicationDate":"2024-05-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141184716","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A back propagation neural network based respiratory motion modelling method","authors":"Shan Jiang, Bowen Li, Zhiyong Yang, Yuhua Li, Zeyang Zhou","doi":"10.1002/rcs.2647","DOIUrl":"10.1002/rcs.2647","url":null,"abstract":"<div>\u0000 \u0000 \u0000 <section>\u0000 \u0000 <h3> Background</h3>\u0000 \u0000 <p>This study presents the development of a backpropagation neural network-based respiratory motion modelling method (BP-RMM) for precisely tracking arbitrary points within lung tissue throughout free respiration, encompassing deep inspiration and expiration phases.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Methods</h3>\u0000 \u0000 <p>Internal and external respiratory data from four-dimensional computed tomography (4DCT) are processed using various artificial intelligence algorithms. Data augmentation through polynomial interpolation is employed to enhance dataset robustness. A BP neural network is then constructed to comprehensively track lung tissue movement.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Results</h3>\u0000 \u0000 <p>The BP-RMM demonstrates promising accuracy. In cases from the public 4DCT dataset, the average target registration error (TRE) between authentic deep respiration phases and those forecasted by BP-RMM for 75 marked points is 1.819 mm. 
Notably, TRE for normal respiration phases is significantly lower, with a minimum error of 0.511 mm.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Conclusions</h3>\u0000 \u0000 <p>The proposed method is validated for its high accuracy and robustness, establishing it as a promising tool for surgical navigation within the lung.</p>\u0000 </section>\u0000 </div>","PeriodicalId":50311,"journal":{"name":"International Journal of Medical Robotics and Computer Assisted Surgery","volume":null,"pages":null},"PeriodicalIF":2.5,"publicationDate":"2024-05-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141158124","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"ERegPose: An explicit regression based 6D pose estimation for snake-like wrist-type surgical instruments","authors":"Jinhua Li, Zhengyang Ma, Xinan Sun, He Su","doi":"10.1002/rcs.2640","DOIUrl":"10.1002/rcs.2640","url":null,"abstract":"<div>\u0000 \u0000 \u0000 <section>\u0000 \u0000 <h3> Background</h3>\u0000 \u0000 <p>Accurately estimating the 6D pose of snake-like wrist-type surgical instruments is challenging due to their complex kinematics and flexible design.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Methods</h3>\u0000 \u0000 <p>We propose ERegPose, a comprehensive strategy for precise 6D pose estimation. The strategy consists of two components: ERegPoseNet, an original deep neural network model designed for explicit regression of the instrument's 6D pose, and an annotated in-house dataset of simulated surgical operations. To capture rotational features, we employ an Single Shot multibox Detector (SSD)-like detector to generate bounding boxes of the instrument tip.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Results</h3>\u0000 \u0000 <p>ERegPoseNet achieves an error of 1.056 mm in 3D translation, 0.073 rad in 3D rotation, and an average distance (ADD) metric of 3.974 mm, indicating an overall spatial transformation error. The necessity of the SSD-like detector and L1 loss is validated through experiments.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Conclusions</h3>\u0000 \u0000 <p>ERegPose outperforms existing approaches, providing accurate 6D pose estimation for snake-like wrist-type surgical instruments. 
Its practical applications in various surgical tasks hold great promise.</p>\u0000 </section>\u0000 </div>","PeriodicalId":50311,"journal":{"name":"International Journal of Medical Robotics and Computer Assisted Surgery","volume":null,"pages":null},"PeriodicalIF":2.5,"publicationDate":"2024-05-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141094787","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A new ring fixator system for automated bone fixation","authors":"Ahmet Aydi(ı)n, M. Kerem U(Ü)n","doi":"10.1002/rcs.2637","DOIUrl":"10.1002/rcs.2637","url":null,"abstract":"<div>\u0000 \u0000 \u0000 <section>\u0000 \u0000 <h3> Background</h3>\u0000 \u0000 <p>In the field of orthopaedics, external fixators are commonly employed for treating extremity fractures and deformities. Computer-assisted systems offer a promising and less error-prone treatment alternative to manual fixation by utilising a software to plan treatments based on radiological and clinical data. Nevertheless, existing computer-assisted systems have limitations and constraints.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Methods</h3>\u0000 \u0000 <p>This work represents the culmination of a project aimed at developing a new automatised fixation system and a corresponding software to minimise human intervention and associated errors, and the developed system incorporates enhanced functionalities and has fewer constraints compared to existing systems.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Results</h3>\u0000 \u0000 <p>The automatised fixation system and its graphical user interface (GUI) demonstrate promising results in terms of accuracy, efficiency, and reliability.</p>\u0000 </section>\u0000 \u0000 <section>\u0000 \u0000 <h3> Conclusion</h3>\u0000 \u0000 <p>The developed fixation system and its accompanying GUI represent an improvement in computer-assisted fixation systems. 
Future research may focus on further refining the system and conducting clinical trials.</p>\u0000 </section>\u0000 </div>","PeriodicalId":50311,"journal":{"name":"International Journal of Medical Robotics and Computer Assisted Surgery","volume":null,"pages":null},"PeriodicalIF":2.5,"publicationDate":"2024-05-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1002/rcs.2637","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141088785","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}