Human versus artificial intelligence-generated arthroplasty literature: A single-blinded analysis of perceived communication, quality, and authorship source
Kyle W. Lawrence, Akram A. Habibi, Spencer A. Ward, Claudette M. Lajam, Ran Schwarzkopf, Joshua C. Rozell
International Journal of Medical Robotics and Computer Assisted Surgery, DOI: 10.1002/rcs.2621, published 2024-02-13

Background: Large language models (LLMs) have unknown implications for medical research. This study assessed whether LLM-generated abstracts are distinguishable from human-written abstracts and compared their perceived quality.

Methods: The LLM ChatGPT was used to generate 20 arthroplasty abstracts (AI-generated) based on full-text manuscripts, which were compared with the originally published abstracts (human-written). Six blinded orthopaedic surgeons rated the abstracts on overall quality, communication, and confidence in the authorship source. Authorship-confidence scores were compared against a test value representing complete inability to discern authorship.

Results: Modestly increased confidence in human authorship was observed for human-written abstracts compared with AI-generated abstracts (p = 0.028), though authorship-confidence scores for AI-generated abstracts were statistically consistent with an inability to discern authorship (p = 0.999). Overall abstract quality was higher for human-written abstracts (p = 0.019).

Conclusions: Absolute authorship-confidence ratings for AI-generated abstracts demonstrated difficulty in discerning authorship, but the AI-generated abstracts did not achieve the perceived quality of human-written abstracts. Caution is warranted in implementing LLMs into scientific writing.
Single port robot-assisted pyeloplasty: An early comparative outcomes analysis
Francesco Ditonno, Antonio Franco, Celeste Manfredi, Carol L. Feng, Eugenio Bologna, Leslie Claire Licari, Ephrem O. Olweny, Srinivas Vourganti, Edward E. Cherullo, Alexander K. Chow, Riccardo Autorino
International Journal of Medical Robotics and Computer Assisted Surgery, DOI: 10.1002/rcs.2622, published 2024-02-05

Background: The treatment paradigm for ureteropelvic junction obstruction (UPJO) has shifted towards minimally invasive pyeloplasty. A comparison of Single Port (SP) and Multi Port (MP) robot-assisted pyeloplasty (RAP) was performed.

Methods: Data from consecutive patients undergoing SP RAP or MP RAP between January 2021 and September 2023 were collected and analysed. Co-primary outcomes were length of stay (LOS), Defense and Veterans Pain Rating Scale (DVPRS) score, and narcotic dose. The choice of robotic system depended on the surgeon's preference and the availability of a specific robotic platform.

Results: A total of 10 SP RAPs and 12 MP RAPs were identified. SP RAP patients were significantly younger [23 years (20–34)] than MP RAP patients [42 years (35.5–47.5), p < 0.01]. No differences in terms of OT (p = 0.6), LOS (p = 0.1), DVPRS (p = 0.2), or narcotic dose (p = 0.1) were observed between the two groups.

Conclusions: SP RAP can be implemented without compromising surgical outcomes, and it potentially offers some clinical advantages.
Workspace and dexterity analysis of the hybrid mechanism master robot in Sinaflex robotic telesurgery system: An in vivo experiment
Mehrnaz Aghanouri, Hamid Moradi, Hossein A. Alibeik, Alireza Mirbagheri
International Journal of Medical Robotics and Computer Assisted Surgery, DOI: 10.1002/rcs.2608, published 2024-01-12

Abstract: The Sinaflex robotic telesurgery system was recently introduced to provide ergonomic postures for the surgeon along with a dexterous workspace for robotic telesurgery. The robot is described, and its forward and inverse kinematics are derived and validated experimentally. The robot and operational workspaces and their dexterity are investigated and compared using data collected during a robotic telesurgery dog vasectomy performed with Sinaflex. According to the simulation results, the workspace of the end effector is as large as 914.56 × 10⁵ mm³, which completely covers the ergonomic human hand workspace. The dexterity of the robot over the total and operational workspaces is 0.4557 and 0.6565, respectively. In terms of workspace size and dexterity, the Sinaflex master robot can be considered a good choice to fulfil the requirements of the surgeon-side robot in robotic telesurgery systems.
Unveiling new patterns: A surgical deep learning model for intestinal obstruction management
Ozan Can Tatar, Mustafa Alper Akay, Elif Tatar, Semih Metin
International Journal of Medical Robotics and Computer Assisted Surgery, DOI: 10.1002/rcs.2620, published 2024-01-06

Background: Swift and accurate decision-making is pivotal in managing intestinal obstructions. This study aims to integrate deep learning and surgical expertise to enhance decision-making in intestinal obstruction cases.

Methods: We developed a deep learning model based on the YOLOv8 framework, trained on a dataset of 700 images categorised into operated and non-operated groups, with surgical outcomes as ground truth. The model's performance was evaluated using standard metrics.

Results: At a confidence threshold of 0.5, the model demonstrated a sensitivity of 83.33%, specificity of 78.26%, precision of 81.7%, recall of 75.1%, and mAP@0.5 of 0.831.

Conclusions: The model exhibited promising outcomes in distinguishing operative and non-operative management cases. The fusion of deep learning with surgical expertise enriches decision-making in intestinal obstruction management. The proposed model can assist surgeons in intricate scenarios such as intestinal obstruction management and promotes the synergy between technology and clinical acumen for advancing patient care.
Software-based method for automated intraoperative planning of Schoettle Point in surgical medial patellofemoral ligament reconstruction: A comparative validation study
Maxim Privalov, Florian Kordon, Holger Kunze, Nils Beisemann, Sven Yves Vetter, Jochen Franke, Paul Alfred Grützner, Benedict Swartman
International Journal of Medical Robotics and Computer Assisted Surgery, DOI: 10.1002/rcs.2607, published 2024-01-04

Background: The aim of the study was to validate a software-based planning method for the Schoettle Point and to evaluate the precision and time efficiency of its live overlay on the intraoperative X-ray.

Methods: The software-based method was compared with surgeons' manual planning in an inter- and intrarater study. Subsequently, K-wire placement was performed with and without a live overlay of the planning. The time taken and the precision achieved were statistically compared.

Results: The average deviation between the surgeons (1.68 mm; 2.26 mm) was greater than the discrepancy between the surgeons and the software-based planning (1.30 mm; 1.38 mm). In the intrarater comparison, the software-based planning provided consistent results. The live overlay showed a significantly lower positioning error (0.9 ± 0.5 mm) compared with placement without the overlay (3.0 ± 1.4 mm, p = 0.000; 3.1 ± 1.4 mm, p = 0.001). The live overlay did not achieve a significant time gain (p = 0.393; p = 0.678).

Conclusion: Software-based planning and live overlay of the Schoettle Point improve surgical precision without negatively affecting time efficiency.
Towards inferring positioning of straight cochlear-implant electrode arrays during insertion using real-time impedance sensing
Katherine E. Riojas, Trevor L. Bruns, Josephine Granna, Miriam R. Smetak, Robert F. Labadie, Robert J. Webster III
International Journal of Medical Robotics and Computer Assisted Surgery, DOI: 10.1002/rcs.2609, published 2024-01-04

Background: Cochlear-implant electrode arrays (EAs) are currently inserted with limited feedback, and impedance sensing has recently shown promise for EA localisation.

Methods: We investigate the use of impedance sensing to infer the progression of an EA during insertion.

Results: We show that the access resistance component of bipolar impedance sensing can detect when a straight EA reaches key anatomical locations in a plastic cochlea and when each electrode contact enters/exits the cochlea. We also demonstrate that dual-sided electrode contacts can provide useful proximity information and show the real-time relationship between impedance and wall proximity in a cadaveric cochlea for the first time.

Conclusion: The access resistance component of bipolar impedance sensing has high potential for estimating positioning information of EAs relative to anatomy during insertion. Main limitations of this work include using saline as a surrogate for human perilymph in ex vivo models and using only one type of EA.
i-MYO: A multi-grasp prosthetic hand control system based on gaze movements, augmented reality, and myoelectric signals
Chunyuan Shi, Jingdong Zhao, Dapeng Yang, Li Jiang
International Journal of Medical Robotics and Computer Assisted Surgery, DOI: 10.1002/rcs.2617, published 2023-12-30

Background: Controlling a multi-grasp prosthetic hand remains a challenge. This study explores the influence of merging gaze movements and augmented reality in bionics on improving prosthetic hand control.

Methods: A control system based on gaze movements, augmented reality, and myoelectric signals (i-MYO) was proposed. In the i-MYO, the GazeButton was introduced into the controller to detect the grasp-type intention from eye-tracking signals, and a proportional velocity scheme was used to control hand movement.

Results: Able-bodied subjects with no prior training successfully transferred objects in 91.6% of cases and switched to the optimal grasp type in 97.5%. The patient successfully triggered the EMG to control the hand in holding objects in 98.7% of trials, taking around 3.2 s, and spent around 1.3 s switching to the optimal grasp type in 99.2% of trials.

Conclusions: Merging gaze movements and augmented reality in bionics can widen the control bandwidth of a prosthetic hand. With the help of the i-MYO, subjects can control a prosthetic hand using six grasp types if they can manipulate two muscle signals and gaze movement.
A magnetic resonance conditional robot for lumbar spinal injection: Development and preliminary validation
Depeng Liu, Gang Li, Shuyuan Wang, Zixuan Liu, Yanzhou Wang, Laura Connolly, David E. Usevitch, Guofeng Shen, Kevin Cleary, Iulian Iordachita
International Journal of Medical Robotics and Computer Assisted Surgery, DOI: 10.1002/rcs.2618, published 2023-12-28

Purpose: This work presents the design and preliminary validation of a Magnetic Resonance (MR) conditional robot for lumbar injection for the treatment of lower back pain.

Methods: This is a 4-degree-of-freedom (DOF) robot that measures 200 × 230 × 130 mm³ and has a mass of 0.8 kg. Its lightweight and compact design allows it to be affixed directly to the patient's back, establishing a rigid connection and thus reducing positional errors caused by patient movement during treatment.

Results: To validate the positioning accuracy of the needle placed by the robot, an electromagnetic (EM) tracking system and a needle with an EM sensor embedded in its tip were used for a free-space evaluation, yielding a position accuracy of 0.88 ± 0.46 mm, and for phantom mock insertions using the Loop-X CBCT scanner, yielding a target position accuracy of 3.62 ± 0.92 mm.

Conclusion: Preliminary experiments demonstrated that the proposed robot offers improvements in rotation range, flexible needle adjustment, and sensor protection compared with previous and existing systems, offering broader clinical applications.
Transformer-based 2D/3D medical image registration for X-ray to CT via anatomical features
Feng Qu, Min Zhang, Weili Shi, Wei He, Zhengang Jiang
International Journal of Medical Robotics and Computer Assisted Surgery, DOI: 10.1002/rcs.2619, published 2023-12-28

Background: 2D/3D medical image registration is one of the key technologies enabling surgical navigation systems to perform pose estimation and achieve accurate positioning, yet it remains challenging. The purpose of this study is to introduce a new method for X-ray to CT 2D/3D registration and conduct a feasibility study.

Methods: In this study, a 2D/3D affine registration method based on feature point detection is investigated. It combines the morphological and edge features of spinal images to accurately extract feature points, and uses graph neural networks to aggregate the anatomical features of different points and increase local detail information. Meanwhile, global and positional information is extracted by the Swin Transformer.

Results: The results indicate that the proposed method improves both accuracy and success ratio compared with other methods. The mean target registration error reached 0.31 mm, while the runtime overhead was much lower, with an average runtime of about 0.6 s. This ultimately improves registration accuracy and efficiency, demonstrating the effectiveness of the proposed method.

Conclusions: The proposed method can provide more comprehensive image information and shows good prospects for pose estimation and accurate positioning in surgical navigation systems.
Robotic-assisted burring in total hip replacement: A new surgical technique to optimise acetabular preparation
Tiancheng Li, Peter Walker, Richardo Khonasty, Victor A. van de Graaf, Eric Yelf, Liang Zhao, Shoudong Huang
International Journal of Medical Robotics and Computer Assisted Surgery, DOI: 10.1002/rcs.2615, published 2023-12-28
Open access PDF: https://onlinelibrary.wiley.com/doi/epdf/10.1002/rcs.2615

Background: In Total Hip Replacement (THR) surgery, a critical step is to cut an accurate hemisphere into the acetabulum so that the component can be fitted accurately and obtain early stability. This study aims to determine whether burring rather than reaming the acetabulum can achieve greater accuracy in the creation of this hemisphere.

Methods: A preliminary robotic system was developed to demonstrate the feasibility of burring the acetabulum using the Universal Robot (UR10). The study describes the mechanical design, robot trajectory optimisation, and control algorithm development, and reports results from phantom experiments compared with both robotic reaming and conventional reaming. The system was also tested in a cadaver experiment.

Results: The proposed robotic burring system can produce a surface in 2 min with average errors of 0.1 mm and 0.18 mm when cutting polyurethane bone blocks #15 and #30, respectively. The performance was better than both robotic reaming and conventional hand reaming.

Conclusion: The proposed robotic burring system outperformed robotic and conventional reaming in producing an accurate acetabular cavity. The findings show the potential of robotic-assisted burring for acetabular preparation in THR.