Objective surgical skill assessment based on automatic recognition of dissection and exposure times in robot-assisted radical prostatectomy

IF 2.1 · CAS Tier 3 (Medicine) · JCR Q2 (SURGERY)
Kodai Sato, Shin Takenaka, Daichi Kitaguchi, Xue Zhao, Atsushi Yamada, Yuto Ishikawa, Nobushige Takeshita, Nobuyoshi Takeshita, Shinichi Sakamoto, Tomohiko Ichikawa, Masaaki Ito
Langenbeck's Archives of Surgery, Vol. 410, No. 1, p. 39. Published 2025-01-15. DOI: 10.1007/s00423-024-03598-0
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11735544/pdf/
Citations: 0

Abstract

Objective surgical skill assessment based on automatic recognition of dissection and exposure times in robot-assisted radical prostatectomy.

Purpose: Assessing surgical skills is vital for training surgeons, but creating objective, automated evaluation systems is challenging, especially in robotic surgery. Surgical procedures generally involve dissection and exposure (D/E), and their duration and proportion can be used for skill assessment. This study aimed to develop an AI model to acquire D/E parameters in robot-assisted radical prostatectomy (RARP) and verify if these parameters could distinguish between novice and expert surgeons.

Methods: This retrospective study used 209 RARP videos from 18 Japanese institutions. Dissection time was defined as the duration of forceps energy activation, and exposure time as the combined duration of manipulating the third arm and camera. To measure these times, an AI-based interface recognition model was developed to automatically extract instrument status from the da Vinci Surgical System® UI. We compared novices and experts by measuring dissection and exposure times from the model's output.
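The definitions above reduce to a simple aggregation over frame-level instrument-status labels. The sketch below illustrates that computation under stated assumptions: the function name, the per-frame flag names, and the 1 fps sampling rate are all hypothetical, not part of the published method.

```python
# Hypothetical sketch of the D/E parameter computation described in Methods.
# Frame-level instrument-status flags (as a UI-recognition model might emit
# them) are aggregated into dissection time, exposure time, and a D/E ratio.
# All names and the sampling rate are illustrative assumptions.

def de_parameters(frames, fps=1.0):
    """frames: list of dicts with boolean flags per sampled video frame:
       'energy'    - forceps energy activation (counts as dissection)
       'third_arm' - third-arm manipulation (counts as exposure)
       'camera'    - camera manipulation (counts as exposure)
    Returns (dissection_s, exposure_s, de_ratio)."""
    dt = 1.0 / fps  # seconds represented by each sampled frame
    dissection = sum(dt for f in frames if f["energy"])
    exposure = sum(dt for f in frames if f["third_arm"] or f["camera"])
    ratio = dissection / exposure if exposure else float("inf")
    return dissection, exposure, ratio
```

A lower D/E ratio would then indicate proportionally more time spent on exposure relative to dissection, which is the direction of the novice-expert difference reported in Results.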

Results: The overall accuracies of the UI recognition model for recognizing the forceps type, energy activation status, and camera usage status were 0.991, 0.998, and 0.991, respectively. Dissection time was 45.2 vs. 35.1 s (novice vs. expert, p = 0.374), exposure time was 195.7 vs. 89.7 s (novice vs. expert, p < 0.001), and the D/E ratio was 0.174 vs. 0.315 (novice vs. expert, p = 0.003).

Conclusions: We successfully developed a model to automatically acquire dissection and exposure parameters for RARP. Exposure time may serve as an objective parameter to distinguish between novices and experts in RARP, and automated technical evaluation in RARP is feasible.

Trial registration number and date: This study was approved by the Institutional Review Board of the National Cancer Center Hospital East (No.2020 - 329) on January 28, 2021.

Source journal
CiteScore: 3.30
Self-citation rate: 8.70%
Annual articles: 342
Review time: 4-8 weeks
About the journal: Langenbeck's Archives of Surgery aims to publish the best results in the field of clinical surgery and basic surgical research. The main focus is on providing the highest level of clinical research and clinically relevant basic research. The journal, published exclusively in English, will provide an international discussion forum for the controlled results of clinical surgery. The majority of published contributions will be original articles reporting on clinical data from general and visceral surgery, while endocrine surgery will also be covered. Papers on basic surgical principles from the fields of traumatology, vascular and thoracic surgery are also welcome. Evidence-based medicine is an important criterion for the acceptance of papers.