Surgical Phase Recognition with Wearable Video Camera for Computer-aided Orthopaedic Surgery-AI Navigation System

Impact Factor: 0.4 | JCR: Q4 (Engineering, Industrial)
Shoichi Nishio, B. Hossain, M. Nii, N. Yagi, T. Hiranaka, Syoji Kobashi
{"title":"Surgical Phase Recognition with Wearable Video Camera for Computer-aided Orthopaedic Surgery-AI Navigation System","authors":"Shoichi Nishio, B. Hossain, M. Nii, N. Yagi, T. Hiranaka, Syoji Kobashi","doi":"10.5057/ijae.ijae-d-19-00018","DOIUrl":null,"url":null,"abstract":"The procedure of orthopaedic surgery is quite complicated, and many kinds of equipment have been used. Operating room nurses who deliver surgical instruments to surgeon are supposed to be forced to incur a heavy burden. This study aims to offer a computer-aided orthopaedic surgery (CAOS)-AI navigation system, which assists operating room nurses by suggesting the current progress of the procedure and expected surgical instruments. This paper proposes a method for recognizing the current phase of orthopaedic procedures from surgeon-wearable video camera images. The method plays the fundamental role of CAOS-AI navigation system. The proposed method is based on a convolutional-long short-term memory (LSTM) network. We also investigate the efficient CNN model in some competitive models such as VGG16, DenseNet, and ResNet to improve the recognition accuracy. Experimental results in unicomapartmenatal knee arthroplasty (UKA) surgeries showed that the proposed method achieved a phase recognition accuracy with 48.2%, 41.2%, and 53.6% using VGG16, DenseNet, and ResNet, respectively.","PeriodicalId":41579,"journal":{"name":"International Journal of Affective Engineering","volume":"1 1","pages":""},"PeriodicalIF":0.4000,"publicationDate":"2020-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Affective Engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.5057/ijae.ijae-d-19-00018","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"ENGINEERING, INDUSTRIAL","Score":null,"Total":0}
Citations: 1

Abstract

Orthopaedic surgical procedures are complicated and involve many kinds of equipment, which places a heavy burden on the operating room nurses who deliver surgical instruments to the surgeon. This study aims to develop a computer-aided orthopaedic surgery (CAOS)-AI navigation system that assists operating room nurses by indicating the current progress of the procedure and the surgical instruments expected next. This paper proposes a method for recognizing the current phase of an orthopaedic procedure from images captured by a video camera worn by the surgeon; the method plays a fundamental role in the CAOS-AI navigation system. The proposed method is based on a convolutional-long short-term memory (LSTM) network. We also investigate which CNN backbone among competitive models such as VGG16, DenseNet, and ResNet is most effective for improving recognition accuracy. Experimental results on unicompartmental knee arthroplasty (UKA) surgeries showed that the proposed method achieved phase recognition accuracies of 48.2%, 41.2%, and 53.6% using VGG16, DenseNet, and ResNet, respectively.
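The abstract outlines a convolutional-LSTM pipeline in which a CNN backbone (VGG16, DenseNet, or ResNet) extracts per-frame features from the wearable camera video and an LSTM models their temporal ordering to predict the current surgical phase. The following is a minimal sketch of such a pipeline, assuming PyTorch and torchvision; the ResNet-18 backbone, the number of surgical phases, and the hidden size are illustrative assumptions, not the authors' configuration.

```python
# Minimal CNN-LSTM phase-recognition sketch (not the authors' code).
# Assumptions: PyTorch + torchvision; ResNet-18 backbone; 8 placeholder
# surgical phases; 256-dim LSTM hidden state.
import torch
import torch.nn as nn
from torchvision import models


class CNNLSTMPhaseClassifier(nn.Module):
    def __init__(self, num_phases: int = 8, hidden_size: int = 256):
        super().__init__()
        # Pretrained ResNet-18 as the per-frame feature extractor;
        # its final fully connected layer is replaced with identity.
        backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        feat_dim = backbone.fc.in_features
        backbone.fc = nn.Identity()
        self.backbone = backbone
        # LSTM aggregates the per-frame features over the video clip.
        self.lstm = nn.LSTM(feat_dim, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, num_phases)

    def forward(self, clip: torch.Tensor) -> torch.Tensor:
        # clip shape: (batch, time, channels, height, width)
        b, t, c, h, w = clip.shape
        feats = self.backbone(clip.view(b * t, c, h, w)).view(b, t, -1)
        out, _ = self.lstm(feats)
        # Classify the phase from the last time step of the sequence.
        return self.head(out[:, -1])


if __name__ == "__main__":
    model = CNNLSTMPhaseClassifier()
    dummy_clip = torch.randn(2, 16, 3, 224, 224)  # 2 clips of 16 frames
    logits = model(dummy_clip)
    print(logits.shape)  # torch.Size([2, 8])
```

In this sketch the backbone is swappable: replacing ResNet-18 with VGG16 or DenseNet (and adjusting the feature dimension accordingly) mirrors the backbone comparison reported in the abstract.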
Source journal: International Journal of Affective Engineering
Self-citation rate: 33.30%
Articles published: 18