Bing-Yuan Huang, Jie Li, Yu-Chen Lin, Yuan-Hsiang Lin
{"title":"一种用于腹腔镜手术训练的基于图像的器械空间定位器","authors":"Bing-Yuan Huang, Jie Li, Yu-Chen Lin, Yuan-Hsiang Lin","doi":"10.1109/ICSSE.2016.7551615","DOIUrl":null,"url":null,"abstract":"In the past few years, laparoscopic surgery has been widely performed in abdomen or pelvis surgery. It is a minimally invasive surgery (MIS) technique to conduct the operation only through small incisions. The advantages of MIS include smaller incisions, less bleeding and shorter recovery time. However, surgeons can only perform laparoscopic surgery through the video from a laparoscope and surgical instruments. This may increase the complication and difficulties while fulfilling MIS. Therefore, before conducting the real operation, surgeons are required to receive multiple related trainings to improve their surgical skills, especially in stabilization and hand-eye coordination. These skills can be estimated by analyzing the paths of the surgical instruments while simulating the operation processes. Besides, the locating information of surgical instruments can be utilized for simulating virtual reality training applications. Herein, the objective of this paper is to develop an image-based instruments locator. The system comprises a stereo camera, two surgical instruments with passive markers, and a FPGA development board. The instant location of the markers can be determined based on proposed image processing framework. 
The errors of the provided locating method can reach to 0.21±0.17 cm in X axis, 0.10±0.10 cm in Y axis, and 0.29±0.22 cm in Z axis.","PeriodicalId":175283,"journal":{"name":"2016 International Conference on System Science and Engineering (ICSSE)","volume":"30 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-07-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"An image-based instruments space locator for laparoscopic surgery training\",\"authors\":\"Bing-Yuan Huang, Jie Li, Yu-Chen Lin, Yuan-Hsiang Lin\",\"doi\":\"10.1109/ICSSE.2016.7551615\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In the past few years, laparoscopic surgery has been widely performed in abdomen or pelvis surgery. It is a minimally invasive surgery (MIS) technique to conduct the operation only through small incisions. The advantages of MIS include smaller incisions, less bleeding and shorter recovery time. However, surgeons can only perform laparoscopic surgery through the video from a laparoscope and surgical instruments. This may increase the complication and difficulties while fulfilling MIS. Therefore, before conducting the real operation, surgeons are required to receive multiple related trainings to improve their surgical skills, especially in stabilization and hand-eye coordination. These skills can be estimated by analyzing the paths of the surgical instruments while simulating the operation processes. Besides, the locating information of surgical instruments can be utilized for simulating virtual reality training applications. Herein, the objective of this paper is to develop an image-based instruments locator. The system comprises a stereo camera, two surgical instruments with passive markers, and a FPGA development board. The instant location of the markers can be determined based on proposed image processing framework. 
The errors of the provided locating method can reach to 0.21±0.17 cm in X axis, 0.10±0.10 cm in Y axis, and 0.29±0.22 cm in Z axis.\",\"PeriodicalId\":175283,\"journal\":{\"name\":\"2016 International Conference on System Science and Engineering (ICSSE)\",\"volume\":\"30 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2016-07-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2016 International Conference on System Science and Engineering (ICSSE)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICSSE.2016.7551615\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 International Conference on System Science and Engineering (ICSSE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICSSE.2016.7551615","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
An image-based instruments space locator for laparoscopic surgery training
In the past few years, laparoscopic surgery has been widely performed in abdominal and pelvic procedures. It is a minimally invasive surgery (MIS) technique in which the operation is conducted only through small incisions. The advantages of MIS include smaller incisions, less bleeding, and shorter recovery time. However, surgeons must perform laparoscopic surgery guided only by the video feed from a laparoscope while manipulating the surgical instruments, which may increase the complications and difficulty of performing MIS. Therefore, before conducting a real operation, surgeons are required to receive extensive related training to improve their surgical skills, especially in stabilization and hand-eye coordination. These skills can be assessed by analyzing the paths of the surgical instruments while simulating the operation processes. In addition, the location information of the surgical instruments can be used in virtual reality training applications. The objective of this paper is therefore to develop an image-based instrument locator. The system comprises a stereo camera, two surgical instruments with passive markers, and an FPGA development board. The instantaneous location of the markers can be determined with the proposed image processing framework. The errors of the proposed locating method are 0.21±0.17 cm along the X axis, 0.10±0.10 cm along the Y axis, and 0.29±0.22 cm along the Z axis.
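The abstract does not detail the image processing framework, but the core of any stereo-camera marker locator is recovering a 3D position from the marker's pixel coordinates in the two views. The sketch below shows standard disparity-based triangulation for a rectified stereo pair; it is a generic illustration under assumed camera parameters, not the authors' FPGA implementation, and every numeric value (focal length, baseline, principal point) is hypothetical.

```python
# Minimal stereo triangulation sketch for a tracked passive marker.
# Assumes a rectified stereo pair with identical intrinsics, so the
# marker appears on the same image row in both views and depth follows
# directly from horizontal disparity: Z = f * B / d.

def triangulate_marker(u_left, v_left, u_right,
                       focal_px, baseline_cm, cx, cy):
    """Return the marker's (X, Y, Z) position in cm, in the left
    camera's frame, from its pixel coordinates in both images.

    u_left, v_left : marker pixel position in the left image
    u_right        : marker column in the right image (same row)
    focal_px       : focal length in pixels
    baseline_cm    : distance between the two camera centers, in cm
    cx, cy         : principal point (image center) in pixels
    """
    disparity = u_left - u_right          # horizontal pixel shift
    if disparity <= 0:
        raise ValueError("marker must have positive disparity")
    z = focal_px * baseline_cm / disparity    # depth from disparity
    x = (u_left - cx) * z / focal_px          # back-project column to X
    y = (v_left - cy) * z / focal_px          # back-project row to Y
    return x, y, z


# Illustrative call with made-up parameters: an 800 px focal length,
# 6 cm baseline, and a 640x480 image centered at (320, 240).
pos = triangulate_marker(400, 240, 352, 800.0, 6.0, 320.0, 240.0)
```

With these assumed numbers the disparity is 48 px, giving a depth of 800 × 6 / 48 = 100 cm; a real system would calibrate the intrinsics and baseline rather than assume them, and the paper's reported sub-centimeter axis errors reflect exactly how such calibration and marker-detection noise propagate through this back-projection.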