{"title":"通过融合视觉和惯性测量数据,使结构从运动中变形","authors":"S. Giannarou, Zhiqiang Zhang, Guang-Zhong Yang","doi":"10.1109/IROS.2012.6385671","DOIUrl":null,"url":null,"abstract":"Accurate recovery of the 3D structure of a deforming surgical environment during minimally invasive surgery is important for intra-operative guidance. One key component of reliable reconstruction is accurate camera pose estimation, which is challenging for monocular cameras due to the paucity of reliable salient features, coupled with narrow baseline during surgical navigation. With recent advances in miniaturized MEMS sensors, the combination of inertial and vision sensing can provide increased robustness for camera pose estimation particularly for scenes involving tissue deformation. The aim of this work is to propose a robust framework for intra-operative free-form deformation recovery based on structure-from-motion. A novel adaptive Unscented Kalman Filter (UKF) parameterization scheme is proposed to fuse vision information with data from an Inertial Measurement Unit (IMU). The method is built on a compact scene representation scheme suitable for both surgical episode identification and instrument-tissue motion modelling. Detailed validation with both synthetic and phantom data is performed and results derived justify the potential clinical value of the technique.","PeriodicalId":6358,"journal":{"name":"2012 IEEE/RSJ International Conference on Intelligent Robots and Systems","volume":"13 1","pages":"4816-4821"},"PeriodicalIF":0.0000,"publicationDate":"2012-12-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"14","resultStr":"{\"title\":\"Deformable structure from motion by fusing visual and inertial measurement data\",\"authors\":\"S. Giannarou, Zhiqiang Zhang, Guang-Zhong Yang\",\"doi\":\"10.1109/IROS.2012.6385671\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Accurate recovery of the 3D structure of a deforming surgical environment during minimally invasive surgery is important for intra-operative guidance. One key component of reliable reconstruction is accurate camera pose estimation, which is challenging for monocular cameras due to the paucity of reliable salient features, coupled with narrow baseline during surgical navigation. With recent advances in miniaturized MEMS sensors, the combination of inertial and vision sensing can provide increased robustness for camera pose estimation particularly for scenes involving tissue deformation. The aim of this work is to propose a robust framework for intra-operative free-form deformation recovery based on structure-from-motion. A novel adaptive Unscented Kalman Filter (UKF) parameterization scheme is proposed to fuse vision information with data from an Inertial Measurement Unit (IMU). The method is built on a compact scene representation scheme suitable for both surgical episode identification and instrument-tissue motion modelling. 
Detailed validation with both synthetic and phantom data is performed and results derived justify the potential clinical value of the technique.\",\"PeriodicalId\":6358,\"journal\":{\"name\":\"2012 IEEE/RSJ International Conference on Intelligent Robots and Systems\",\"volume\":\"13 1\",\"pages\":\"4816-4821\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2012-12-24\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"14\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2012 IEEE/RSJ International Conference on Intelligent Robots and Systems\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IROS.2012.6385671\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2012 IEEE/RSJ International Conference on Intelligent Robots and Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IROS.2012.6385671","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Deformable structure from motion by fusing visual and inertial measurement data
Accurate recovery of the 3D structure of a deforming surgical environment during minimally invasive surgery is important for intra-operative guidance. One key component of reliable reconstruction is accurate camera pose estimation, which is challenging for monocular cameras due to the paucity of reliable salient features and the narrow baseline during surgical navigation. With recent advances in miniaturized MEMS sensors, the combination of inertial and vision sensing can provide increased robustness for camera pose estimation, particularly in scenes involving tissue deformation. The aim of this work is to propose a robust framework for intra-operative free-form deformation recovery based on structure-from-motion. A novel adaptive Unscented Kalman Filter (UKF) parameterization scheme is proposed to fuse vision information with data from an Inertial Measurement Unit (IMU). The method is built on a compact scene representation scheme suitable for both surgical episode identification and instrument-tissue motion modelling. Detailed validation with both synthetic and phantom data is performed, and the results demonstrate the potential clinical value of the technique.
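The fusion step described in the abstract centres on an Unscented Kalman Filter. The sketch below is a minimal, self-contained illustration of UKF-based vision/IMU fusion, not the authors' adaptive parameterization: it assumes a simplified 6-dimensional state (3D position and velocity), a constant-acceleration IMU-driven motion model, a position-only visual measurement, and fixed, illustrative noise covariances.

```python
# Minimal UKF sketch for fusing IMU-driven prediction with visual position
# measurements. All state dimensions, noise levels, and the acceleration input
# are illustrative assumptions; the paper's adaptive parameterization and full
# pose/deformation state are not reproduced here.
import numpy as np

class SimpleUKF:
    def __init__(self, x0, P0, Q, R, alpha=1e-3, beta=2.0, kappa=0.0):
        self.x, self.P, self.Q, self.R = x0, P0, Q, R
        self.n = x0.size
        self.lam = alpha**2 * (self.n + kappa) - self.n
        # Standard unscented-transform weights for mean and covariance.
        self.Wm = np.full(2 * self.n + 1, 1.0 / (2 * (self.n + self.lam)))
        self.Wc = self.Wm.copy()
        self.Wm[0] = self.lam / (self.n + self.lam)
        self.Wc[0] = self.Wm[0] + (1 - alpha**2 + beta)

    def _sigma_points(self):
        # Symmetric sigma points around the current mean.
        S = np.linalg.cholesky((self.n + self.lam) * self.P)
        pts = [self.x]
        for i in range(self.n):
            pts.append(self.x + S[:, i])
            pts.append(self.x - S[:, i])
        return np.array(pts)

    def predict(self, f):
        # Propagate sigma points through the (possibly nonlinear) motion model f.
        X = np.array([f(p) for p in self._sigma_points()])
        self.x = self.Wm @ X
        d = X - self.x
        self.P = d.T @ np.diag(self.Wc) @ d + self.Q

    def update(self, z, h):
        # Propagate sigma points through the measurement model h and correct.
        X = self._sigma_points()
        Z = np.array([h(p) for p in X])
        z_pred = self.Wm @ Z
        dz, dx = Z - z_pred, X - self.x
        S = dz.T @ np.diag(self.Wc) @ dz + self.R
        C = dx.T @ np.diag(self.Wc) @ dz
        K = C @ np.linalg.inv(S)
        self.x = self.x + K @ (z - z_pred)
        self.P = self.P - K @ S @ K.T

# Constant-velocity motion driven by an (assumed) IMU acceleration reading,
# corrected by a vision-based 3D position measurement.
dt = 0.01
def imu_motion(x, acc=np.array([0.0, 0.0, -0.1])):
    pos, vel = x[:3], x[3:]
    return np.concatenate([pos + vel * dt + 0.5 * acc * dt**2, vel + acc * dt])

def vision_measurement(x):
    return x[:3]  # the visual pipeline is assumed to observe position only

ukf = SimpleUKF(x0=np.zeros(6), P0=np.eye(6) * 0.1,
                Q=np.eye(6) * 1e-4, R=np.eye(3) * 1e-2)
ukf.predict(imu_motion)
ukf.update(np.array([0.01, 0.0, -0.001]), vision_measurement)
print(ukf.x)
```

The unscented transform propagates a small set of sigma points through the motion and measurement models, avoiding the explicit Jacobians an Extended Kalman Filter would require; this is one reason a UKF is attractive when the camera and IMU models are strongly nonlinear.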