{"title":"沉浸式增强现实系统在超声引导干预期间协助针头定位","authors":"P. Kanithi, J. Chatterjee, D. Sheet","doi":"10.1145/3009977.3010023","DOIUrl":null,"url":null,"abstract":"Ultrasound (US) guided intervention is a surgical procedure where the clinician makes use of imaging in realtime, to track the position of the needle, and correct its trajectory for accurately steering it to the lesion of interest. However, the needle is visible in the US image, only when aligned in-plane with the scanning plane of the US probe. In practice, clinicians often use a mechanical needle guide, thus restricting their available degrees of freedom in the US probe movement. Alternatively, during free-hand procedure, they use multiple needle punctures to achieve this in-plane positioning. Our present work details an augmented reality (AR) system for patient comfort centric aid to needle intervention through an overlaid visualization of the needle trajectory on the US frame prior to its insertion. This is implemented by continuous visual tracking of the US probe and the needle in 3D world coordinate system using fiducial markers. The tracked marker positions are used to draw the needle trajectory and tip visualized in realtime to augment on the US feed. Subsequently, the continuously tracked US probe and needle, and the navigation assistance information, would be overlaid with the visual feed from a head mounted display (HMD) for generating totally immersive AR experience for the clinician.","PeriodicalId":93806,"journal":{"name":"Proceedings. Indian Conference on Computer Vision, Graphics & Image Processing","volume":"72 1","pages":"65:1-65:8"},"PeriodicalIF":0.0000,"publicationDate":"2016-12-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":"{\"title\":\"Immersive augmented reality system for assisting needle positioning during ultrasound guided intervention\",\"authors\":\"P. Kanithi, J. Chatterjee, D. Sheet\",\"doi\":\"10.1145/3009977.3010023\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Ultrasound (US) guided intervention is a surgical procedure where the clinician makes use of imaging in realtime, to track the position of the needle, and correct its trajectory for accurately steering it to the lesion of interest. However, the needle is visible in the US image, only when aligned in-plane with the scanning plane of the US probe. In practice, clinicians often use a mechanical needle guide, thus restricting their available degrees of freedom in the US probe movement. Alternatively, during free-hand procedure, they use multiple needle punctures to achieve this in-plane positioning. Our present work details an augmented reality (AR) system for patient comfort centric aid to needle intervention through an overlaid visualization of the needle trajectory on the US frame prior to its insertion. This is implemented by continuous visual tracking of the US probe and the needle in 3D world coordinate system using fiducial markers. The tracked marker positions are used to draw the needle trajectory and tip visualized in realtime to augment on the US feed. Subsequently, the continuously tracked US probe and needle, and the navigation assistance information, would be overlaid with the visual feed from a head mounted display (HMD) for generating totally immersive AR experience for the clinician.\",\"PeriodicalId\":93806,\"journal\":{\"name\":\"Proceedings. 
Indian Conference on Computer Vision, Graphics & Image Processing\",\"volume\":\"72 1\",\"pages\":\"65:1-65:8\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2016-12-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"9\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings. Indian Conference on Computer Vision, Graphics & Image Processing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3009977.3010023\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings. Indian Conference on Computer Vision, Graphics & Image Processing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3009977.3010023","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Immersive augmented reality system for assisting needle positioning during ultrasound guided intervention
Ultrasound (US) guided intervention is a surgical procedure in which the clinician uses real-time imaging to track the position of the needle and correct its trajectory, accurately steering it to the lesion of interest. However, the needle is visible in the US image only when it is aligned in-plane with the scanning plane of the US probe. In practice, clinicians often use a mechanical needle guide, which restricts the available degrees of freedom of US probe movement. Alternatively, during free-hand procedures, they resort to multiple needle punctures to achieve this in-plane positioning. Our present work details an augmented reality (AR) system that provides patient-comfort-centric aid to needle intervention through an overlaid visualization of the needle trajectory on the US frame prior to insertion. This is implemented by continuous visual tracking of the US probe and the needle in a 3D world coordinate system using fiducial markers. The tracked marker positions are used to draw the needle trajectory and tip, visualized in real time as an augmentation of the US feed. Subsequently, the continuously tracked US probe and needle, together with the navigation assistance information, would be overlaid on the visual feed from a head-mounted display (HMD) to generate a fully immersive AR experience for the clinician.
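To make the fiducial-based tracking step concrete, the following is a minimal sketch of how a needle-mounted marker could be tracked in the camera frame and its extrapolated trajectory projected onto the video feed. It uses OpenCV's legacy ArUco API purely as an illustrative stand-in; the paper does not specify the tracking library, and the marker dictionary, marker size, needle length, and camera calibration values below are hypothetical placeholders, not the authors' implementation.

```python
# Sketch (not the authors' system): track one ArUco fiducial fixed to the
# needle hub and overlay the projected needle line on a camera frame.
import cv2
import numpy as np

# Hypothetical intrinsics from a prior camera-calibration step.
camera_matrix = np.array([[800.0,   0.0, 320.0],
                          [  0.0, 800.0, 240.0],
                          [  0.0,   0.0,   1.0]])
dist_coeffs = np.zeros(5)

MARKER_LENGTH_M = 0.03   # assumed 3 cm square fiducial on the needle hub
NEEDLE_LENGTH_M = 0.10   # assumed 10 cm needle along the marker's +z axis
aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

def overlay_needle_trajectory(frame):
    """Detect the needle marker and draw the extrapolated needle line/tip."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict)
    if ids is None:
        return frame  # no marker visible in this frame

    rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
        corners, MARKER_LENGTH_M, camera_matrix, dist_coeffs)

    # Needle hub (marker origin) and tip expressed in the marker's own frame.
    needle_pts = np.float32([[0.0, 0.0, 0.0],
                             [0.0, 0.0, NEEDLE_LENGTH_M]])
    for rvec, tvec in zip(rvecs, tvecs):
        img_pts, _ = cv2.projectPoints(needle_pts, rvec, tvec,
                                       camera_matrix, dist_coeffs)
        hub, tip = img_pts.reshape(-1, 2).astype(int)
        cv2.line(frame, tuple(hub), tuple(tip), (0, 255, 0), 2)   # trajectory
        cv2.circle(frame, tuple(tip), 5, (0, 0, 255), -1)         # needle tip
    return frame
```

In the full system described in the abstract, an analogous pose estimate for the probe-mounted marker would relate this camera-frame geometry to the US scanning plane, so the drawn trajectory can be rendered on the US image rather than only on the camera view.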