Brian C. Lee, Ayushi Sinha, N. Varble, W. Pritchard, J. Karanian, B. Wood, T. Bydlon
{"title":"肺ct透视登记中实时c臂姿态估计的呼吸补偿神经网络","authors":"Brian C. Lee, Ayushi Sinha, N. Varble, W. Pritchard, J. Karanian, B. Wood, T. Bydlon","doi":"10.1109/ISBI52829.2022.9761705","DOIUrl":null,"url":null,"abstract":"Augmentation of interventional c-arm fluoroscopy using information extracted from pre-operative imaging has the potential to reduce procedure times and improve patient outcomes in minimally invasive peripheral lung procedures, where breathing motion, small airways, and anatomical variation create a challenging environment for planned pathway navigation. Extraction of the rigid c-arm pose relative to preoperative images is a crucial prerequisite; however, accurate 2D-3D fluoroscopy-CT soft tissue registration in the presence of natural deformable patient motion remains challenging. We propose to train a patient-specific neural network on synthetic fluoroscopy derived from the patient’s pre-operative CT, augmented by a generalized breathing motion model, to predict c-arm pose. Our model includes an image supervision path that infers the x-ray projection geometry, providing training stability across patients. We train our model on synthetic fluoroscopy generated from preclinical swine CT and we evaluate on synthetic and real fluoroscopy.","PeriodicalId":6827,"journal":{"name":"2022 IEEE 19th International Symposium on Biomedical Imaging (ISBI)","volume":"25 1","pages":"1-5"},"PeriodicalIF":0.0000,"publicationDate":"2022-03-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Breathing-Compensated Neural Networks for Real Time C-Arm Pose Estimation in Lung CT-Fluoroscopy Registration\",\"authors\":\"Brian C. Lee, Ayushi Sinha, N. Varble, W. Pritchard, J. Karanian, B. Wood, T. Bydlon\",\"doi\":\"10.1109/ISBI52829.2022.9761705\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Augmentation of interventional c-arm fluoroscopy using information extracted from pre-operative imaging has the potential to reduce procedure times and improve patient outcomes in minimally invasive peripheral lung procedures, where breathing motion, small airways, and anatomical variation create a challenging environment for planned pathway navigation. Extraction of the rigid c-arm pose relative to preoperative images is a crucial prerequisite; however, accurate 2D-3D fluoroscopy-CT soft tissue registration in the presence of natural deformable patient motion remains challenging. We propose to train a patient-specific neural network on synthetic fluoroscopy derived from the patient’s pre-operative CT, augmented by a generalized breathing motion model, to predict c-arm pose. Our model includes an image supervision path that infers the x-ray projection geometry, providing training stability across patients. 
We train our model on synthetic fluoroscopy generated from preclinical swine CT and we evaluate on synthetic and real fluoroscopy.\",\"PeriodicalId\":6827,\"journal\":{\"name\":\"2022 IEEE 19th International Symposium on Biomedical Imaging (ISBI)\",\"volume\":\"25 1\",\"pages\":\"1-5\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-03-28\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 IEEE 19th International Symposium on Biomedical Imaging (ISBI)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ISBI52829.2022.9761705\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE 19th International Symposium on Biomedical Imaging (ISBI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISBI52829.2022.9761705","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Breathing-Compensated Neural Networks for Real Time C-Arm Pose Estimation in Lung CT-Fluoroscopy Registration
Augmentation of interventional c-arm fluoroscopy using information extracted from pre-operative imaging has the potential to reduce procedure times and improve patient outcomes in minimally invasive peripheral lung procedures, where breathing motion, small airways, and anatomical variation create a challenging environment for planned pathway navigation. Extraction of the rigid c-arm pose relative to pre-operative images is a crucial prerequisite; however, accurate 2D-3D fluoroscopy-CT soft tissue registration in the presence of natural deformable patient motion remains challenging. We propose to train a patient-specific neural network on synthetic fluoroscopy derived from the patient's pre-operative CT, augmented by a generalized breathing motion model, to predict c-arm pose. Our model includes an image supervision path that infers the x-ray projection geometry, providing training stability across patients. We train our model on synthetic fluoroscopy generated from preclinical swine CT and evaluate it on both synthetic and real fluoroscopy.
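The approach described above (regressing a rigid 6-DoF c-arm pose from a single synthetic fluoroscopy frame, with an auxiliary image-supervision path) can be illustrated with a minimal sketch. The PyTorch code below is not the authors' implementation: the network layout, layer sizes, the low-resolution reconstruction target standing in for the projection-geometry supervision, and the loss weighting `alpha` are all illustrative assumptions.

```python
# Hypothetical sketch of a patient-specific c-arm pose regressor.
# Input: a single synthetic fluoroscopy (DRR) frame rendered from the
# patient's pre-operative CT; output: 6-DoF rigid c-arm pose plus an
# auxiliary coarse projection reconstruction used only as a training signal.
import torch
import torch.nn as nn


class CArmPoseNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Shared convolutional encoder over the 2D fluoroscopy frame.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),          # -> 64 x 4 x 4 feature map
        )
        # Pose head: 3 rotation + 3 translation parameters.
        self.pose_head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 4 * 4, 256), nn.ReLU(),
            nn.Linear(256, 6),
        )
        # Auxiliary head: reconstructs a coarse (16x16) projection image,
        # a stand-in for the paper's image-supervision path.
        self.recon_head = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 1, 4, stride=2, padding=1),
        )

    def forward(self, x):
        feats = self.encoder(x)
        return self.pose_head(feats), self.recon_head(feats)


def train_step(model, optimizer, frames, gt_pose, target_proj, alpha=0.1):
    """One step on synthetic fluoroscopy with known poses.

    Breathing-motion augmentation (deforming the CT before DRR rendering)
    is assumed to happen upstream in the data pipeline and is not shown.
    `target_proj` is a 16x16 downsampled projection image (assumption).
    """
    pred_pose, recon = model(frames)
    loss = nn.functional.mse_loss(pred_pose, gt_pose)
    loss = loss + alpha * nn.functional.mse_loss(recon, target_proj)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

As a usage note under the same assumptions, `frames` would be a batch of 1-channel DRRs and `gt_pose` the ground-truth pose parameters recorded when each DRR was rendered; at inference time only the pose head is used, so the auxiliary reconstruction adds no runtime cost.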