Journal of Medical Imaging: Latest Publications

Special Section Guest Editorial: Introduction to the JMI Special Section on Augmented and Virtual Reality in Medical Imaging.
IF 1.9
Journal of Medical Imaging Pub Date: 2024-11-01 Epub Date: 2024-12-27 DOI: 10.1117/1.JMI.11.6.062601
Ryan Beams, Raj Shekhar
Abstract: The editorial introduces the JMI Special Section on Augmented and Virtual Reality in Medical Imaging.
Journal of Medical Imaging, 11(6): 062601. Open access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11671691/pdf/
Citations: 0
SCC-NET: segmentation of clinical cancer image for head and neck squamous cell carcinoma.
IF 1.9
Journal of Medical Imaging Pub Date: 2024-11-01 Epub Date: 2024-11-21 DOI: 10.1117/1.JMI.11.6.065501
Chien-Yu Huang, Cheng-Che Tsai, Lisa Alice Hwang, Bor-Hwang Kang, Yaoh-Shiang Lin, Hsing-Hao Su, Guan-Ting Shen, Jun-Wei Hsieh
Purpose: Squamous cell carcinoma (SCC) accounts for 90% of head and neck cancer. The majority of cases can be diagnosed and even treated with endoscopic examination and surgery. Deep learning models have been adopted for various medical endoscopy exams; however, few reports exist on deep learning algorithms for segmenting head and neck SCC.
Approach: Head and neck SCC pre-treatment endoscopic images acquired during 2016-2020 were collected from the Kaohsiung Veterans General Hospital Department of Otolaryngology-Head and Neck Surgery. We present a new modification of the neural architecture search U-Net-based model, called SCC-Net, for segmenting the enrolled endoscopic photos. The modification includes a technique called "Learnable Discrete Wavelet Pooling," which combines the outputs of different layers using a channel attention module and assigns weights based on their importance in the information flow, and incorporates the cross-stage-partial design from CSPNet. Performance was compared with eight state-of-the-art image segmentation models.
Results: A total of 556 pathologically confirmed SCC photos were collected. SCC-Net achieves a high mean intersection over union (mIoU) of 87.2%, accuracy of 97.17%, and recall of 97.15%. Compared with the eight state-of-the-art segmentation networks, SCC-Net performed best in mIoU, Dice similarity coefficient, accuracy, and recall.
Conclusions: The proposed SCC-Net architecture successfully segments lesions from white-light endoscopic images with promising accuracy, with a single model performing well across all upper aerodigestive tract sites.
Journal of Medical Imaging, 11(6): 065501. Open access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11579920/pdf/
Citations: 0
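The SCC-Net entry above reports overlap metrics (mIoU and Dice similarity coefficient). As a point of reference only, and not the authors' evaluation code, a minimal NumPy sketch of how these metrics are typically computed from binary masks might look like the following; the array names and random test masks are illustrative assumptions.

```python
import numpy as np

def overlap_metrics(pred: np.ndarray, target: np.ndarray, eps: float = 1e-7):
    """Compute IoU and Dice for a single binary lesion mask.

    pred, target: arrays of identical shape (nonzero = lesion pixel).
    eps avoids division by zero when both masks are empty.
    """
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    iou = (intersection + eps) / (union + eps)
    dice = (2 * intersection + eps) / (pred.sum() + target.sum() + eps)
    return iou, dice

# Illustrative usage with random masks; the mean IoU over a test set is
# simply the average of the per-image (or per-class) IoU values.
rng = np.random.default_rng(0)
preds = rng.random((5, 256, 256)) > 0.5
targets = rng.random((5, 256, 256)) > 0.5
ious, dices = zip(*(overlap_metrics(p, t) for p, t in zip(preds, targets)))
print(f"mIoU = {np.mean(ious):.3f}, mean Dice = {np.mean(dices):.3f}")
```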
Utilization of double contrast-enhancement boost for lower-extremity CT angiography.
IF 1.9
Journal of Medical Imaging Pub Date: 2024-11-01 Epub Date: 2024-12-04 DOI: 10.1117/1.JMI.11.6.067001
Chuluunbaatar Otgonbaatar, Jae-Kyun Ryu, Won Beom Jung, Seon Woong Jang, Sungjun Hwang, Taehyung Kim, Hackjoon Shim, Jung Wook Seo
Purpose: We aimed to compare the efficacy of the double contrast-enhancement (CE)-boost technique with that of conventional methods for improving vascular contrast attenuation in lower-extremity computed tomography (CT) angiography.
Approach: This retrospective study enrolled 45 patients (mean age, 70 years; range, 26 to 90 years; 30 males). To generate the CE-boost image, the degree of contrast enhancement was determined by subtraction of the pre-contrast and post-contrast CT images. The double CE-boost technique applies this contrast-enhancement process twice. Both objective assessments (CT attenuation, noise level, signal-to-noise ratio [SNR], contrast-to-noise ratio [CNR], and image sharpness) and subjective quality evaluations were conducted on three types of images (conventional, CE-boost, and double CE-boost).
Results: Double CE-boost images demonstrated significantly reduced noise in Hounsfield units (HU) compared with conventional and CE-boost images (p < 0.001). CT attenuation values (HU) were substantially higher at all measured locations of the lower extremity with double CE-boost images (834.49 ± 140.73) than with conventional (399.63 ± 62.01) and CE-boost images (572.66 ± 93.61). SNR and CNR were notably improved in the double CE-boost images compared with both conventional and CE-boost images. Image sharpness of the popliteal artery (p = 0.828), anterior tibial artery (p = 0.671), and dorsalis pedis artery (p = 0.281) was consistent across conventional, CE-boost, and double CE-boost images. Subjective image analysis indicated superior ratings for the double CE-boost compared with the other types.
Conclusions: The double CE-boost technique improves image quality by decreasing image noise, increasing CT attenuation, and improving SNR, CNR, and subjective assessment compared with CE-boost and conventional imaging.
Journal of Medical Imaging, 11(6): 067001. Open access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11614588/pdf/
Citations: 0
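The CE-boost entry above describes deriving an enhancement map by subtraction of pre- and post-contrast scans and applying the process twice. The paper's exact reconstruction pipeline is not given in this listing, so the NumPy sketch below only illustrates the general subtraction-and-add-back idea under the common convention that enhancement equals post-contrast minus pre-contrast on co-registered volumes; the function names, array names, and the Gaussian smoothing step are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def ce_boost(pre_hu: np.ndarray, post_hu: np.ndarray, sigma: float = 1.0) -> np.ndarray:
    """Add a denoised iodine enhancement map back onto the post-contrast scan.

    pre_hu, post_hu: co-registered CT volumes in Hounsfield units.
    sigma: Gaussian smoothing of the subtraction map to limit noise amplification.
    """
    enhancement = post_hu - pre_hu                # iodine-related enhancement map
    enhancement = gaussian_filter(enhancement, sigma)
    return post_hu + enhancement                  # attenuation raised where iodine is present

def double_ce_boost(pre_hu: np.ndarray, post_hu: np.ndarray, sigma: float = 1.0) -> np.ndarray:
    """In this sketch, the "double" boost simply repeats the add-back step."""
    boosted = ce_boost(pre_hu, post_hu, sigma)
    return boosted + gaussian_filter(post_hu - pre_hu, sigma)
```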
Augmented reality for point-of-care ultrasound-guided vascular access in pediatric patients using Microsoft HoloLens 2: a preliminary evaluation.
IF 1.9
Journal of Medical Imaging Pub Date: 2024-11-01 Epub Date: 2024-09-13 DOI: 10.1117/1.JMI.11.6.062604
Gesiren Zhang, Trong N Nguyen, Hadi Fooladi-Talari, Tyler Salvador, Kia Thomas, Daragh Crowley, R Scott Dingeman, Raj Shekhar
Significance: Conventional ultrasound-guided vascular access procedures are challenging due to the need for anatomical understanding, precise needle manipulation, and hand-eye coordination. Recently, augmented reality (AR)-based guidance has emerged as an aid to improve procedural efficiency and potential outcomes. However, its application in pediatric vascular access has not been comprehensively evaluated.
Aim: We developed an AR ultrasound application, HoloUS, using the Microsoft HoloLens 2 to display live ultrasound images directly in the proceduralist's field of view. We present our evaluation of the effect of using the Microsoft HoloLens 2 for point-of-care ultrasound (POCUS)-guided vascular access in 30 pediatric patients.
Approach: A custom software module was developed on a tablet capable of capturing the moving ultrasound image from any ultrasound machine's screen. The captured image was compressed and sent to the HoloLens 2 via a hotspot without needing Internet access. On the HoloLens 2, we developed a custom software module to receive, decompress, and display the live ultrasound image. Hand-gesture and voice-command features were implemented for the user to reposition, resize, and change the gain and contrast of the image. We evaluated 30 cases (15 successful control and 12 successful interventional) completed in a single-center, prospective, randomized study.
Results: The mean overall rendering latency and rendering frame rate of the HoloUS application were 139.30 ms (σ = 32.02 ms) and 30 frames per second, respectively. The average procedure completion time was 17.3% shorter using AR guidance. The numbers of puncture attempts and needle redirections were similar between the two groups, and the number of head adjustments was minimal in the interventional group.
Conclusion: We present our evaluation of results from the first study using the Microsoft HoloLens 2 to investigate AR-based POCUS-guided vascular access in pediatric patients. Our evaluation confirmed clinical feasibility and a potential improvement in procedural efficiency.
Journal of Medical Imaging, 11(6): 062604. Open access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11393663/pdf/
Citations: 0
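The HoloUS entry above describes a capture-compress-send pipeline from a tablet to the HoloLens 2 over a local hotspot. The authors' software is not available in this listing, so the snippet below is only a generic illustration of that kind of loop, using OpenCV JPEG compression over a TCP socket; the headset address, capture source, quality setting, and length-prefixed framing are all hypothetical choices, not the HoloUS protocol.

```python
import socket
import struct
import cv2  # OpenCV for frame grabbing and JPEG compression

HEADSET_ADDR = ("192.168.137.2", 5555)   # hypothetical hotspot address of the receiver
JPEG_QUALITY = 70                        # trade image quality for latency

def stream_frames(capture_index: int = 0) -> None:
    """Grab frames (e.g., from a capture device showing the ultrasound screen),
    JPEG-compress them, and push length-prefixed packets to the receiver."""
    cap = cv2.VideoCapture(capture_index)
    with socket.create_connection(HEADSET_ADDR) as sock:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            ok, buf = cv2.imencode(".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, JPEG_QUALITY])
            if not ok:
                continue
            payload = buf.tobytes()
            sock.sendall(struct.pack("!I", len(payload)) + payload)  # 4-byte length header
    cap.release()

if __name__ == "__main__":
    stream_frames()
```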
Super-resolution multi-contrast unbiased eye atlases with deep probabilistic refinement.
IF 1.9
Journal of Medical Imaging Pub Date: 2024-11-01 Epub Date: 2024-11-14 DOI: 10.1117/1.JMI.11.6.064004
Ho Hin Lee, Adam M Saunders, Michael E Kim, Samuel W Remedios, Lucas W Remedios, Yucheng Tang, Qi Yang, Xin Yu, Shunxing Bao, Chloe Cho, Louise A Mawn, Tonia S Rex, Kevin L Schey, Blake E Dewey, Jeffrey M Spraggins, Jerry L Prince, Yuankai Huo, Bennett A Landman
Purpose: Eye morphology varies significantly across the population, especially for the orbit and optic nerve. These variations limit the feasibility and robustness of generalizing population-wise features of eye organs to an unbiased spatial reference.
Approach: To tackle these limitations, we propose a process for creating high-resolution unbiased eye atlases. First, to restore spatial details from scans with low through-plane resolution relative to the in-plane resolution, we apply a deep learning-based super-resolution algorithm. Then, we generate an initial unbiased reference with an iterative metric-based registration using a small portion of subject scans. We register the remaining scans to this template and refine the template using an unsupervised deep probabilistic approach that generates a more expansive deformation field to enhance organ boundary alignment. We demonstrate this framework using magnetic resonance images across four different tissue contrasts, generating four atlases in separate spatial alignments.
Results: When refining the template with sufficient subjects, we find a significant improvement (Wilcoxon signed-rank test) in the average Dice score across four labeled regions compared with a standard registration framework consisting of rigid, affine, and deformable transformations. These results highlight the effective alignment of eye organs and boundaries achieved by the proposed process.
Conclusions: By combining super-resolution preprocessing and deep probabilistic models, we address the challenge of generating an eye atlas to serve as a standardized reference across a largely variable population.
Journal of Medical Imaging, 11(6): 064004. Open access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11561295/pdf/
Citations: 0
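The eye-atlas entry above compares per-subject Dice scores from the refined atlas against a standard registration baseline with a Wilcoxon signed-rank test. A minimal SciPy sketch of that kind of paired, non-parametric comparison is shown below; the Dice arrays are fabricated placeholders, not the study's data.

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical paired Dice scores per subject (same subjects, two pipelines).
dice_baseline = np.array([0.71, 0.68, 0.74, 0.70, 0.66, 0.73, 0.69, 0.72])
dice_refined  = np.array([0.78, 0.75, 0.80, 0.76, 0.71, 0.79, 0.77, 0.78])

# Wilcoxon signed-rank test on the paired differences (non-parametric).
stat, p_value = wilcoxon(dice_refined, dice_baseline)
print(f"median improvement = {np.median(dice_refined - dice_baseline):.3f}, p = {p_value:.4f}")
```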
Advanced soft tissue visualization in conjunction with bone structures using contrast-enhanced micro-CT.
IF 1.9
Journal of Medical Imaging Pub Date: 2024-11-01 Epub Date: 2024-11-22 DOI: 10.1117/1.JMI.11.6.066001
Torben Hildebrand, Qianli Ma, Catherine A Heyward, Håvard J Haugen, Liebert P Nogueira
Purpose: Micro-computed tomography (micro-CT) analysis of soft tissues alongside bone remains challenging due to significant differences in X-ray absorption, preventing spatial inspection of bone remodeling, including the cellular intricacies of mineralized tissues in developmental biology and pathology. The goal was to develop a protocol for contrast-enhanced micro-CT imaging that effectively visualizes soft tissues and cells in conjunction with bone while minimizing bone attenuation through decalcification.
Approach: Murine femur samples were decalcified in ethylenediaminetetraacetic acid and treated with three different contrast agents: (i) iodine in ethanol, (ii) phosphotungstic acid in water, and (iii) Lugol's iodine. Micro-CT scans were performed on a laboratory SkyScan 1172 setup and at the synchrotron radiation for medical physics beamline of the Elettra synchrotron radiation facility. Soft- and hard-tissue contrast-to-noise ratio (CNR) and contrast efficiency after decalcification were measured.
Results: In laboratory micro-CT, Lugol's iodine demonstrated a threefold higher CNR in the bone marrow, representing the soft-tissue portion, compared with the bone. Contrast efficiencies measured in synchrotron micro-CT were consistent with these findings. Higher resolutions and the specificity of Lugol's iodine for cellular structures enabled detailed visualization of bone-forming cells in the epiphyseal plate.
Conclusions: The combination of decalcification and the contrast agent Lugol's iodine enabled enhanced soft tissue visualization in conjunction with bone.
Journal of Medical Imaging, 11(6): 066001. Open access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11584031/pdf/
Citations: 0
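The micro-CT entry above quantifies soft- and hard-tissue contrast with a contrast-to-noise ratio (CNR). The exact ROI definitions are not given in this listing, so the sketch below only shows a common CNR formulation from two regions of interest and a background noise estimate; the ROI choices and synthetic gray values are assumptions.

```python
import numpy as np

def cnr(roi_a: np.ndarray, roi_b: np.ndarray, background: np.ndarray) -> float:
    """Contrast-to-noise ratio between two tissue ROIs.

    A common definition: |mean(A) - mean(B)| divided by the standard
    deviation of a homogeneous background region.
    """
    return abs(roi_a.mean() - roi_b.mean()) / background.std()

# Illustrative usage with synthetic gray values (e.g., bone marrow vs. bone).
rng = np.random.default_rng(1)
marrow = rng.normal(120, 10, size=1000)
bone = rng.normal(60, 10, size=1000)
air = rng.normal(0, 8, size=1000)
print(f"CNR(marrow vs bone) = {cnr(marrow, bone, air):.2f}")
```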
Reflecting on a Year of Growth and Opportunity.
IF 1.9
Journal of Medical Imaging Pub Date: 2024-11-01 Epub Date: 2024-12-20 DOI: 10.1117/1.JMI.11.6.060101
Bennett Landman
Abstract: The editorial reflects on the past year, highlighting impactful research and discussing challenges.
Journal of Medical Imaging, 11(6): 060101. Open access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11660685/pdf/
Citations: 0
Applications of mixed reality with medical imaging for training and clinical practice.
IF 1.9
Journal of Medical Imaging Pub Date: 2024-11-01 Epub Date: 2024-12-26 DOI: 10.1117/1.JMI.11.6.062608
Alexa R Lauinger, Meagan McNicholas, Matthew Bramlet, Maria Bederson, Bradley P Sutton, Caroline G L Cao, Irfan S Ahmad, Carlos Brown, Shandra Jamison, Sarita Adve, John Vozenilek, Jim Rehg, Mark S Cohen
Purpose: This review summarizes the current use of extended reality (XR), including virtual reality (VR), mixed reality, and augmented reality (AR), in the medical field, ranging from medical imaging to training to preoperative planning. It covers the integration of these technologies into clinical practice and medical training, and it discusses the challenges and future opportunities in this area, with the aim of encouraging more physicians to collaborate on integrating medicine and technology.
Approach: The review was written by experts in the field based on their knowledge and on recent publications exploring extended realities in medicine.
Results: XR, including VR, mixed reality, and AR, is increasingly utilized within surgery, both for preoperative planning and for intraoperative procedures. These technologies are also promising means for improved education at every level of physician training. However, barriers to widespread adoption remain, including human factors, technological challenges, and regulatory issues.
Conclusions: Based on current use, adoption of VR, mixed reality, and AR is likely to continue to grow over the next decade. To support the development and integration of XR into medicine, academic groups should collaborate with industry and regulatory agencies; such joint projects will help address current limitations and mutually benefit both fields.
Journal of Medical Imaging, 11(6): 062608. Open access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11669596/pdf/
Citations: 0
Augmented and virtual reality imaging for collaborative planning of structural cardiovascular interventions: a proof-of-concept and validation study.
IF 1.9
Journal of Medical Imaging Pub Date: 2024-11-01 Epub Date: 2024-10-08 DOI: 10.1117/1.JMI.11.6.062606
Xander Jacquemyn, Kobe Bamps, Ruben Moermans, Christophe Dubois, Filip Rega, Peter Verbrugghe, Barbara Weyn, Steven Dymarkowski, Werner Budts, Alexander Van De Bruaene
Purpose: Virtual reality (VR) and augmented reality (AR) have led to significant advancements in cardiac preoperative planning. However, a noticeable gap exists in the availability of a comprehensive multi-user, multi-device mixed reality application that can be used in a multidisciplinary team meeting.
Approach: A multi-user, multi-device mixed reality application was developed, supporting both AR and VR implementations. Technical validation involved a standardized testing protocol and comparison of AR and VR measurements in terms of absolute error and time. Preclinical validation engaged experts in interventional cardiology, who evaluated clinical applicability prior to clinical validation. Clinical validation included patient-specific measurements for five patients in VR compared with standard computed tomography (CT) for preoperative planning. Questionnaires were used at all stages for subjective evaluation.
Results: Technical validation, including 106 size measurements, demonstrated an absolute median error of 0.69 mm (0.25 to 1.18 mm) compared with ground truth. The time to complete the entire task was 892 ± 407 s on average, with VR measurements being faster than AR (804 ± 483 s versus 957 ± 257 s, P = 0.045). In the clinical validation of five preoperative patients, there was no statistically significant difference between paired CT and VR measurements (0.58 [95% CI, -1.58 to 2.74], P = 0.586). Questionnaires showed unanimous agreement on the application's user-friendliness, effectiveness, and clinical value.
Conclusions: The mixed reality application, validated through technical, preclinical, and clinical assessments, demonstrates precision and user-friendliness. Further research is needed to validate its generalizability and impact on patient outcomes.
Journal of Medical Imaging, 11(6): 062606. Open access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11460359/pdf/
Citations: 0
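The clinical validation above compares paired CT and VR size measurements and reports a mean difference with a 95% CI and a p-value. The listing does not state which paired test was used, so the sketch below only illustrates a generic paired comparison (mean difference, t-based confidence interval, paired t-test) on placeholder numbers, not the study's analysis.

```python
import numpy as np
from scipy import stats

# Hypothetical paired measurements (mm) of the same landmarks on CT and in VR.
ct = np.array([23.1, 31.4, 27.8, 19.6, 25.2, 29.9])
vr = np.array([23.6, 31.0, 28.9, 20.1, 25.8, 30.4])

diff = vr - ct
mean_diff = diff.mean()
sem = stats.sem(diff)                          # standard error of the mean difference
ci_low, ci_high = stats.t.interval(0.95, df=diff.size - 1, loc=mean_diff, scale=sem)
t_stat, p_value = stats.ttest_rel(vr, ct)      # paired t-test

print(f"mean difference = {mean_diff:.2f} mm "
      f"(95% CI {ci_low:.2f} to {ci_high:.2f}), p = {p_value:.3f}")
```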
Polarimetry terahertz imaging of human breast cancer surgical specimens.
IF 1.9
Journal of Medical Imaging Pub Date: 2024-11-01 Epub Date: 2024-12-05 DOI: 10.1117/1.JMI.11.6.065503
Nikita Gurjar, Keith Bailey, Magda El-Shenawee
Purpose: We investigate terahertz (THz) polarimetric imaging of seven human breast cancer surgical specimens. The goal is to enhance image contrast between adjacent tissue types (cancer, healthy collagen, and fat) in excised breast tumors. Based on the biological observation that cancer grows irregularly and invades surrounding healthy breast tissue, we hypothesize that cancerous cells interact with the THz electric field differently from healthy cells, and that this difference is best captured using multiple polarizations rather than a single polarization.
Approach: Time-domain pulsed signals are experimentally collected from each pixel of the specimen in horizontal-horizontal, vertical-horizontal, vertical-vertical, and horizontal-vertical polarizations. The time-domain pulses are transformed to the frequency domain to obtain the power spectra and 16 Mueller matrix images. Whole-slide pathology imaging was used to interpret and label all images.
Results: The cross- and co-polarization power spectrum images demonstrated a strong dependence on tissue orientation with respect to the emitted and detected electric fields. At a 130-deg rotation angle of the scanned samples, the detector showed the strongest reflected signal in cross-polarization. Furthermore, the Mueller matrix images consistently demonstrated patterns in fresh and block tissues, confirming the differentiation between tissue types in breast tumor specimens.
Conclusions: THz polarimetric imaging shows potential for improving image contrast in excised tumor tissues compared with single-polarization imaging. Cross-polarization signals demonstrated smaller amplitudes than co-polarized signals; however, averaging the signal during measurements substantially improved the images. In post-processing, averaging the frequency-domain images and the Mueller matrix elements over frequency led to better image contrast. Some patterns in the Mueller matrix images were difficult to interpret, indicating the need for further investigation of the Mueller matrix and its physiological interpretation in breast tumor tissues.
Journal of Medical Imaging, 11(6): 065503. Open access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11619717/pdf/
Citations: 0
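The THz polarimetry entry above forms Mueller matrix images from HH, HV, VH, and VV polarization measurements. Recovering the full 4x4 Mueller matrix generally requires additional polarization states, so the sketch below only illustrates how the upper-left 2x2 block is commonly obtained (up to a constant factor) from the four linear co-/cross-polarized intensity images; the normalization convention, array names, and synthetic data are assumptions, not the authors' processing chain.

```python
import numpy as np

def mueller_2x2_block(i_hh, i_hv, i_vh, i_vv):
    """Recover the m00, m01, m10, m11 Mueller elements (up to a constant factor)
    from intensity images measured with horizontal/vertical generator and
    analyzer states. First index of I_ga = generator, second = analyzer.
    """
    m00 = i_hh + i_hv + i_vh + i_vv
    m01 = i_hh + i_hv - i_vh - i_vv   # dependence on generator orientation
    m10 = i_hh - i_hv + i_vh - i_vv   # dependence on analyzer orientation
    m11 = i_hh - i_hv - i_vh + i_vv
    return m00, m01, m10, m11

# Illustrative usage with synthetic frequency-domain power images.
rng = np.random.default_rng(2)
imgs = {k: rng.random((64, 64)) for k in ("hh", "hv", "vh", "vv")}
m00, m01, m10, m11 = mueller_2x2_block(imgs["hh"], imgs["hv"], imgs["vh"], imgs["vv"])
print(m00.shape, float(m11.mean()))
```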