{"title":"NUC-Fuse: Multimodal medical image fusion using nuclear norm & classification of brain tumors using ARBFN","authors":"Shihabudeen H. , Rajeesh J.","doi":"10.1016/j.ibmed.2024.100181","DOIUrl":null,"url":null,"abstract":"<div><div>Medical imaging has been widely used to diagnose diseases over the past two decades. The lack of information in this field makes it difficult for medical experts to diagnose diseases with a single modality. The combination of image fusion techniques enables the integration of pictures depicting various tissues and disorders from multiple medical imaging devices, facilitating enhanced research and treatment by providing complementary information through multimodal medical imaging fusion. The proposed work employs the nuclear norm and residual connections to combine the complementary features from both CT and MRI imaging approaches. The autoencoder eventually creates a merged image. The fused pictures are categorized as benign or malignant in the following phase using the present Radial Basis Function Network (RBFN). The performance measures, such as Mutual Information, Structural Similarity Index Measure, <span><math><msub><mrow><mi>Q</mi></mrow><mrow><mi>w</mi></mrow></msub></math></span>, and <span><math><msub><mrow><mi>Q</mi></mrow><mrow><mi>e</mi></mrow></msub></math></span>, have shown improved values, specifically 4.6328, 0.6492, 0.8300, and 0.8185 respectively, when compared with different fusion methods. Additionally, the classification algorithm yields 97% accuracy, 89% precision, and 92% recall when combined with the proposed fusion algorithm.</div></div>","PeriodicalId":73399,"journal":{"name":"Intelligence-based medicine","volume":"10 ","pages":"Article 100181"},"PeriodicalIF":0.0000,"publicationDate":"2024-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Intelligence-based medicine","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2666521224000486","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Medical imaging has been widely used to diagnose diseases over the past two decades. The limited information provided by any single modality makes it difficult for medical experts to reach a diagnosis. Image fusion techniques integrate images depicting various tissues and disorders from multiple medical imaging devices, providing complementary information through multimodal medical image fusion and thereby supporting research and treatment. The proposed work employs the nuclear norm and residual connections to combine complementary features from CT and MRI, and an autoencoder then reconstructs the merged image. In the subsequent phase, the fused images are classified as benign or malignant using the proposed Radial Basis Function Network (RBFN). Performance measures such as Mutual Information, Structural Similarity Index Measure, Q_w, and Q_e show improved values of 4.6328, 0.6492, 0.8300, and 0.8185, respectively, when compared with other fusion methods. Additionally, the classification algorithm yields 97% accuracy, 89% precision, and 92% recall when combined with the proposed fusion algorithm.
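
The abstract does not spell out how the nuclear norm drives the fusion, so the following is a minimal sketch of one plausible scheme: local patches of the CT and MRI feature maps are weighted by their nuclear norms (sum of singular values) and averaged. The patch size, the simple weighted average, and the function names are assumptions for illustration, not the paper's NUC-Fuse implementation, which also uses residual connections and an autoencoder decoder.

```python
# Hypothetical nuclear-norm weighting for fusing two feature maps.
# Patch size and the plain weighted average are illustrative assumptions.
import numpy as np

def nuclear_norm_weights(feat_a, feat_b, patch=16):
    """Per-pixel fusion weights from the nuclear norm of local patches."""
    h, w = feat_a.shape
    weight_a = np.zeros((h, w))
    weight_b = np.zeros((h, w))
    for i in range(0, h, patch):
        for j in range(0, w, patch):
            pa = feat_a[i:i + patch, j:j + patch]
            pb = feat_b[i:i + patch, j:j + patch]
            na = np.linalg.norm(pa, ord='nuc')   # sum of singular values
            nb = np.linalg.norm(pb, ord='nuc')
            total = na + nb + 1e-12
            weight_a[i:i + patch, j:j + patch] = na / total
            weight_b[i:i + patch, j:j + patch] = nb / total
    return weight_a, weight_b

def fuse(feat_ct, feat_mri, patch=16):
    """Weighted average of CT and MRI feature maps using nuclear-norm weights."""
    wa, wb = nuclear_norm_weights(feat_ct, feat_mri, patch)
    return wa * feat_ct + wb * feat_mri

# Toy usage with random stand-ins for encoder feature maps
ct = np.random.rand(128, 128)
mri = np.random.rand(128, 128)
fused = fuse(ct, mri)
```

In the paper the fused features would be passed to the autoencoder's decoder to produce the merged image; here the weighted average simply stands in for that step.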
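
For the classification stage, a radial basis function network can be sketched as k-means centres followed by Gaussian activations and a least-squares output layer. The class name, the k-means initialisation, and the fixed gamma are assumptions; the paper's ARBFN adapts its parameters, and the feature extraction from the fused image is omitted here.

```python
# Hypothetical RBFN sketch: k-means centres + least-squares output weights.
# Inputs are assumed to be feature vectors extracted from the fused images.
import numpy as np
from sklearn.cluster import KMeans

class SimpleRBFN:
    def __init__(self, n_centers=10, gamma=1.0):
        self.n_centers = n_centers
        self.gamma = gamma

    def _phi(self, X):
        # Gaussian radial basis activations w.r.t. the learned centres
        d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-self.gamma * d2)

    def fit(self, X, y):
        self.centers = KMeans(n_clusters=self.n_centers, n_init=10).fit(X).cluster_centers_
        Phi = self._phi(X)
        # Linear output weights by least squares on 0/1 labels
        self.w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
        return self

    def predict(self, X):
        # Threshold the linear output: 0 = benign, 1 = malignant
        return (self._phi(X) @ self.w > 0.5).astype(int)

# Toy usage with random feature vectors and labels
X = np.random.rand(200, 32)
y = np.random.randint(0, 2, 200)
clf = SimpleRBFN(n_centers=16, gamma=2.0).fit(X, y)
print(clf.predict(X[:5]))
```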
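
Two of the reported fusion metrics, Mutual Information and SSIM, can be computed between a source image and the fused result as sketched below; Q_w and Q_e (Piella-style weighted and edge quality indices) are not shown. The histogram bin count and data range are assumptions, and the paper's exact metric configuration is not given in the abstract.

```python
# Hypothetical computation of MI and SSIM for a source/fused image pair.
import numpy as np
from skimage.metrics import structural_similarity
from sklearn.metrics import mutual_info_score

def fusion_mi(src, fused, bins=256):
    """Mutual information from a joint histogram of pixel intensities."""
    hist, _, _ = np.histogram2d(src.ravel(), fused.ravel(), bins=bins)
    return mutual_info_score(None, None, contingency=hist)

src = np.random.rand(128, 128)
fused = np.random.rand(128, 128)
print("MI  :", fusion_mi(src, fused))
print("SSIM:", structural_similarity(src, fused, data_range=1.0))
```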