Graphical User Interface for Medical Deep Learning - Application to Magnetic Resonance Imaging

Sebastian Milde, Annika Liebgott, Ziwei Wu, Wenyi Feng, Jiahuan Yang, Lukas Mauch, P. Martirosian, F. Bamberg, K. Nikolaou, S. Gatidis, F. Schick, Bin Yang, Thomas Kustner

2018 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC), November 2018
DOI: 10.23919/APSIPA.2018.8659515
Citations: 1
Abstract
In clinical diagnostics, magnetic resonance imaging (MRI) is a valuable and versatile tool. The acquisition process is, however, susceptible to image distortions (artifacts) which may degrade image quality. Automated, reference-free localization and quantification of artifacts using convolutional neural networks (CNNs) is a promising approach to early artifact detection. Training, however, relies on a large amount of expert-labeled data, and labeling is a time-consuming process. Previous studies were based on global labels, i.e. a whole volume was labeled as artifact-free or artifact-affected. Artifact appearance, however, is rather localized. We propose a local labeling approach conducted via a graphical user interface (GUI). Moreover, the GUI provides easy handling of data viewing, preprocessing (labeling, patching, data augmentation), network parametrization and training, data and network evaluation, as well as deep visualization of the learned network content. The GUI is not limited to these features and will be extended in the future. The developed GUI is made publicly available and features a modular design targeting different machine learning and deep learning applications, such as artifact detection, classification and segmentation.
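To illustrate the local-labeling idea the abstract contrasts with global labels, a minimal NumPy sketch is shown below. This is not the authors' published code: the function name, patch size, and the use of a binary artifact mask as the source of local labels are illustrative assumptions. Each patch inherits a label from the region it covers, rather than the whole volume receiving a single label.

```python
import numpy as np

def extract_labeled_patches(volume, artifact_mask, patch_size=32, stride=32):
    """Slide a window over a 2D slice and label each patch locally.

    A patch is labeled 1 (artifact-affected) if any voxel inside it is
    marked in the artifact mask, otherwise 0 (artifact-free). This is
    the local alternative to assigning one global label to the whole
    volume. Hypothetical helper for illustration only.
    """
    patches, labels = [], []
    h, w = volume.shape
    for y in range(0, h - patch_size + 1, stride):
        for x in range(0, w - patch_size + 1, stride):
            patch = volume[y:y + patch_size, x:x + patch_size]
            # Local label: does this patch overlap any marked artifact voxel?
            label = int(artifact_mask[y:y + patch_size, x:x + patch_size].any())
            patches.append(patch)
            labels.append(label)
    return np.stack(patches), np.array(labels)
```

For example, a 64x64 slice with an artifact marked only in the top-left quadrant yields four 32x32 patches, of which only the first is labeled artifact-affected; under global labeling, all four would inherit the positive label.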