{"title":"Enhancing Image Clarity with the Combined Use of REDNet and Attention Channel Module","authors":"Rico Halim, Gede Putra Kusuma","doi":"10.12785/ijcds/160117","DOIUrl":null,"url":null,"abstract":": The primary aim of our study is to improve the e ffi cacy of image denoising, specifically in situations when there is a limited availability of data, such as the BSD68 dataset. Insu ffi cient data presents a challenge in achieving optimal outcomes due to the complexity involved in constructing models. In order to tackle this di ffi culty, we provide a method that incorporates Channel Attention, Batch Normalization, and Dropout approaches into the current REDNet framework. Our investigation indicates enhancements in performance parameters, such as PSNR (Peak Signal to Noise Ratio) and SSIM (Structural Similarity Index), across various levels of noise. With a noise level of 15, we obtained a Peak Signal-to-Noise Ratio (PSNR) of 34.9858 dB and a Structural Similarity Index (SSIM) of 0.9371. At a noise level of 25, our tests yielded a PSNR of 31.7886 decibels and an SSIM of 0.8876. In addition, at a noise level of 50, we achieved a Peak Signal-to-Noise Ratio (PSNR) of 27.9063 decibels and a Structural Similarity Index (SSIM) of 0.7754. The incorporation of Channel Attention, Batch Normalization, and Dropout has been demonstrated to be a crucial element in enhancing the e ffi cacy of image denoising. The Channel Attention approach enables the model to choose and concentrate on crucial information inside the image, while Batch Normalization and Dropout techniques provide stability and mitigate overfitting issues throughout the training process. Our research highlights the e ff ectiveness of these three strategies and emphasizes their integration as a novel way to address the constraints presented by the scarcity of data in image denoising jobs. This emphasizes the significant potential in creating dependable and e ff ective image denoising methods when dealing with circumstances when there is a limited dataset.","PeriodicalId":37180,"journal":{"name":"International Journal of Computing and Digital Systems","volume":"59 52","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Computing and Digital Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.12785/ijcds/160117","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
The primary aim of our study is to improve the efficacy of image denoising, specifically in situations where the availability of data is limited, such as the BSD68 dataset. Insufficient data makes it difficult to achieve optimal outcomes because of the complexity involved in constructing models. To tackle this difficulty, we propose a method that incorporates Channel Attention, Batch Normalization, and Dropout into the existing REDNet framework. Our experiments show improvements in performance metrics, namely PSNR (Peak Signal-to-Noise Ratio) and SSIM (Structural Similarity Index), across various noise levels. At a noise level of 15, we obtained a PSNR of 34.9858 dB and an SSIM of 0.9371. At a noise level of 25, our tests yielded a PSNR of 31.7886 dB and an SSIM of 0.8876. At a noise level of 50, we achieved a PSNR of 27.9063 dB and an SSIM of 0.7754. The incorporation of Channel Attention, Batch Normalization, and Dropout proves to be a crucial element in enhancing the efficacy of image denoising. Channel Attention enables the model to select and concentrate on crucial information within the image, while Batch Normalization and Dropout provide stability and mitigate overfitting throughout the training process. Our research highlights the effectiveness of these three strategies and emphasizes their integration as a novel way to address the constraints imposed by data scarcity in image denoising tasks. This underscores the significant potential for creating dependable and effective image denoising methods when only a limited dataset is available.
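To make the architectural idea concrete, the sketch below shows one plausible way a squeeze-and-excitation style channel attention block, together with Batch Normalization and Dropout, could be attached to a REDNet-style convolutional layer. This is a minimal illustration and not the authors' released code: the channel count, reduction ratio, and dropout probability are illustrative assumptions, and the exact placement of the modules in the paper's network may differ.

```python
# Minimal sketch (assumed configuration, not the paper's exact implementation)
# of a channel-attention block with BatchNorm and Dropout in a REDNet-style layer.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (assumed variant)."""

    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)      # squeeze: global average per channel
        self.fc = nn.Sequential(                 # excitation: per-channel weights in (0, 1)
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w                              # re-weight feature channels


class AttentiveConvBlock(nn.Module):
    """One REDNet-style conv layer followed by BN, Dropout, and channel attention."""

    def __init__(self, channels: int = 64, p_drop: float = 0.1):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn = nn.BatchNorm2d(channels)        # stabilizes training
        self.act = nn.ReLU(inplace=True)
        self.drop = nn.Dropout2d(p_drop)          # mitigates overfitting on small datasets
        self.att = ChannelAttention(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.att(self.drop(self.act(self.bn(self.conv(x)))))


if __name__ == "__main__":
    block = AttentiveConvBlock()
    feat = torch.randn(1, 64, 64, 64)             # dummy feature map
    print(block(feat).shape)                      # torch.Size([1, 64, 64, 64])
```

The design intent matches the abstract: the attention branch learns per-channel weights so the network can emphasize informative feature channels, while BatchNorm and Dropout regularize training when only a small dataset such as BSD68 is available.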