Comparison of Performance by Activation Functions on Deep Image Prior
Shohei Fujii, H. Hayashi
2019 International Conference on Artificial Intelligence in Information and Communication (ICAIIC), February 2019
DOI: 10.1109/ICAIIC.2019.8669063
Citations: 3
Abstract
In this paper, we compare the performance of activation functions on a deep image prior. The activation functions considered here are the standard rectified linear unit (ReLU), the leaky rectified linear unit (Leaky ReLU), and the randomized leaky rectified linear unit (RReLU). We use these functions for denoising, super-resolution, and inpainting with the deep image prior. Our aim is to observe how the choice of activation function affects the results.
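The three activation functions compared in the abstract differ only in how they treat negative inputs. As a minimal sketch (not the paper's implementation), they can be written in NumPy as follows; the slope bounds for RReLU (1/8 to 1/3) follow the commonly used defaults and are an assumption here, not values taken from the paper:

```python
import numpy as np

def relu(x):
    # Standard ReLU: zero for negative inputs, identity otherwise.
    return np.maximum(0.0, x)

def leaky_relu(x, negative_slope=0.01):
    # Leaky ReLU: a small fixed slope on the negative side.
    return np.where(x > 0, x, negative_slope * x)

def rrelu(x, lower=1/8, upper=1/3, training=True, rng=None):
    # RReLU: at training time the negative slope is sampled uniformly
    # per element from [lower, upper]; at test time the fixed mean
    # slope (lower + upper) / 2 is used instead.
    if training:
        rng = rng if rng is not None else np.random.default_rng()
        a = rng.uniform(lower, upper, size=np.shape(x))
    else:
        a = (lower + upper) / 2.0
    return np.where(x > 0, x, a * x)
```

In a deep image prior setting, these would replace the nonlinearity after each convolution in the encoder-decoder network; the randomized negative slope of RReLU adds noise during optimization, which is the behavioral difference the comparison probes.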