A Deep Convolutional Neural Network with Selection Units for Super-Resolution

Jae-Seok Choi, Munchurl Kim
{"title":"A Deep Convolutional Neural Network with Selection Units for Super-Resolution","authors":"Jae-Seok Choi, Munchurl Kim","doi":"10.1109/CVPRW.2017.153","DOIUrl":null,"url":null,"abstract":"Rectified linear units (ReLU) are known to be effective in many deep learning methods. Inspired by linear-mapping technique used in other super-resolution (SR) methods, we reinterpret ReLU into point-wise multiplication of an identity mapping and a switch, and finally present a novel nonlinear unit, called a selection unit (SU). While conventional ReLU has no direct control through which data is passed, the proposed SU optimizes this on-off switching control, and is therefore capable of better handling nonlinearity functionality than ReLU in a more flexible way. Our proposed deep network with SUs, called SelNet, was top-5th ranked in NTIRE2017 Challenge, which has a much lower computation complexity compared to the top-4 entries. Further experiment results show that our proposed SelNet outperforms our baseline only with ReLU (without SUs), and other state-of-the-art deep-learning-based SR methods.","PeriodicalId":6668,"journal":{"name":"2017 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)","volume":"37 4 1","pages":"1150-1156"},"PeriodicalIF":0.0000,"publicationDate":"2017-07-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"103","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CVPRW.2017.153","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 103

Abstract

Rectified linear units (ReLU) are known to be effective in many deep learning methods. Inspired by the linear-mapping technique used in other super-resolution (SR) methods, we reinterpret ReLU as the point-wise multiplication of an identity mapping and a switch, and present a novel nonlinear unit, called a selection unit (SU). While a conventional ReLU offers no direct control over which data is passed, the proposed SU learns this on-off switching control and can therefore handle nonlinearity more flexibly than ReLU. Our proposed deep network with SUs, called SelNet, ranked fifth in the NTIRE2017 Challenge while having much lower computational complexity than the top-4 entries. Further experimental results show that SelNet outperforms both our ReLU-only baseline (without SUs) and other state-of-the-art deep-learning-based SR methods.
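In other words, ReLU(x) = x · 1[x > 0] is an identity mapping gated by a hard, non-learnable switch; the SU replaces that hard switch with a learnable, differentiable one. Below is a minimal PyTorch sketch of an SU, assuming the selection module is composed of a ReLU, a 1×1 convolution, and a sigmoid as described in the paper; the channel count is illustrative.

```python
import torch
import torch.nn as nn

class SelectionUnit(nn.Module):
    """Gates an identity mapping with a learned on-off switch.

    ReLU can be read as x * 1[x > 0]: an identity mapping multiplied by a
    hard switch. The SU replaces that hard switch with a learnable,
    differentiable gate taking values in (0, 1).
    """
    def __init__(self, channels: int):
        super().__init__()
        # Selection module: produces the per-element switch values.
        self.selection = nn.Sequential(
            nn.ReLU(),                                     # non-negativity, as in ReLU
            nn.Conv2d(channels, channels, kernel_size=1),  # learnable 1x1 mixing
            nn.Sigmoid(),                                  # squash to (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Point-wise multiplication of the identity mapping and the switch.
        return x * self.selection(x)

# Example: apply an SU to a 64-channel feature map.
su = SelectionUnit(channels=64)
out = su(torch.randn(1, 64, 32, 32))
print(out.shape)  # torch.Size([1, 64, 32, 32])
```

Because the sigmoid gate is differentiable, the switching behavior is optimized jointly with the rest of the network, unlike the fixed thresholding of ReLU.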