Image upsampling via texture hallucination

Yoav HaCohen, Raanan Fattal, Dani Lischinski
{"title":"Image upsampling via texture hallucination","authors":"Yoav HaCohen, Raanan Fattal, Dani Lischinski","doi":"10.1109/ICCPHOT.2010.5585097","DOIUrl":null,"url":null,"abstract":"Image upsampling is a common yet challenging task, since it is severely underconstrained. While considerable progress was made in preserving the sharpness of salient edges, current methods fail to reproduce the fine detail typically present in the textured regions bounded by these edges, resulting in unrealistic appearance. In this paper we address this fundamental shortcoming by integrating higher-level image analysis and custom low-level image synthesis. Our approach extends and refines the patch-based image model of Freeman et al. [10] and interprets the image as a tiling of distinct textures, each of which is matched to an example in a database of relevant textures. The matching is not done at the patch level, but rather collectively, over entire segments. Following this model fitting stage, which requires some user guidance, a higher-resolution image is synthesized using a hybrid approach that incorporates principles from example-based texture synthesis. We show that for images that comply with our model, our method is able to reintroduce consistent fine-scale detail, resulting in enhanced appearance textured regions.","PeriodicalId":248821,"journal":{"name":"2010 IEEE International Conference on Computational Photography (ICCP)","volume":"10 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2010-03-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"80","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2010 IEEE International Conference on Computational Photography (ICCP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCPHOT.2010.5585097","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 80

Abstract

Image upsampling is a common yet challenging task, since it is severely underconstrained. While considerable progress has been made in preserving the sharpness of salient edges, current methods fail to reproduce the fine detail typically present in the textured regions bounded by these edges, resulting in unrealistic appearance. In this paper we address this fundamental shortcoming by integrating higher-level image analysis and custom low-level image synthesis. Our approach extends and refines the patch-based image model of Freeman et al. [10] and interprets the image as a tiling of distinct textures, each of which is matched to an example in a database of relevant textures. The matching is not done at the patch level, but rather collectively, over entire segments. Following this model fitting stage, which requires some user guidance, a higher-resolution image is synthesized using a hybrid approach that incorporates principles from example-based texture synthesis. We show that for images that comply with our model, our method is able to reintroduce consistent fine-scale detail, resulting in enhanced appearance of textured regions.
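The abstract describes a two-stage pipeline: fit a texture model to the input (segments matched collectively against a database of example textures, with some user guidance), then synthesize high-resolution detail with an example-based approach that builds on the Freeman et al. patch model. The sketch below illustrates only that underlying patch-level idea, not the authors' method: it predicts a high-frequency detail layer for an interpolated image by nearest-neighbour lookup into a database of (smoothed patch, detail patch) example pairs. Function names such as `hallucinate_detail`, the database layout, and the smooth/detail decomposition are illustrative assumptions.

```python
import numpy as np

def extract_patches(img, size=5, stride=1):
    """Collect overlapping size x size patches from a 2-D grayscale array.
    Can be used to build the smooth/detail example databases from a
    decomposition of the example textures."""
    patches = []
    for y in range(0, img.shape[0] - size + 1, stride):
        for x in range(0, img.shape[1] - size + 1, stride):
            patches.append(img[y:y + size, x:x + size])
    return np.stack(patches)

def hallucinate_detail(upsampled, smooth_db, detail_db, size=5):
    """Predict a high-frequency detail layer for `upsampled`, the input
    image interpolated (e.g. bicubically) to the target resolution.

    smooth_db : (N, size, size) smoothed patches from example textures
    detail_db : (N, size, size) the corresponding high-frequency patches

    For every patch of `upsampled`, the nearest smoothed example patch
    (L2 distance) is found and its detail patch is pasted; overlapping
    contributions are averaged. Adding the returned layer to `upsampled`
    gives the detail-hallucinated result.
    """
    detail = np.zeros(upsampled.shape, dtype=np.float64)
    weight = np.zeros(upsampled.shape, dtype=np.float64)
    flat_db = smooth_db.reshape(len(smooth_db), -1)
    for y in range(upsampled.shape[0] - size + 1):
        for x in range(upsampled.shape[1] - size + 1):
            query = upsampled[y:y + size, x:x + size].ravel()
            idx = np.argmin(((flat_db - query) ** 2).sum(axis=1))
            detail[y:y + size, x:x + size] += detail_db[idx]
            weight[y:y + size, x:x + size] += 1.0
    return detail / np.maximum(weight, 1.0)  # avoid division by zero at uncovered pixels
```

The paper departs from this per-patch scheme in the two ways stated in the abstract: matching is performed collectively over entire segments against whole example textures rather than patch by patch, and the synthesis stage is a hybrid that incorporates principles from example-based texture synthesis instead of independent nearest-neighbour pasting.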