2009 IEEE International Conference on Computational Photography (ICCP): Latest Publications

Light field superresolution
2009 IEEE International Conference on Computational Photography (ICCP) | Pub Date: 2009-04-16 | DOI: 10.1109/ICCPHOT.2009.5559010
Tom E. Bishop, Sara Zanetti, P. Favaro
Abstract: Light field cameras have been recently shown to be very effective in applications such as digital refocusing and 3D reconstruction. In a single snapshot these cameras provide a sample of the light field of a scene by trading off spatial resolution with angular resolution. Current methods produce images at a resolution that is much lower than that of traditional imaging devices. However, by explicitly modeling the image formation process and incorporating priors such as Lambertianity and texture statistics, these types of images can be reconstructed at a higher resolution. We formulate this method in a variational Bayesian framework and perform the reconstruction of both the surface of the scene and the (superresolved) light field. The method is demonstrated on both synthetic and real images captured with our light-field camera prototype.
Citations: 222
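As background to the spatial/angular trade-off the abstract mentions, a traditional plenoptic sensor interleaves the angular samples under each microlens. A minimal sketch of extracting the low-resolution sub-aperture views from such raw data (the function name and the ideal grid layout are illustrative assumptions, not the paper's code):

```python
import numpy as np

def extract_views(raw, m):
    """Rearrange traditional plenoptic raw data (microlenses focused at
    infinity) into its m x m sub-aperture views.

    Illustrates the sampling trade-off the paper starts from: each view
    keeps only 1/m of the sensor's linear spatial resolution.
    """
    # View (u, v) collects pixel (u, v) from under every microlens.
    return np.array([[raw[u::m, v::m] for v in range(m)]
                     for u in range(m)])
```

Superresolution methods such as the paper's then fuse these aliased views, rather than using any single one.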
Motion field and occlusion time estimation via alternate exposure flow
2009 IEEE International Conference on Computational Photography (ICCP) | Pub Date: 2009-04-16 | DOI: 10.1109/ICCPHOT.2009.5559005
A. Sellent, M. Eisemann, M. Magnor
Abstract: This paper presents an extension to optical flow-based motion estimation using alternating short- and long-exposed images. While traditional optical flow algorithms rely on consecutive short-exposed images only, long-exposed images capture motion directly in the form of motion blur. This additional information can be used to achieve more robust and accurate motion field estimation as well as to extract the moment of occlusion. We introduce an image formation model that relates the long-exposed image to its preceding and succeeding short-exposed images in terms of dense 2D motion and per-pixel occlusion/disocclusion timings. Based on this image formation model, we describe a practical algorithm to estimate the motion field not only for completely visible image regions but also for pixels becoming occluded. For these pixels the Alternate Exposure Flow (AEF) also determines the moment of occlusion. We describe the application of AEF in frame interpolation to demonstrate the advantage of the additional long exposure information.
Citations: 9
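The image formation model relates a long exposure to the motion of the scene during it: the blurred frame is the time-average of the sharp frame sliding along the flow. A minimal sketch of that forward model (constant global motion, integer circular shifts via np.roll, and the step count are simplifying assumptions; the paper estimates dense per-pixel flow plus occlusion timings):

```python
import numpy as np

def synthesize_long_exposure(img, flow_x, flow_y, steps=8):
    """Predict a long-exposed (motion-blurred) frame from a sharp short
    exposure and a constant 2D motion over the exposure interval.

    Sketch of the kind of forward model the paper inverts.
    """
    acc = np.zeros_like(img, dtype=np.float64)
    for s in range(steps):
        t = s / (steps - 1)  # normalized time within the exposure
        acc += np.roll(img, (round(t * flow_y), round(t * flow_x)),
                       axis=(0, 1))
    return acc / steps
```

For example, a single bright pixel moving 3 pixels horizontally smears into a 4-pixel streak whose values sum to the original intensity.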
Recovery limits in pointwise degradation
2009 IEEE International Conference on Computational Photography (ICCP) | Pub Date: 2009-04-16 | DOI: 10.1109/ICCPHOT.2009.5559011
T. Treibitz, Y. Schechner
Abstract: Pointwise image formation models appear in a variety of computational vision and photography problems. Prior studies aim to recover visibility or reflectance under the effects of specular or indirect reflections, additive scattering, radiance attenuation in haze and flash, etc. This work considers bounds to recovery from pointwise degradation. The analysis uses a physical model for the acquired signal and noise, and also accounts for potential post-acquisition noise filtering. Linear-systems analysis yields an effective cutoff-frequency, which is induced by noise, despite having no optical blur in the imaging model. We apply this analysis to hazy images. The result is a tool that assesses the ability to recover (within a desirable success rate) an object or feature having a certain size, distance from the camera, and radiance difference from its nearby background, per attenuation coefficient of the medium. The bounds rely on the camera specifications. The theory considers the pointwise degradation that exists in the scene during acquisition, which fundamentally limits recovery, even if the parameters of an algorithm are perfectly set.
Citations: 29
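The core intuition in the haze case can be sketched in a few lines. This is a simplified per-pixel detectability bound, not the paper's full frequency-domain analysis: with transmission t = exp(-beta*z), inverting the haze model I = J*t + A*(1 - t) divides the signal difference by t and amplifies sensor noise to sigma/t, so a radiance difference is detectable at k noise standard deviations only if it exceeds k*sigma/t (beta, sigma and k are illustrative inputs):

```python
import numpy as np

def min_recoverable_contrast(beta, distance, sigma, k=2.0):
    """Smallest object-to-background radiance difference that survives
    dehazing, in a simplified pointwise-degradation picture.

    Noise amplification by 1/t is what fundamentally limits recovery,
    regardless of how well the dehazing algorithm is tuned.
    """
    t = np.exp(-beta * np.asarray(distance, dtype=np.float64))
    return k * sigma / t
```

The threshold grows exponentially with distance, which matches the paper's conclusion that far objects become unrecoverable however the algorithm parameters are set.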
Enhancing and experiencing spacetime resolution with videos and stills
2009 IEEE International Conference on Computational Photography (ICCP) | Pub Date: 2009-04-16 | DOI: 10.1109/ICCPHOT.2009.5559006
Ankit Gupta, Pravin Bhat, Mira Dontcheva, O. Deussen, B. Curless, Michael F. Cohen
Abstract: We present solutions for enhancing the spatial and/or temporal resolution of videos. Our algorithm targets the emerging consumer-level hybrid cameras that can simultaneously capture video and high-resolution stills. Our technique produces a high spacetime resolution video using the high-resolution stills for rendering and the low-resolution video to guide the reconstruction and the rendering process. Our framework integrates and extends two existing algorithms, namely a high-quality optical flow algorithm and a high-quality image-based-rendering algorithm. The framework enables a variety of applications that were previously unavailable to the amateur user, such as the ability to (1) automatically create videos with high spatiotemporal resolution, and (2) shift a high-resolution still to nearby points in time to better capture a missed event.
Citations: 41
Image-based separation of diffuse and specular reflections using environmental structured illumination
2009 IEEE International Conference on Computational Photography (ICCP) | Pub Date: 2009-04-16 | DOI: 10.1109/ICCPHOT.2009.5559012
B. Lamond, P. Peers, A. Ghosh, P. Debevec
Abstract: We present an image-based method for separating diffuse and specular reflections using environmental structured illumination. Two types of structured illumination are discussed: phase-shifted sine wave patterns, and phase-shifted binary stripe patterns. In both cases the low-pass filtering nature of diffuse reflections is utilized to separate the reflection components. We illustrate our method on a wide range of example scenes and applications.
Citations: 25
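The sine-pattern variant admits a closed-form separation. A minimal sketch under the idealized model implied by the abstract: the low-pass diffuse lobe sees only the pattern mean 0.5, while the specular lobe mirrors the full sinusoid 0.5 + 0.5*cos(phi). The function name and the three-phase choice are illustrative assumptions:

```python
import numpy as np

def separate_three_phase(i0, i1, i2):
    """Separate diffuse and specular components from three images lit
    by sine patterns phase-shifted by 0, 2*pi/3 and 4*pi/3.

    Model per pixel: I_k = 0.5*D + S*(0.5 + 0.5*cos(phi + ph_k)),
    so the temporal mean is 0.5*D + 0.5*S and the sinusoid's amplitude
    is 0.5*S, both recoverable in closed form.
    """
    imgs = np.stack([i0, i1, i2])
    phases = np.array([0.0, 2 * np.pi / 3, 4 * np.pi / 3])
    mean = imgs.mean(axis=0)                            # 0.5*D + 0.5*S
    a = (2 / 3) * np.tensordot(np.cos(phases), imgs, axes=1)
    b = (2 / 3) * np.tensordot(np.sin(phases), imgs, axes=1)
    amp = np.sqrt(a**2 + b**2)                          # 0.5*S
    specular = 2.0 * amp
    diffuse = 2.0 * (mean - amp)
    return diffuse, specular
```

The phase phi of the recovered sinusoid is not needed for the separation itself, only its amplitude, which is why three shifted patterns suffice.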
The focused plenoptic camera
2009 IEEE International Conference on Computational Photography (ICCP) | Pub Date: 2009-04-16 | DOI: 10.1109/ICCPHOT.2009.5559008
A. Lumsdaine, Todor Georgiev
Abstract: Plenoptic cameras, constructed with internal microlens arrays, focus those microlenses at infinity in order to sample the 4D radiance directly at the microlenses. The consequent assumption is that each microlens image is completely defocused with respect to the image created by the main camera lens and the outside object. As a result, only a single pixel in the final image can be rendered from it, resulting in disappointingly low resolution. In this paper, we present a new approach to lightfield capture and image rendering that interprets the microlens array as an imaging system focused on the focal plane of the main camera lens. This approach captures a lightfield with significantly higher spatial resolution than the traditional approach, allowing us to render high resolution images that meet the expectations of modern photographers. Although the new approach samples the lightfield with reduced angular density, analysis and experimental results demonstrate that there is sufficient parallax to completely support lightfield manipulation algorithms such as refocusing and novel views.
Citations: 466
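Because each microimage is itself in focus, rendering can use a whole patch per microlens instead of a single pixel. A minimal sketch of that basic rendering idea (real renderers also flip or scale patches depending on the relay geometry and choose the patch size per scene depth; the function name is illustrative):

```python
import numpy as np

def render_focused_plenoptic(raw, microlens_size, patch_size):
    """Render an image from focused-plenoptic raw data by tiling a
    central patch_size x patch_size crop from each microlens image.

    Compared with one-pixel-per-microlens rendering, this yields
    patch_size^2 times more output pixels per microlens.
    """
    m, p = microlens_size, patch_size
    ny, nx = raw.shape[0] // m, raw.shape[1] // m
    off = (m - p) // 2                 # center the crop in the microimage
    out = np.empty((ny * p, nx * p), dtype=raw.dtype)
    for j in range(ny):
        for i in range(nx):
            tile = raw[j * m + off:j * m + off + p,
                       i * m + off:i * m + off + p]
            out[j * p:(j + 1) * p, i * p:(i + 1) * p] = tile
    return out
```

Varying the patch size (and hence which scene plane tiles seamlessly) is what enables refocusing in this rendering style.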
Artifact-free High Dynamic Range imaging
2009 IEEE International Conference on Computational Photography (ICCP) | Pub Date: 2009-04-16 | DOI: 10.1109/ICCPHOT.2009.5559003
Orazio Gallo, Natasha Gelfand, Wei-Chao Chen, M. Tico, K. Pulli
Abstract: The contrast in real world scenes is often beyond what consumer cameras can capture. For these situations, High Dynamic Range (HDR) images can be generated by taking multiple exposures of the same scene. When fusing information from different images, however, the slightest change in the scene can generate artifacts which dramatically limit the potential of this solution. We present a technique capable of dealing with a large amount of movement in the scene: we find, in all the available exposures, patches consistent with a reference image previously selected from the stack. We generate the HDR image by averaging the radiance estimates of all such regions and we compensate for camera calibration errors by removing potential seams. We show that our method works even in cases when many moving objects cover large regions of the scene.
Citations: 236
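The radiance-averaging step the abstract builds on is the standard weighted merge of an exposure stack. A minimal sketch of that baseline only, assuming aligned, linear (gamma-removed) images in [0, 1]; it does not implement the paper's patch-consistency deghosting or seam removal:

```python
import numpy as np

def merge_hdr(images, exposure_times):
    """Merge a stack of aligned LDR exposures into a radiance map by a
    confidence-weighted average of per-exposure radiance estimates."""
    eps = 1e-8
    num = np.zeros_like(images[0], dtype=np.float64)
    den = np.zeros_like(images[0], dtype=np.float64)
    for img, t in zip(images, exposure_times):
        # Hat weight: trust mid-range pixels, distrust near 0 or 1
        # (under- and over-exposed values carry little information).
        w = 1.0 - np.abs(2.0 * img - 1.0)
        num += w * (img / t)          # radiance estimate from this exposure
        den += w
    return num / (den + eps)
```

It is exactly this per-pixel averaging that produces ghosts when the scene moves between exposures, which motivates the paper's consistency test before a region is allowed to contribute.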
Online blind deconvolution for astronomical imaging
2009 IEEE International Conference on Computational Photography (ICCP) | Pub Date: 2009-04-16 | DOI: 10.1109/ICCPHOT.2009.5559014
S. Harmeling, M. Hirsch, S. Sra, B. Scholkopf
Abstract: Atmospheric turbulences blur astronomical images taken by earth-based telescopes. Taking many short-time exposures in such a situation provides noisy images of the same object, where each noisy image has a different blur. Commonly astronomers apply a technique called "Lucky Imaging" that selects a few of the recorded frames that fulfill certain criteria, such as reaching a certain peak intensity ("Strehl ratio"). The selected frames are then averaged to obtain a better image. In this paper we introduce and analyze a new method that exploits all the frames and generates an improved image in an online fashion. Our initial experiments with controlled artificial data and real-world astronomical datasets yield promising results.
Citations: 51
Denoising photographs using dark frames optimized by quadratic programming
2009 IEEE International Conference on Computational Photography (ICCP) | Pub Date: 2009-04-16 | DOI: 10.1109/ICCPHOT.2009.5559013
M. Gomez-Rodriguez, J. Kober, B. Scholkopf
Abstract: Photographs taken with long exposure or high ISO setting may contain substantial amounts of noise, drastically reducing the Signal-To-Noise Ratio (SNR). This paper presents a novel optimization approach for denoising. It is based on a library of dark frames previously taken under varying conditions of temperature, ISO setting and exposure time, and a quality measure or prior for the class of images to denoise. The method automatically computes a synthetic dark frame that, when subtracted from an image, optimizes the quality measure. For specific choices of the quality measure, the denoising problem reduces to a quadratic programming (QP) problem that can be solved efficiently. We show experimentally that it is sufficient to consider a limited subsample of pixels when evaluating the quality measure in the optimization, in which case the complexity of the procedure does not depend on the size of the images but only on the number of dark frames. We provide quantitative experimental results showing that our method automatically computes dark frames that are competitive with those taken under idealized conditions (controlled temperature, ISO setting, exposure time, and averaging of multiple exposures). We provide application examples in astronomical image denoising. The method is validated on two CMOS SLRs.
Citations: 3
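A minimal sketch of the pipeline shape described in the abstract: fit a synthetic dark frame as a nonnegative combination of library dark frames, evaluated only on a pixel subsample, then subtract it. The paper optimizes a general image-quality measure via quadratic programming; here the stand-in objective is plain nonnegative least squares on sample pixels assumed to contain only dark current (NNLS is itself a special case of a QP), and the function and argument names are illustrative:

```python
import numpy as np
from scipy.optimize import nnls

def fit_dark_frame(noisy, dark_library, sample_idx):
    """Compute a synthetic dark frame from a library and subtract it.

    Each column of A is one library frame restricted to the subsample,
    so the fit's cost depends on the library size, not the image size,
    mirroring the subsampling argument in the paper.
    """
    A = np.stack([f.ravel()[sample_idx] for f in dark_library], axis=1)
    b = noisy.ravel()[sample_idx]
    coeffs, _ = nnls(A, b)            # nonnegative mixing weights
    synthetic = sum(c * f for c, f in zip(coeffs, dark_library))
    return noisy - synthetic, coeffs
```

Nonnegativity matters because dark current only adds signal; allowing negative weights could subtract structure that was never there.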
What are good apertures for defocus deblurring?
2009 IEEE International Conference on Computational Photography (ICCP) | Pub Date: 2009-04-16 | DOI: 10.1109/ICCPHOT.2009.5559018
Changyin Zhou, S. Nayar
Abstract: In recent years, with camera pixels shrinking in size, images are more likely to include defocused regions. In order to recover scene details from defocused regions, deblurring techniques must be applied. It is well known that the quality of a deblurred image is closely related to the defocus kernel, which is determined by the pattern of the aperture. The design of aperture patterns has been studied for decades in several fields, including optics, astronomy, computer vision, and computer graphics. However, previous attempts at designing apertures have been based on intuitive criteria related to the shape of the power spectrum of the aperture pattern. In this paper, we present a comprehensive framework for evaluating an aperture pattern based on the quality of deblurring. Our criterion explicitly accounts for the effects of image noise and the statistics of natural images. Based on our criterion, we have developed a genetic algorithm that converges very quickly to near-optimal aperture patterns. We have conducted extensive simulations and experiments to compare our apertures with previously proposed ones.
Citations: 157
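A criterion of the kind the abstract describes can be sketched as the expected residual error of Wiener deconvolution, summed over frequencies, under a natural-image power-spectrum prior and a noise level. This is an illustrative sketch of the style of criterion, not the paper's exact formula; the 1/f^2 prior and sigma are assumptions:

```python
import numpy as np

def deblurring_quality(aperture, sigma=0.01):
    """Score an aperture pattern by expected Wiener-deconvolution
    error (lower is better).

    Frequencies where the kernel spectrum |K| is small contribute
    nearly their full prior power A to the error, which is why kernels
    with spectral nulls deblur poorly.
    """
    n = aperture.shape[0]
    K = np.fft.fft2(aperture / aperture.sum())   # normalized defocus kernel
    fx = np.fft.fftfreq(n)[:, None]
    fy = np.fft.fftfreq(n)[None, :]
    f2 = fx**2 + fy**2
    f2[0, 0] = (1.0 / n) ** 2                    # avoid divide-by-zero at DC
    A = 1.0 / f2                                 # 1/f^2 natural-image prior
    # Per-frequency expected residual of the Wiener estimate, summed.
    return float(np.sum(sigma**2 * A / (np.abs(K)**2 * A + sigma**2)))
```

Under this score an ideal pinhole (flat spectrum) beats an open box aperture (sinc nulls), matching the intuition that broadband kernel spectra preserve more recoverable detail; the paper's genetic search then looks for patterns that keep both high light throughput and a broadband spectrum.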