Eurographics Symposium on Rendering: Latest Publications

A dual light stage
Eurographics Symposium on Rendering Pub Date: 2005-06-29 DOI: 10.2312/EGWR/EGSR05/091-098
Tim Hawkins, P. Einarsson, P. Debevec
{"title":"A dual light stage","authors":"Tim Hawkins, P. Einarsson, P. Debevec","doi":"10.2312/EGWR/EGSR05/091-098","DOIUrl":"https://doi.org/10.2312/EGWR/EGSR05/091-098","url":null,"abstract":"We present a technique for capturing high-resolution 4D reflectance fields using the reciprocity property of light transport. In our technique we place the object inside a diffuse spherical shell and scan a laser across its surface. For each incident ray, the object scatters a pattern of light onto the inner surface of the sphere, and we photograph the resulting radiance from the sphere's interior using a camera with a fisheye lens. Because of reciprocity, the image of the inside of the sphere corresponds to the reflectance function of the surface point illuminated by the laser, that is, the color that point would appear to a camera along the laser ray when the object is lit from each direction on the surface of the sphere. The measured reflectance functions allow the object to be photorealistically rendered from the laser's viewpoint under arbitrary directional illumination conditions. Since each captured re- flectance function is a high-resolution image, our data reproduces sharp specular reflections and self-shadowing more accurately than previous approaches. We demonstrate our technique by scanning objects with a wide range of reflectance properties and show accurate renderings of the objects under novel illumination conditions.","PeriodicalId":363391,"journal":{"name":"Eurographics Symposium on Rendering","volume":"193 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114170912","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 55
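The relighting step the abstract describes reduces to weighting each captured reflectance function by the novel illumination and integrating over incident directions. A minimal sketch in Python, using random stand-in data and a hypothetical lat-long discretization (none of this is the paper's capture pipeline):

```python
import numpy as np

# Hypothetical discretization: each surface point's reflectance function is an
# H x W lat-long image over incident directions; the environment map L(omega)
# uses the same grid. All data here is a random stand-in for captured data.
H, W, num_points = 64, 128, 1000
rng = np.random.default_rng(0)
reflectance = rng.random((num_points, H, W)).astype(np.float32)  # R_p(omega)
envmap = rng.random((H, W)).astype(np.float32)                   # L(omega)

# Solid-angle weights for the lat-long grid: d_omega = sin(theta) dtheta dphi.
theta = (np.arange(H) + 0.5) / H * np.pi
d_omega = np.sin(theta)[:, None] * (np.pi / H) * (2 * np.pi / W)

# Relit color of each point: I(p) = sum over omega of R_p(omega) L(omega) d_omega.
relit = np.tensordot(reflectance, envmap * d_omega, axes=([1, 2], [0, 1]))
print(relit.shape)  # one relit value per surface point
```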
Texture tiling on arbitrary topological surfaces using wang tiles
Eurographics Symposium on Rendering Pub Date: 2005-06-29 DOI: 10.2312/EGWR/EGSR05/099-104
Chi-Wing Fu, Man-Kang Leung
{"title":"Texture tiling on arbitrary topological surfaces using wang tiles","authors":"Chi-Wing Fu, Man-Kang Leung","doi":"10.2312/EGWR/EGSR05/099-104","DOIUrl":"https://doi.org/10.2312/EGWR/EGSR05/099-104","url":null,"abstract":"Synthesizing textures on arbitrary surfaces is a time consuming process. We have to analyze the surface geometry and map texture values onto the input surface adaptively. Texture tiling provides an alternative approach by decoupling the texture synthesis process into two steps: surface mapping and tile placement. This paper reformulates the texture tiling mechanism of Wang tiles for arbitrary topological surfaces. Once we created a low distortion conformal map from the input surface to a quad-based geometry, we can generate a tiling graph over the geometric dual graph of the quad-based geometry, and produce a proper tile orientation on all quad faces so that we can layout textured tiles on quads and map texture back to the input surface accordingly. Since tile placement is independent of the input surface geometry, we can perform the tiling process in no time and change texture pattern on the input surface simply by switching a tile set. No additional computation is needed. As a demonstration, we experimented texture tiling of Wang tiles on spheres, polycubes, as well as polycube-mapped models.","PeriodicalId":363391,"journal":{"name":"Eurographics Symposium on Rendering","volume":"43 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129521863","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 50
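The tile-placement half of the pipeline is the standard Wang-tile constraint: each tile's edge colors must match its already-placed neighbours. A toy scanline version on a flat grid (the paper's actual contribution, placing tiles over the dual graph of a quad-based geometry with consistent orientations, is not reproduced here):

```python
import random

# A complete Wang tile set over two edge colors: one tile for every
# (north, east, south, west) combination, so a valid choice always exists.
COLORS = (0, 1)
TILES = [(n, e, s, w) for n in COLORS for e in COLORS for s in COLORS for w in COLORS]

def tile_grid(rows, cols, rng=random.Random(0)):
    """Scanline tile placement: each tile's west edge must match its left
    neighbour's east edge, and its north edge the south edge of the tile above."""
    grid = [[None] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            candidates = [t for t in TILES
                          if (c == 0 or t[3] == grid[r][c - 1][1])
                          and (r == 0 or t[0] == grid[r - 1][c][2])]
            grid[r][c] = rng.choice(candidates)
    return grid

print(tile_grid(3, 4))
```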
Non-linear volume photon mapping
Eurographics Symposium on Rendering Pub Date: 2005-06-29 DOI: 10.2312/EGWR/EGSR05/291-300
D. Gutierrez, A. Muñoz, Oscar Anson, F. Serón
{"title":"Non-linear volume photon mapping","authors":"D. Gutierrez, A. Muñoz, Oscar Anson, F. Serón","doi":"10.2312/EGWR/EGSR05/291-300","DOIUrl":"https://doi.org/10.2312/EGWR/EGSR05/291-300","url":null,"abstract":"This paper describes a novel extension of the photon mapping algorithm, capable of handling both volume multiple inelastic scattering and curved light paths simultaneously. The extension is based on the Full Radiative Transfer Equation (FRTE) and Fermat's law, and yields physically accurate, high-dynamic data than can be used for image generation or for other simulation purposes, such as driving simulators, underwater vision or lighting studies in architecture. Photons are traced into the participating medium with a varying index of refraction, and their curved trajectories followed (curved paths are the cause of certain atmospheric effects such as mirages or rippling desert images). Every time a photon is absorbed, a Russian roulette algorithm based on the quantum efficiency of the medium determines whether the inelastic scattering event takes place (causing volume fluorescence). The simulation of both underwater and atmospheric effects is shown, providing a global illumination solution without the restrictions of previous approaches.","PeriodicalId":363391,"journal":{"name":"Eurographics Symposium on Rendering","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134149743","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 46
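The curved trajectories come from integrating the ray equation of geometric optics, d/ds(n dx/ds) = grad n, through a spatially varying index of refraction. A sketch with explicit Euler steps and a made-up index field (the toy field, step size, and path length are assumptions; the inelastic-scattering Russian roulette is omitted):

```python
import numpy as np

def n(p):
    """Hypothetical refractive index field: the index grows with height y,
    a toy stand-in for a mirage-producing atmosphere."""
    return 1.0 + 0.0003 * np.clip(p[1], 0.0, 10.0)

def grad_n(p, eps=1e-4):
    """Central-difference gradient of the index field."""
    g = np.zeros(3)
    for i in range(3):
        d = np.zeros(3); d[i] = eps
        g[i] = (n(p + d) - n(p - d)) / (2 * eps)
    return g

def trace_curved(p, d, step=0.1, steps=200):
    """Integrate d/ds (n dx/ds) = grad n with explicit Euler steps; photons
    follow curved paths wherever the index of refraction varies."""
    d = np.asarray(d, dtype=float)
    d /= np.linalg.norm(d)
    v = n(p) * d                      # v = n * dx/ds
    for _ in range(steps):
        p = p + step * v / n(p)
        v = v + step * grad_n(p)
    return p

print(trace_curved(np.array([0.0, 1.0, 0.0]), np.array([1.0, -0.05, 0.0])))
```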
Interactive system for dynamic scene lighting using captured video environment maps
Eurographics Symposium on Rendering Pub Date: 2005-06-29 DOI: 10.2312/EGWR/EGSR05/031-042
V. Havran, M. Smyk, Grzegorz Krawczyk, K. Myszkowski, H. Seidel
{"title":"Interactive system for dynamic scene lighting using captured video environment maps","authors":"V. Havran, M. Smyk, Grzegorz Krawczyk, K. Myszkowski, H. Seidel","doi":"10.2312/EGWR/EGSR05/031-042","DOIUrl":"https://doi.org/10.2312/EGWR/EGSR05/031-042","url":null,"abstract":"We present an interactive system for fully dynamic scene lighting using captured high dynamic range (HDR) video environment maps. The key component of our system is an algorithm for efficient decomposition of HDR video environment map captured over hemisphere into a set of representative directional light sources, which can be used for the direct lighting computation with shadows using graphics hardware. The resulting lights exhibit good temporal coherence and their number can be adaptively changed to keep a constant framerate while good spatial distribution (stratification) properties are maintained. We can handle a large number of light sources with shadows using a novel technique which reduces the cost of BRDF-based shading and visibility computations. We demonstrate the use of our system in a mixed reality application in which real and synthetic objects are illuminated by consistent lighting at interactive framerates.","PeriodicalId":363391,"journal":{"name":"Eurographics Symposium on Rendering","volume":"208 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130285206","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 45
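At its core, the decomposition is importance sampling of the environment map: texels are selected with probability proportional to solid-angle-weighted luminance and turned into directional lights. A much simpler stand-in than the paper's temporally coherent, stratified scheme:

```python
import numpy as np

def envmap_to_lights(envmap, k=32, rng=np.random.default_rng(0)):
    """Draw k directional lights from a lat-long HDR environment map with
    probability proportional to solid-angle-weighted luminance. Each light's
    power is the texel radiance times its solid angle, divided by its
    selection probability (standard importance sampling)."""
    h, w, _ = envmap.shape
    lum = envmap @ np.array([0.2126, 0.7152, 0.0722])
    theta = (np.arange(h) + 0.5) / h * np.pi
    weight = lum * np.sin(theta)[:, None]
    pdf = weight.ravel() / weight.sum()
    idx = rng.choice(weight.size, size=k, replace=False, p=pdf)
    ys, xs = np.unravel_index(idx, (h, w))
    phis = (xs + 0.5) / w * 2 * np.pi
    thetas = (ys + 0.5) / h * np.pi
    dirs = np.stack([np.sin(thetas) * np.cos(phis),
                     np.cos(thetas),
                     np.sin(thetas) * np.sin(phis)], axis=1)
    d_omega = np.sin(thetas) * (np.pi / h) * (2 * np.pi / w)
    power = envmap[ys, xs] * d_omega[:, None] / (pdf[idx][:, None] * k)
    return dirs, power

dirs, power = envmap_to_lights(np.random.default_rng(1).random((64, 128, 3)))
print(dirs.shape, power.shape)
```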
Bidirectional importance sampling for direct illumination
Eurographics Symposium on Rendering Pub Date: 2005-06-29 DOI: 10.2312/EGWR/EGSR05/147-156
David Burke, A. Ghosh, W. Heidrich
{"title":"Bidirectional importance sampling for direct illumination","authors":"David Burke, A. Ghosh, W. Heidrich","doi":"10.2312/EGWR/EGSR05/147-156","DOIUrl":"https://doi.org/10.2312/EGWR/EGSR05/147-156","url":null,"abstract":"Image-based representations for illumination can capture complex real-world lighting that is difficult to represent in other forms. Current importance sampling strategies for image-based illumination have difficulties in cases where both the illumination and the surface BRDF contain important high-frequency detail – for example, when a specular surface is illuminated by an environment map containing small light sources.\u0000 We introduce the notion of bidirectional importance sampling, in which samples are drawn from the product distribution of both the surface reflectance and the light source energy. While this approach makes the sample selection process more expensive, we drastically reduce the number of visibility tests required to obtain good image quality. As a consequence, we achieve significant quality improvements over previous sampling strategies for the same compute time.","PeriodicalId":363391,"journal":{"name":"Eurographics Symposium on Rendering","volume":"75 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130813671","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 90
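One common way to sample a product distribution is sampling-importance-resampling: draw many cheap candidates from one factor (here the BRDF), weight them by the other factor (the light), and keep a few survivors, which are the only samples needing visibility tests. The callables below are hypothetical toys, not the paper's exact estimator:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_product(sample_brdf, light_energy, n_candidates=64, n_final=4):
    """Resample BRDF-distributed candidates by light energy so the survivors
    approximately follow the product distribution BRDF x light."""
    dirs = sample_brdf(n_candidates)            # directions ~ BRDF
    w = np.array([light_energy(d) for d in dirs])
    keep = rng.choice(n_candidates, size=n_final, p=w / w.sum())
    return dirs[keep], w.mean()                 # survivors + normalization estimate

# Toy stand-ins: a cosine-lobe hemisphere sampler and a small bright "light
# source" concentrated around direction (0, 0, 1).
def sample_brdf(n):
    u1, u2 = rng.random(n), rng.random(n)
    r, phi = np.sqrt(u1), 2 * np.pi * u2
    return np.stack([r * np.cos(phi), r * np.sin(phi), np.sqrt(1 - u1)], axis=1)

def light_energy(d):
    return 100.0 if d[2] > 0.99 else 0.01

print(sample_product(sample_brdf, light_energy))
```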
Real illumination from virtual environments
Eurographics Symposium on Rendering Pub Date: 2005-06-29 DOI: 10.2312/EGWR/EGSR05/243-252
A. Ghosh, Matthew Trentacoste, H. Seetzen, W. Heidrich
{"title":"Real illumination from virtual environments","authors":"A. Ghosh, Matthew Trentacoste, H. Seetzen, W. Heidrich","doi":"10.2312/EGWR/EGSR05/243-252","DOIUrl":"https://doi.org/10.2312/EGWR/EGSR05/243-252","url":null,"abstract":"We introduce a method for actively controlling the illumination in a room so that it is consistent with a virtual world. In combination with a high dynamic range display, the system produces both uniform and directional illumination at intensity levels covering a wide range of real-world environments. It thereby allows natural adaptation processes of the human visual system to take place, for example when moving between bright and dark environments. In addition, the directional illumination provides additional information about the environment in the user's peripheral field of view.\u0000 We describe both the hardware and the software aspects of our system. We also conducted an informal survey to determine whether users prefer the dynamic illumination over constant room illumination in an entertainment setting.","PeriodicalId":363391,"journal":{"name":"Eurographics Symposium on Rendering","volume":"44 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131924644","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 4
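A naive version of driving physical room lights from a virtual environment: integrate the virtual radiance map over each fixture's cone of influence and use the average as that fixture's drive level. Fixture directions, cone width, and the linear drive model are illustrative assumptions, not the paper's calibration procedure:

```python
import numpy as np

def luminaire_levels(envmap, fixture_dirs, spread=0.5):
    """Drive each physical fixture by the average RGB radiance of the virtual
    lat-long environment map over the directions within its cone (half-angle
    `spread`, in radians)."""
    h, w, _ = envmap.shape
    theta = (np.arange(h) + 0.5) / h * np.pi
    phi = (np.arange(w) + 0.5) / w * 2 * np.pi
    T, P = np.meshgrid(theta, phi, indexing="ij")
    dirs = np.stack([np.sin(T) * np.cos(P), np.cos(T), np.sin(T) * np.sin(P)], axis=-1)
    levels = []
    for f in fixture_dirs:
        mask = dirs @ f > np.cos(spread)           # texels inside the cone
        levels.append(envmap[mask].mean(axis=0))   # average RGB radiance
    return np.array(levels)

env = np.random.default_rng(2).random((32, 64, 3))
fixtures = np.array([[0, 1, 0], [1, 0, 0], [-1, 0, 0]], dtype=float)
print(luminaire_levels(env, fixtures))
```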
Geometric clustering for line drawing simplification
Eurographics Symposium on Rendering Pub Date: 2005-06-29 DOI: 10.2312/EGWR/EGSR05/183-192
Pascal Barla, J. Thollot, F. Sillion
{"title":"Geometric clustering for line drawing simplification","authors":"Pascal Barla, J. Thollot, F. Sillion","doi":"10.2312/EGWR/EGSR05/183-192","DOIUrl":"https://doi.org/10.2312/EGWR/EGSR05/183-192","url":null,"abstract":"We present a new approach to the simplification of line drawings, in which a smaller set of lines is created to represent the geometry of the original lines. An important feature of our method is that it maintains the morphological structure of the original drawing while allowing user-defined decisions about the appearance of lines. The technique works by analyzing the structure of the drawing at a certain scale and identifying clusters of lines that can be merged given a specific error threshold. These clusters are then processed to create new lines, in a separate stage where different behaviors can be favored based on the application. Successful results are presented for a variety of drawings including scanned and vectorized artwork, original vector drawings, drawings created from 3d models, and hatching marks. The clustering technique is shown to be effective in all these situations.","PeriodicalId":363391,"journal":{"name":"Eurographics Symposium on Rendering","volume":"698 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116978077","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 10
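To make the clustering idea concrete, here is a deliberately naive single-link clustering of 2D line segments under an endpoint-distance threshold, with each cluster replaced by the average of its members; the paper's scale analysis and morphology-preserving merge rules are far more involved:

```python
import numpy as np

def cluster_segments(segments, eps):
    """Greedy single-link clustering: two segments join the same cluster when
    both corresponding endpoints lie within eps of each other; each cluster is
    then replaced by the average of its member segments."""
    segments = np.asarray(segments, dtype=float)   # shape (n, 2, 2): two endpoints
    n = len(segments)
    labels = -np.ones(n, dtype=int)

    def close(a, b):
        return np.linalg.norm(a - b, axis=-1).max() < eps

    k = 0
    for i in range(n):
        if labels[i] >= 0:
            continue
        labels[i] = k
        stack = [i]
        while stack:                               # flood-fill the cluster
            j = stack.pop()
            for m in range(n):
                if labels[m] < 0 and close(segments[j], segments[m]):
                    labels[m] = k
                    stack.append(m)
        k += 1
    return [segments[labels == c].mean(axis=0) for c in range(k)]

strokes = [[(0, 0), (1, 0)], [(0, 0.05), (1, 0.08)], [(0, 2), (1, 2)]]
print(cluster_segments(strokes, eps=0.2))          # first two strokes merge
```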
Adaptive frameless rendering
Eurographics Symposium on Rendering Pub Date: 2005-06-29 DOI: 10.2312/EGWR/EGSR05/265-275
Abhinav Dayal, Cliff Woolley, B. Watson, D. Luebke
{"title":"Adaptive frameless rendering","authors":"Abhinav Dayal, Cliff Woolley, B. Watson, D. Luebke","doi":"10.2312/EGWR/EGSR05/265-275","DOIUrl":"https://doi.org/10.2312/EGWR/EGSR05/265-275","url":null,"abstract":"We propose an adaptive form of frameless rendering with the potential to dramatically increase rendering speed over conventional interactive rendering approaches. Without the rigid sampling patterns of framed renderers, sampling and reconstruction can adapt with very fine granularity to spatio-temporal color change. A sampler uses closed-loop feedback to guide sampling toward edges or motion in the image. Temporally deep buffers store all the samples created over a short time interval for use in reconstruction and as sampler feedback. GPU-based reconstruction responds both to sampling density and space-time color gradients. Where the displayed scene is static, spatial color change dominates and older samples are given significant weight in reconstruction, resulting in sharper and eventually antialiased images. Where the scene is dynamic, more recent samples are emphasized, resulting in less sharp but more up-to-date images. We also use sample reprojection to improve reconstruction and guide sampling toward occlusion edges, undersampled regions, and specular highlights. In simulation our frameless renderer requires an order of magnitude fewer samples than traditional rendering of similar visual quality (as measured by RMS error), while introducing overhead amounting to 15% of computation time.","PeriodicalId":363391,"journal":{"name":"Eurographics Symposium on Rendering","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115505922","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 92
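The reconstruction behaviour described above (older samples dominate where the image is static, recent samples where it changes) can be imitated with an age falloff whose time constant shrinks as the local temporal gradient grows. A sketch with an assumed exponential falloff, not the paper's exact filter:

```python
import numpy as np

def reconstruct(samples, t_now, temporal_gradient):
    """Blend a temporally deep buffer of (time, rgb) samples into one pixel
    value. A large temporal gradient shortens the time constant (recent
    samples win: less sharp but up to date); a small one lengthens it (old
    samples accumulate: sharper, eventually antialiased)."""
    tau = 1.0 / (0.1 + temporal_gradient)      # adaptive time constant (assumed model)
    ts = np.array([t for t, _ in samples])
    cs = np.array([c for _, c in samples])
    w = np.exp(-(t_now - ts) / tau)
    return (w[:, None] * cs).sum(axis=0) / w.sum()

deep_buffer = [(0.0, (1.0, 0.0, 0.0)), (0.5, (0.9, 0.1, 0.0)), (0.9, (0.2, 0.2, 0.8))]
print(reconstruct(deep_buffer, t_now=1.0, temporal_gradient=0.05))  # static: blend all
print(reconstruct(deep_buffer, t_now=1.0, temporal_gradient=5.0))   # dynamic: newest wins
```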
Radiance cache splatting: a GPU-friendly global illumination algorithm
Eurographics Symposium on Rendering Pub Date: 2005-06-29 DOI: 10.2312/EGWR/EGSR05/055-064
Pascal Gautron, Jaroslav Křivánek, K. Bouatouch, S. Pattanaik
{"title":"Radiance cache splatting: a GPU-friendly global illumination algorithm","authors":"Pascal Gautron, Jaroslav Křivánek, K. Bouatouch, S. Pattanaik","doi":"10.2312/EGWR/EGSR05/055-064","DOIUrl":"https://doi.org/10.2312/EGWR/EGSR05/055-064","url":null,"abstract":"Fast global illumination computation is a challenge in several fields such as lighting simulation and computergenerated visual effects for movies. To this end, the irradiance caching algorithm is commonly used since it provides high-quality rendering in a reasonable time. However this algorithm relies on a spatial data structure in which nearest-neighbors queries and data insertions are performed alternately within a single rendering step. Due to this central and permanently modified data structure, the irradiance caching algorithm cannot be easily implemented on graphics hardware. This paper proposes a novel approach to global illumination using irradiance and radiance cache: the radiance cache splatting. This method directly meets the processing constraints of graphics hardware since it avoids the need of complex data structure and algorithms. Moreover, the rendering quality remains identical to classical irradiance and radiance caching. Our renderer shows an implementation of our algorithm which provides a significant speedup compared to classical irradiance caching.","PeriodicalId":363391,"journal":{"name":"Eurographics Symposium on Rendering","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117252047","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 84
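Splatting inverts the usual cache query: instead of searching a spatial data structure per pixel, each record is scattered onto every pixel it can contribute to, using the classic irradiance-caching weight, and pixels normalize the accumulated sum. A CPU sketch with numpy arrays standing in for the GPU passes (the record layout and acceptance threshold alpha are assumptions):

```python
import numpy as np

def splat_records(records, positions, normals, alpha=0.3):
    """Accumulate each cache record onto all shading points it covers, using
    the standard irradiance-caching weight
        w_i(p) = 1 / (|p - p_i| / R_i + sqrt(1 - n . n_i)),
    accepted when w_i > 1/alpha, then normalize per pixel."""
    h, w = positions.shape[:2]
    acc = np.zeros((h, w, 3))
    wsum = np.zeros((h, w))
    for p_i, n_i, R_i, E_i in records:            # position, normal, radius, irradiance
        d = np.linalg.norm(positions - p_i, axis=-1)
        ndot = np.clip((normals * n_i).sum(axis=-1), 0.0, 1.0)
        wi = 1.0 / (d / R_i + np.sqrt(1.0 - ndot) + 1e-9)
        mask = wi > 1.0 / alpha
        acc[mask] += wi[mask][:, None] * E_i
        wsum[mask] += wi[mask]
    ok = wsum > 0
    acc[ok] /= wsum[ok][:, None]
    return acc

# One record splatted onto a tiny 2x2 "screen" of shading points (all assumptions).
pos = np.zeros((2, 2, 3)); pos[..., 0] = [[0, 1], [0, 1]]
nrm = np.zeros((2, 2, 3)); nrm[..., 2] = 1.0
rec = [(np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]), 2.0, np.array([1.0, 0.9, 0.8]))]
print(splat_records(rec, pos, nrm))
```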
Reflectance sharing: image-based rendering from a sparse set of images
Eurographics Symposium on Rendering Pub Date: 2005-06-29 DOI: 10.2312/EGWR/EGSR05/253-264
Todd E. Zickler, S. Enrique, R. Ramamoorthi, P. Belhumeur
{"title":"Reflectance sharing: image-based rendering from a sparse set of images","authors":"Todd E. Zickler, S. Enrique, R. Ramamoorthi, P. Belhumeur","doi":"10.2312/EGWR/EGSR05/253-264","DOIUrl":"https://doi.org/10.2312/EGWR/EGSR05/253-264","url":null,"abstract":"When the shape of an object is known, its appearance is determined by the spatially-varying reflectance function defined on its surface. Image-based rendering methods that use geometry seek to estimate this function from image data. Most existing methods recover a unique angular reflectance function (e.g., BRDF) at each surface point and provide reflectance estimates with high spatial resolution. Their angular accuracy is limited by the number of available images, and as a result, most of these methods focus on capturing parametric or low-frequency angular reflectance effects, or allowing only one of lighting or viewpoint variation. We present an alternative approach that enables an increase in the angular accuracy of a spatially-varying reflectance function in exchange for a decrease in spatial resolution. By framing the problem as scattered-data interpolation in a mixed spatial and angular domain, reflectance information is shared across the surface, exploiting the high spatial resolution that images provide to fill the holes between sparsely observed view and lighting directions. Since the BRDF typically varies slowly from point to point over much of an object's surface, this method enables image-based rendering from a sparse set of images without assuming a parametric reflectance model. In fact, the method can even be applied in the limiting case of a single input image.","PeriodicalId":363391,"journal":{"name":"Eurographics Symposium on Rendering","volume":"28 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126438361","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 52
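The scattered-data view can be illustrated with a generic radial-basis-function interpolator over a joint spatial-angular domain: each row of X concatenates a surface coordinate with angular parameters, so dense spatial sampling fills in sparse angular sampling. This is a generic Gaussian RBF fit, not the paper's specific basis or regularization:

```python
import numpy as np

def fit_rbf(X, y, sigma=0.5):
    """Solve for Gaussian RBF coefficients interpolating reflectance samples y
    at mixed spatial-angular coordinates X (one kernel center per sample)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * sigma**2))
    return np.linalg.solve(K + 1e-8 * np.eye(len(X)), y)  # small ridge for stability

def eval_rbf(X, coeffs, Xq, sigma=0.5):
    """Evaluate the fitted interpolant at query coordinates Xq."""
    d2 = ((Xq[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2)) @ coeffs

# Toy data: reflectance samples at hypothetical (u, v, theta_half) coordinates.
rng = np.random.default_rng(3)
X = rng.random((50, 3))
y = np.cos(3 * X[:, 2]) * (0.5 + X[:, 0])        # synthetic reflectance values
c = fit_rbf(X, y)
print(eval_rbf(X, c, rng.random((4, 3))))        # interpolated reflectance at 4 queries
```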