Ray-traced Shell Traversal of Tetrahedral Meshes for Direct Volume Visualization

Alper Sahistan, S. Demirci, N. Morrical, Stefan Zellmann, Aytek Aman, I. Wald, U. Güdükbay
{"title":"Ray-traced Shell Traversal of Tetrahedral Meshes for Direct Volume Visualization","authors":"Alper Sahistan, S. Demirci, N. Morrical, Stefan Zellmann, Aytek Aman, I. Wald, U. Güdükbay","doi":"10.1109/VIS49827.2021.9623298","DOIUrl":null,"url":null,"abstract":"A well-known method for rendering unstructured volumetric data is tetrahedral marching (tet marching), where rays are marched through a series of tetrahedral elements. Rowever, existing tet marching techniques do not easily generalize to rays with arbitrary origin and direction required for advanced shading effects or non-convex meshes. Additionally, the memory footprint of these methods may exceed GPU memory limits. Interactive performance and high image quality are opposing goals. Our approach significantly lowers the burden to render unstructured datasets with high image fidelity while maintaining real-time and interactive performance even for large datasets. To this end, we leverage hardware-accelerated ray tracing to find entry and exit faces for a given ray into a volume and utilize a compact mesh representation to enable the efficient marching of arbitrary rays, thus allowing for advanced shading effects that ultimately yields more convincing and grounded images. Our approach is also robust, supporting both convex and non-convex unstructured meshes. 
We show that our method achieves interactive rates even with moderately-sized datasets while secondary effects are applied.","PeriodicalId":387572,"journal":{"name":"2021 IEEE Visualization Conference (VIS)","volume":"61 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"11","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE Visualization Conference (VIS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/VIS49827.2021.9623298","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 11

Abstract

A well-known method for rendering unstructured volumetric data is tetrahedral marching (tet marching), where rays are marched through a series of tetrahedral elements. However, existing tet marching techniques do not easily generalize to rays with the arbitrary origins and directions required for advanced shading effects, or to non-convex meshes. Additionally, the memory footprint of these methods may exceed GPU memory limits. Interactive performance and high image quality are opposing goals. Our approach significantly lowers the burden of rendering unstructured datasets with high image fidelity while maintaining real-time, interactive performance even for large datasets. To this end, we leverage hardware-accelerated ray tracing to find entry and exit faces for a given ray into a volume and utilize a compact mesh representation to enable the efficient marching of arbitrary rays, thus allowing for advanced shading effects that ultimately yield more convincing and grounded images. Our approach is also robust, supporting both convex and non-convex unstructured meshes. We show that our method achieves interactive rates even with moderately sized datasets while secondary effects are applied.
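The core tet-marching step the abstract refers to can be illustrated with a minimal sketch: once a ray's entry element is known (the paper obtains it via hardware-accelerated ray tracing against the mesh's shell faces), the ray hops from tetrahedron to tetrahedron through shared faces. The sketch below is a simplified CPU version with hypothetical data structures (a tiny two-tet mesh with per-face neighbor indices), not the paper's compact GPU representation; for a convex cell, the exit face is simply the outward-facing face plane with the smallest positive intersection distance.

```python
import numpy as np

# Toy mesh: two tetrahedra sharing the face (v1, v2, v3).
verts = np.array([
    [0.0, 0.0, 0.0],   # v0
    [1.0, 0.0, 0.0],   # v1
    [0.0, 1.0, 0.0],   # v2
    [0.0, 0.0, 1.0],   # v3
    [1.0, 1.0, 1.0],   # v4
])
tets = np.array([[0, 1, 2, 3], [4, 1, 2, 3]])      # 4 vertex ids per tet
# neighbors[t][i]: tet sharing the face opposite vertex i of tet t (-1 = shell).
neighbors = np.array([[1, -1, -1, -1], [0, -1, -1, -1]])

def exit_face(tet_id, origin, direction, t_enter):
    """Return (local_face, t_exit) where the ray leaves tet `tet_id`."""
    vids = tets[tet_id]
    best_face, best_t = None, np.inf
    for i in range(4):                              # face opposite vertex i
        a, b, c = verts[[v for j, v in enumerate(vids) if j != i]]
        n = np.cross(b - a, c - a)
        if np.dot(n, verts[vids[i]] - a) > 0:       # make the normal outward
            n = -n
        denom = np.dot(n, direction)
        if denom <= 1e-12:                          # ray not leaving this face
            continue
        t = np.dot(n, a - origin) / denom           # ray-plane intersection
        if t > t_enter - 1e-9 and t < best_t:       # nearest exit ahead of entry
            best_face, best_t = i, t
    return best_face, best_t

def march(start_tet, origin, direction):
    """March the ray tet-to-tet, yielding (tet_id, t_enter, t_exit) segments."""
    tet, t = start_tet, 0.0
    while tet != -1:
        face, t_out = exit_face(tet, origin, direction, t)
        if face is None:                            # degenerate / no exit found
            break
        yield tet, t, t_out
        tet, t = neighbors[tet][face], t_out        # hop through the shared face

# Ray starting inside tet 0, pointing up: crosses into tet 1, exits the shell.
segments = list(march(0, np.array([0.1, 0.2, 0.05]), np.array([0.0, 0.0, 1.0])))
```

A renderer would integrate the transfer function over each yielded `(t_enter, t_exit)` interval; the `-1` neighbor marks a shell face, which is where the paper's ray-traced shell traversal re-enters the picture for non-convex meshes (the ray may re-enter the volume through another shell face).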