Adaptive Spatially Aware I/O for Multiresolution Particle Data Layouts

W. Usher, Xuan Huang, Steve Petruzza, Sidharth Kumar, S. Slattery, S. Reeve, Feng Wang, Christopher R. Johnson, Valerio Pascucci
{"title":"Adaptive Spatially Aware I/O for Multiresolution Particle Data Layouts","authors":"W. Usher, Xuan Huang, Steve Petruzza, Sidharth Kumar, S. Slattery, S. Reeve, Feng Wang, Christopher R. Johnson, Valerio Pascucci","doi":"10.1109/IPDPS49936.2021.00063","DOIUrl":null,"url":null,"abstract":"Large-scale simulations on nonuniform particle distributions that evolve over time are widely used in cosmology, molecular dynamics, and engineering. Such data are often saved in an unstructured format that neither preserves spatial locality nor provides metadata for accelerating spatial or attribute subset queries, leading to poor performance of visualization tasks. Furthermore, the parallel I/O strategy used typically writes a file per process or a single shared file, neither of which is portable or scalable across different HPC systems. We present a portable technique for scalable, spatially aware adaptive aggregation that preserves spatial locality in the output. We evaluate our approach on two supercomputers, Stampede2 and Summit, and demonstrate that it outperforms prior approaches at scale, achieving up to $2.5 \\times$ faster writes and reads for nonuniform distributions. Furthermore, the layout written by our method is directly suitable for visual analytics, supporting low-latency reads and attribute-based filtering with little overhead.","PeriodicalId":372234,"journal":{"name":"2021 IEEE International Parallel and Distributed Processing Symposium (IPDPS)","volume":"116 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE International Parallel and Distributed Processing Symposium (IPDPS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IPDPS49936.2021.00063","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Large-scale simulations on nonuniform particle distributions that evolve over time are widely used in cosmology, molecular dynamics, and engineering. Such data are often saved in an unstructured format that neither preserves spatial locality nor provides metadata for accelerating spatial or attribute subset queries, leading to poor performance of visualization tasks. Furthermore, the parallel I/O strategy used typically writes a file per process or a single shared file, neither of which is portable or scalable across different HPC systems. We present a portable technique for scalable, spatially aware adaptive aggregation that preserves spatial locality in the output. We evaluate our approach on two supercomputers, Stampede2 and Summit, and demonstrate that it outperforms prior approaches at scale, achieving up to 2.5× faster writes and reads for nonuniform distributions. Furthermore, the layout written by our method is directly suitable for visual analytics, supporting low-latency reads and attribute-based filtering with little overhead.
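The abstract does not spell out how the spatially aware aggregation is implemented, so the following is only an illustrative sketch of the underlying idea of a locality-preserving particle layout. It assumes a Morton-order (Z-curve) spatial sort followed by splitting the sorted order into contiguous buckets for aggregated writes; the function names, the choice of Morton ordering, and the fixed bucket count are assumptions for illustration, not the paper's actual adaptive algorithm or its MPI aggregation strategy.

```python
import numpy as np

def morton3d(ix, iy, iz, bits=10):
    """Interleave the bits of three integer grid coordinates into one Morton key,
    so that particles close in 3D space tend to get nearby keys."""
    key = np.zeros(ix.shape, dtype=np.uint64)
    for b in range(bits):
        key |= ((ix >> b) & np.uint64(1)) << np.uint64(3 * b)
        key |= ((iy >> b) & np.uint64(1)) << np.uint64(3 * b + 1)
        key |= ((iz >> b) & np.uint64(1)) << np.uint64(3 * b + 2)
    return key

def spatially_aware_layout(positions, n_buckets, bits=10):
    """Sort particles along a Morton curve and split the sorted order into
    contiguous buckets, so each bucket (one aggregated write) covers a
    spatially compact region instead of an arbitrary per-process slice."""
    lo = positions.min(axis=0)
    hi = positions.max(axis=0)
    # Quantize positions onto a 2^bits grid per axis.
    span = np.maximum(hi - lo, 1e-12)
    grid = ((positions - lo) / span * ((1 << bits) - 1)).astype(np.uint64)
    keys = morton3d(grid[:, 0], grid[:, 1], grid[:, 2], bits)
    order = np.argsort(keys)                    # spatial (locality-preserving) sort
    buckets = np.array_split(order, n_buckets)  # contiguous groups of particle indices
    return order, buckets

if __name__ == "__main__":
    # Usage: group one million random particles into 8 spatially compact write buckets.
    pts = np.random.rand(1_000_000, 3)
    order, buckets = spatially_aware_layout(pts, n_buckets=8)
    print([len(b) for b in buckets])
```

Because consecutive buckets cover consecutive stretches of the space-filling curve, a spatial subset query at read time only needs to touch the buckets whose curve ranges intersect the query region, which is what makes this kind of layout attractive for the low-latency reads and filtering described above.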