Galley: Modern Query Optimization for Sparse Tensor Programs

Kyle Deeds, Willow Ahrens, Magda Balazinska, Dan Suciu
{"title":"Galley: Modern Query Optimization for Sparse Tensor Programs","authors":"Kyle Deeds, Willow Ahrens, Magda Balazinska, Dan Suciu","doi":"arxiv-2408.14706","DOIUrl":null,"url":null,"abstract":"The tensor programming abstraction has become the key . This framework allows\nusers to write high performance programs for bulk computation via a high-level\nimperative interface. Recent work has extended this paradigm to sparse tensors\n(i.e. tensors where most entries are not explicitly represented) with the use\nof sparse tensor compilers. These systems excel at producing efficient code for\ncomputation over sparse tensors, which may be stored in a wide variety of\nformats. However, they require the user to manually choose the order of\noperations and the data formats at every step. Unfortunately, these decisions\nare both highly impactful and complicated, requiring significant effort to\nmanually optimize. In this work, we present Galley, a system for declarative\nsparse tensor programming. Galley performs cost-based optimization to lower\nthese programs to a logical plan then to a physical plan. It then leverages\nsparse tensor compilers to execute the physical plan efficiently. We show that\nGalley achieves high performance on a wide variety of problems including\nmachine learning algorithms, subgraph counting, and iterative graph algorithms.","PeriodicalId":501197,"journal":{"name":"arXiv - CS - Programming Languages","volume":"59 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-08-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Programming Languages","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2408.14706","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

The tensor programming abstraction has become a key framework for high-performance computing. It allows users to write programs for bulk computation via a high-level imperative interface. Recent work has extended this paradigm to sparse tensors (i.e., tensors where most entries are not explicitly represented) through sparse tensor compilers. These systems excel at producing efficient code for computation over sparse tensors, which may be stored in a wide variety of formats. However, they require the user to manually choose the order of operations and the data formats at every step. Unfortunately, these decisions are both highly impactful and complicated, requiring significant effort to optimize manually. In this work, we present Galley, a system for declarative sparse tensor programming. Galley performs cost-based optimization to lower these programs first to a logical plan and then to a physical plan. It then leverages sparse tensor compilers to execute the physical plan efficiently. We show that Galley achieves high performance on a wide variety of problems, including machine learning algorithms, subgraph counting, and iterative graph algorithms.
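To make the stakes of these manual decisions concrete, the sketch below (which is not Galley's API; all names are illustrative, using scipy.sparse as a stand-in) compares two association orders for the same sparse expression A @ B @ x. Choosing the right order keeps every intermediate small, which is exactly the kind of decision Galley's cost-based optimizer automates.

```python
# Minimal sketch: why the order of operations matters for sparse tensor programs.
# Hypothetical example using scipy.sparse; not Galley's interface.
import numpy as np
import scipy.sparse as sp

rng = np.random.default_rng(0)
n = 10_000
A = sp.random(n, n, density=1e-4, format="csr", random_state=rng)
B = sp.random(n, n, density=1e-4, format="csr", random_state=rng)
x = sp.random(n, 1, density=1e-2, format="csc", random_state=rng)

# Plan 1: (A @ B) @ x materializes a sparse matrix intermediate A @ B.
AB = A @ B
y1 = AB @ x

# Plan 2: A @ (B @ x) keeps every intermediate a sparse vector.
Bx = B @ x
y2 = A @ Bx

print("nnz(A @ B) =", AB.nnz)   # large matrix intermediate
print("nnz(B @ x) =", Bx.nnz)   # small vector intermediate
print("results match:", abs(y1 - y2).max() < 1e-9)
```

Both plans compute the same result, but the intermediate in Plan 1 is orders of magnitude larger than in Plan 2; a declarative system can pick the cheaper plan (and suitable storage formats for each intermediate) without user intervention.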