{"title":"Africanus I. Scalable, distributed and efficient radio data processing with Dask-MS and Codex Africanus","authors":"S.J. Perkins , J.S. Kenyon , L.A.L. Andati , H.L. Bester , O.M. Smirnov , B.V. Hugo","doi":"10.1016/j.ascom.2025.100958","DOIUrl":null,"url":null,"abstract":"<div><div>The physical configuration of new radio interferometers such as MeerKAT, SKA, ngVLA and DSA-2000 informs the development of software in two important areas. Firstly, tractably processing the sheer quantity of data produced by new instruments necessitates subdivision and processing on multiple nodes. Secondly, the sensitivity inherent in modern instruments due to improved engineering practices and greater data quantities necessitates the development of new techniques to capitalize on the enhanced sensitivity of modern interferometers.</div><div>This produces a critical tension in radio astronomy software development: a fully optimized pipeline is desirable for producing science products in a tractable amount of time, but the design requirements for such a pipeline are unlikely to be understood upfront in the context of artefacts unveiled by greater instrument sensitivity. Therefore, new techniques must continuously be developed to address these artefacts and integrated into a full pipeline. As Knuth reminds us, “Premature optimization is the root of all evil”. This necessitates a fundamental trade-off between a trifecta of (1) performant code (2) flexibility and (3) ease-of-development. At one end of the spectrum, rigid design requirements are unlikely to capture the full scope of the problem, while throw-away research code is unsuitable for production use.</div><div>This work proposes a framework for the development of radio astronomy techniques within the above trifecta. In doing so, we favour flexibility and ease-of-development over performance, but this does not necessarily mean that the software developed within this framework is slow. Practically this translates to using data formats and software from the Open Source Community. For example, by using <span>NumPy</span> arrays and/or <span>Pandas</span> dataframes, a plethora of algorithms immediately become available to the scientific developer.</div><div>Focusing on performance, the breakdown of Moore’s Law in the 2010s and the resultant growth of both multi-core and distributed (including cloud) computing, a fundamental shift in the writing of radio astronomy algorithms and the storage of data is required: It is necessary to <em>shard</em> data over multiple processors and compute nodes, and to write algorithms that operate on these shards in parallel. The growth in data volumes compounds this requirement. Given the fundamental shift in compute architecture we believe this is central to the performance of any framework going forward, and is given especial emphasis in this one.</div><div>This paper describes two Python libraries, <span>Dask-MS</span> and <span>codex africanus</span> <!--> <!-->which enable the development of distributed High-Performance radio astronomy code with <span>Dask</span>. 
<span>Dask</span> is a lightweight Python parallelization and distribution framework that seamlessly integrates with the <span>PyData</span> ecosystem to address radio astronomy “Big Data“ challenges.</div></div>","PeriodicalId":48757,"journal":{"name":"Astronomy and Computing","volume":"52 ","pages":"Article 100958"},"PeriodicalIF":1.9000,"publicationDate":"2025-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Astronomy and Computing","FirstCategoryId":"101","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2213133725000319","RegionNum":4,"RegionCategory":"物理与天体物理","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ASTRONOMY & ASTROPHYSICS","Score":null,"Total":0}
Citations: 0
Abstract
The physical configuration of new radio interferometers such as MeerKAT, SKA, ngVLA and DSA-2000 informs the development of software in two important areas. Firstly, tractably processing the sheer quantity of data produced by new instruments necessitates subdivision and processing on multiple nodes. Secondly, the enhanced sensitivity of modern interferometers, a product of improved engineering practices and greater data quantities, necessitates the development of new techniques to fully exploit it.
This produces a critical tension in radio astronomy software development: a fully optimized pipeline is desirable for producing science products in a tractable amount of time, but the design requirements for such a pipeline are unlikely to be understood upfront in the context of artefacts unveiled by greater instrument sensitivity. Therefore, new techniques must continuously be developed to address these artefacts and be integrated into a full pipeline. As Knuth reminds us, “Premature optimization is the root of all evil”. This necessitates a fundamental trade-off within a trifecta of concerns: (1) performant code, (2) flexibility and (3) ease of development. At one end of the spectrum, rigid design requirements are unlikely to capture the full scope of the problem; at the other, throw-away research code is unsuitable for production use.
This work proposes a framework for the development of radio astronomy techniques within the above trifecta. In doing so, we favour flexibility and ease of development over performance, but this does not necessarily mean that software developed within this framework is slow. Practically, this translates to using data formats and software from the open-source community. For example, by using NumPy arrays and/or Pandas dataframes, a plethora of algorithms immediately becomes available to the scientific developer, as the sketch below illustrates.
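A minimal, illustrative example of the ecosystem benefit described above: once data sits in a NumPy array, algorithms from libraries such as SciPy apply directly. The array here is synthetic, standing in for real visibility amplitudes.

```python
import numpy as np
from scipy.ndimage import median_filter

# Synthetic amplitude data: 512 rows x 64 frequency channels
amplitudes = np.abs(np.random.standard_normal((512, 64)))

# An off-the-shelf algorithm applied with no conversion step:
# median-smooth each row along the channel axis
smoothed = median_filter(amplitudes, size=(1, 5))
```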
Focusing on performance, the breakdown of Moore’s Law in the 2010s and the resultant growth of both multi-core and distributed (including cloud) computing require a fundamental shift in how radio astronomy algorithms are written and how data is stored: it is necessary to shard data over multiple processors and compute nodes, and to write algorithms that operate on these shards in parallel. The growth in data volumes compounds this requirement. Given this fundamental shift in compute architecture, we believe sharding is central to the performance of any framework going forward, and it is given especial emphasis in this one.
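A brief sketch of the sharding idea using dask.array (the array shape and chunk sizes are illustrative, not from the paper): a visibility-sized array is split into chunks, and Dask schedules the per-chunk work across cores or cluster nodes in parallel.

```python
import dask.array as da

# 1M rows x 64 channels x 4 correlations, sharded into 100k-row chunks;
# each chunk is an independent unit of work in the task graph
vis = da.zeros((1_000_000, 64, 4), dtype="complex64",
               chunks=(100_000, 64, 4))

# Reductions over the chunks run as parallel tasks
mean_amp = da.absolute(vis).mean()
print(mean_amp.compute())  # executes the graph on threads, processes or a cluster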
This paper describes two Python libraries, Dask-MS and Codex Africanus, which enable the development of distributed, high-performance radio astronomy code with Dask. Dask is a lightweight Python parallelization and distribution framework that integrates seamlessly with the PyData ecosystem to address radio astronomy “Big Data” challenges.
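As a hedged sketch of how Dask-MS exposes Measurement Set data as lazy Dask arrays (the path "example.ms", column selection and chunk size here are placeholder assumptions, not values from the paper):

```python
import dask.array as da
from daskms import xds_from_ms

# Open a Measurement Set as a list of datasets, sharded into 100k-row chunks
datasets = xds_from_ms("example.ms",
                       columns=["UVW", "DATA", "FLAG"],
                       chunks={"row": 100_000})

for ds in datasets:
    # DATA is a lazy dask array; nothing is read from disk until compute()
    amp = da.absolute(ds.DATA.data)
    print(amp.mean().compute())
```

Because the returned arrays are ordinary Dask arrays, downstream code can combine them freely with other Dask-based operations before a single compute() call materializes the result.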
Journal: Astronomy and Computing
Categories: Astronomy & Astrophysics; Computer Science, Interdisciplinary Applications
CiteScore: 4.10
Self-citation rate: 8.00%
Articles per year: 67
Journal overview:
Astronomy and Computing is a peer-reviewed journal that focuses on the broad area between astronomy, computer science and information technology. The journal aims to publish the work of scientists and (software) engineers in all aspects of astronomical computing, including the collection, analysis, reduction, visualisation, preservation and dissemination of data, and the development of astronomical software and simulations. The journal covers applications of academic computer science techniques to astronomy, as well as novel applications of information technologies within astronomy.