I.P. Kemp, N.M. Pingel, R. Worth, J. Wake, D.A. Mitchell, S.D. Midgely, S.J. Tingay, J. Dempsey, H. Dénes, J.M. Dickey, S.J. Gibson, K.E. Jameson, C. Lynn, Y.K. Ma, A. Marchal, N.M. McClure-Griffiths, S. Stanimirović, J. Th. van Loon
Title: Processing of GASKAP-Hi pilot survey data using a commercial supercomputer
DOI: 10.1016/j.ascom.2024.100901
Journal: Astronomy and Computing, Volume 51, Article 100901
Published: 30 November 2024 (Journal Article)
Impact Factor: 1.9 (JCR Q2, Astronomy & Astrophysics; CAS Region 4, Physics and Astrophysics)
Open access: no
Citations: 0
Abstract
Modern radio telescopes generate large amounts of data, with the next generation Very Large Array (ngVLA) and the Square Kilometre Array (SKA) expected to feed up to 292 GB of visibilities per second to the science data processor (SDP). However, the continued exponential growth in the power of the world’s largest supercomputers suggests that for the foreseeable future there will be sufficient capacity available to provide for astronomers’ needs in processing ‘science ready’ products from the new generation of telescopes, with commercial platforms becoming an option for overflow capacity. The purpose of the current work is to trial the use of commercial high performance computing (HPC) for a large scale processing task in astronomy, in this case processing data from the GASKAP-Hi pilot surveys. We delineate a four-step process which can be followed by other researchers wishing to port an existing workflow from a public facility to a commercial provider. We used the process to provide reference images for an ongoing upgrade to ASKAPSoft (the ASKAP SDP software), and to provide science images for the GASKAP collaboration, using the joint deconvolution capability of WSClean. We document the approach to optimising the pipeline to minimise cost and elapsed time at the commercial provider, and give a resource estimate for processing future full survey data. Finally we document advantages, disadvantages, and lessons learned from the project, which will aid other researchers aiming to use commercial supercomputing for radio astronomy imaging. We found the key advantage to be immediate access and high availability, and the main disadvantage to be the need for improved HPC knowledge to take best advantage of the facility.
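The quoted peak rate of 292 GB of visibilities per second to the SDP implies a striking accumulated volume. A minimal back-of-envelope sketch of that arithmetic follows; the duty-cycle parameter is an illustrative assumption, not a figure from the paper:

```python
# Back-of-envelope: data volume implied by the peak SDP ingest rate.
# PEAK_RATE_GB_S comes from the abstract; a 100% duty cycle is an
# illustrative assumption (real observing schedules are lower).
PEAK_RATE_GB_S = 292
SECONDS_PER_DAY = 86_400

def daily_volume_pb(rate_gb_s: float, duty_cycle: float = 1.0) -> float:
    """Data volume in petabytes accumulated over one day of observing."""
    return rate_gb_s * SECONDS_PER_DAY * duty_cycle / 1e6  # GB -> PB

print(f"{daily_volume_pb(PEAK_RATE_GB_S):.1f} PB/day")  # 25.2 PB/day at peak
```

Even at a more realistic 50% duty cycle this is over 12 PB per day, which is the scale motivating the paper's interest in overflow capacity from commercial HPC providers.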
Astronomy and Computing. JCR categories: Astronomy & Astrophysics; Computer Science, Interdisciplinary Applications.
CiteScore: 4.10
Self-citation rate: 8.00%
Articles published per year: 67
About the journal:
Astronomy and Computing is a peer-reviewed journal focused on the broad area between astronomy, computer science, and information technology. It publishes the work of scientists and (software) engineers on all aspects of astronomical computing, including the collection, analysis, reduction, visualisation, preservation, and dissemination of data, and the development of astronomical software and simulations. The journal covers applications of computer science techniques to astronomy, as well as novel applications of information technology within astronomy.