Modeling the 3-point correlation function of projected scalar fields on the sphere

Abraham Arvizu, Alejandro Aviles, Juan Carlos Hidalgo, Eladio Moreno, Gustavo Niz, Mario A. Rodriguez-Meza, Sofía Samario and The LSST Dark Energy Science Collaboration

Journal of Cosmology and Astroparticle Physics (JCAP), published 2024-12-19
DOI: 10.1088/1475-7516/2024/12/049
Abstract
One of the main obstacles to extracting the signal of the three-point correlation function from photometric surveys, such as the Rubin Observatory Legacy Survey of Space and Time (LSST), will be the prohibitive computation time required to deal with a vast number of sources. Brute-force algorithms, which naively scale as 𝒪(N³) with the number of objects, can be improved with tree methods, but not enough to handle large-scale correlations in Rubin's data. However, a harmonic-basis decomposition of these higher-order statistics reduces the time dramatically, scaling with the number of objects like a two-point correlation function, so that the signal can be extracted in a reasonable amount of time. In this work, we develop the framework to use these expansions within the Limber approximation for scalar (or spin-0) fields, such as galaxy counts, weak-lensing convergence, or aperture masses. We develop an estimator to extract the signal from catalogs, together with different phenomenological and theoretical models for its description. The latter include the halo model and standard perturbation theory, to which we add a simple effective-field-theory prescription based on the short-range non-locality of cosmic fields, significantly improving the agreement with simulated data. In parallel with the modeling of the signal, we develop a code that can efficiently compute three-point correlations of more than 200 million data points (a full-sky simulation with Nside=4096) in ∼40 minutes, or in less than 10 minutes using an approximation in the search algorithm, on a single high-performance computing node, enabling a feasible analysis of the upcoming LSST data.
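The scaling argument in the abstract can be illustrated with a minimal sketch: instead of looping over all triplets (𝒪(N³)), one expands the pair counts around each "pivot" point in angular harmonics, so that multipole moments of the 3PCF follow from products of pair-count coefficients, at a cost of a pair loop times the number of multipoles. This is a toy 2D flat-sky illustration of the general multipole idea only, not the authors' estimator or code; all function and variable names here are hypothetical.

```python
import numpy as np

def three_pcf_multipoles(points, r_edges, ell_max=5):
    """Toy 3PCF estimator via harmonic (multipole) decomposition.

    For each 'pivot' point, neighbours in each radial bin are expanded in
    angular harmonics a_l(r) = sum_j exp(i*l*phi_j). Multipole moments of
    the 3PCF then follow from products a_l(r1) * conj(a_l(r2)), so the
    naive triple loop O(N^3) collapses to O(N^2 * ell_max).
    """
    n_bins = len(r_edges) - 1
    zeta = np.zeros((ell_max + 1, n_bins, n_bins))
    for p in points:
        d = points - p                       # separations from the pivot
        r = np.hypot(d[:, 0], d[:, 1])       # radial distance
        phi = np.arctan2(d[:, 1], d[:, 0])   # polar angle around the pivot
        a = np.zeros((ell_max + 1, n_bins), dtype=complex)
        for b in range(n_bins):
            sel = (r >= r_edges[b]) & (r < r_edges[b + 1])
            for ell in range(ell_max + 1):
                a[ell, b] = np.exp(1j * ell * phi[sel]).sum()
        # combine pair-count harmonics into 3PCF multipoles (real part)
        zeta += np.real(a[:, :, None] * np.conj(a[:, None, :]))
    return zeta
```

Because the angular integral over the opening angle of the triangle is done analytically by the harmonic expansion, only pair separations ever need to be binned, which is the source of the two-point-like scaling quoted in the abstract.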
Journal overview:
Journal of Cosmology and Astroparticle Physics (JCAP) encompasses theoretical, observational and experimental areas as well as computation and simulation. The journal covers the latest developments in the theory of all fundamental interactions and their cosmological implications (e.g. M-theory and cosmology, brane cosmology). JCAP's coverage also includes topics such as formation, dynamics and clustering of galaxies, pre-galactic star formation, x-ray astronomy, radio astronomy, gravitational lensing, active galactic nuclei, intergalactic and interstellar matter.