{"title":"大规模21厘米光锥图像的多保真仿真:一种基于生成对抗网络的少镜头迁移学习方法","authors":"Kangning Diao and Yi Mao","doi":"10.3847/1538-4357/ae0325","DOIUrl":null,"url":null,"abstract":"Emulators using machine learning techniques have emerged to efficiently generate mock data matching the large survey volume for upcoming experiments, as an alternative approach to large-scale numerical simulations. However, high-fidelity emulators have become computationally expensive as the simulation volume grows to hundreds of megaparsecs. Here, we present a multifidelity emulation of large-scale 21 cm lightcone images from the epoch of reionization, which is realized by applying the few-shot transfer learning to training generative adversarial networks (GAN) from small-scale to large-scale simulations. Specifically, a GAN emulator is first trained with a huge number of small-scale simulations, and then transfer-learned with only a limited number of large-scale simulations, to emulate large-scale 21 cm lightcone images. We test the precision of our transfer-learned GAN emulator in terms of representative statistics including global 21 cm brightness temperature history, 2D power spectrum, and scattering transform coefficients. We demonstrate that the lightcone images generated by the transfer-learned GAN emulator can reach the percentage level precision in most cases on small scales, and the error on large scales only increases mildly to the level of a few tens of percent. Nevertheless, our multifidelity emulation technique saves a significant portion of computational resources that are mostly consumed for generating training samples for GAN. On estimate, the computational resource by training GAN completely with large-scale simulations would be 1 to 2 orders of magnitude larger than using our multifidelity technique. This implies that our technique allows for emulating high-fidelity, traditionally computationally prohibitive, images in an economic manner.","PeriodicalId":501813,"journal":{"name":"The Astrophysical Journal","volume":"124 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2025-10-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Multifidelity Emulator for Large-scale 21 cm Lightcone Images: A Few-shot Transfer Learning Approach with Generative Adversarial Network\",\"authors\":\"Kangning Diao and Yi Mao\",\"doi\":\"10.3847/1538-4357/ae0325\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Emulators using machine learning techniques have emerged to efficiently generate mock data matching the large survey volume for upcoming experiments, as an alternative approach to large-scale numerical simulations. However, high-fidelity emulators have become computationally expensive as the simulation volume grows to hundreds of megaparsecs. Here, we present a multifidelity emulation of large-scale 21 cm lightcone images from the epoch of reionization, which is realized by applying the few-shot transfer learning to training generative adversarial networks (GAN) from small-scale to large-scale simulations. Specifically, a GAN emulator is first trained with a huge number of small-scale simulations, and then transfer-learned with only a limited number of large-scale simulations, to emulate large-scale 21 cm lightcone images. We test the precision of our transfer-learned GAN emulator in terms of representative statistics including global 21 cm brightness temperature history, 2D power spectrum, and scattering transform coefficients. 
We demonstrate that the lightcone images generated by the transfer-learned GAN emulator can reach the percentage level precision in most cases on small scales, and the error on large scales only increases mildly to the level of a few tens of percent. Nevertheless, our multifidelity emulation technique saves a significant portion of computational resources that are mostly consumed for generating training samples for GAN. On estimate, the computational resource by training GAN completely with large-scale simulations would be 1 to 2 orders of magnitude larger than using our multifidelity technique. This implies that our technique allows for emulating high-fidelity, traditionally computationally prohibitive, images in an economic manner.\",\"PeriodicalId\":501813,\"journal\":{\"name\":\"The Astrophysical Journal\",\"volume\":\"124 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2025-10-12\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"The Astrophysical Journal\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.3847/1538-4357/ae0325\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"The Astrophysical Journal","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3847/1538-4357/ae0325","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Multifidelity Emulator for Large-scale 21 cm Lightcone Images: A Few-shot Transfer Learning Approach with Generative Adversarial Network
Emulators based on machine learning techniques have emerged as an alternative to large-scale numerical simulations, efficiently generating mock data that match the large survey volumes of upcoming experiments. However, high-fidelity emulators become computationally expensive as the simulation volume grows to hundreds of megaparsecs. Here, we present a multifidelity emulation of large-scale 21 cm lightcone images from the epoch of reionization, realized by applying few-shot transfer learning to train generative adversarial networks (GANs) from small-scale to large-scale simulations. Specifically, a GAN emulator is first trained with a large number of small-scale simulations and is then transfer-learned with only a limited number of large-scale simulations, in order to emulate large-scale 21 cm lightcone images. We test the precision of our transfer-learned GAN emulator with representative statistics, including the global 21 cm brightness temperature history, the 2D power spectrum, and scattering transform coefficients. We demonstrate that the lightcone images generated by the transfer-learned GAN emulator reach percent-level precision in most cases on small scales, while the error on large scales increases only mildly, to the level of a few tens of percent. Nevertheless, our multifidelity emulation technique saves a significant portion of the computational resources, which are mostly consumed in generating training samples for the GAN. We estimate that training the GAN entirely with large-scale simulations would require one to two orders of magnitude more computational resources than our multifidelity technique. This implies that our technique allows high-fidelity images, traditionally computationally prohibitive to produce, to be emulated in an economical manner.
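To make the two-stage training strategy described in the abstract concrete, the following is a minimal, hypothetical sketch in PyTorch: a GAN is first pretrained on many cheap small-scale simulation slices and then fine-tuned (transfer-learned) on only a handful of expensive large-scale slices with a reduced learning rate. The network architecture, image resolution, dataset sizes, and hyperparameters are illustrative placeholders and do not reproduce the emulator used in the paper.

```python
# Illustrative sketch only: two-stage (pretrain + few-shot fine-tune) GAN training.
# All shapes, layer sizes, and learning rates are hypothetical stand-ins.
import torch
import torch.nn as nn

LATENT = 128  # assumed latent dimension

class Generator(nn.Module):
    """Maps a latent vector to a 64x64 toy lightcone slice."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(LATENT, 256, 4, 1, 0), nn.BatchNorm2d(256), nn.ReLU(True),
            nn.ConvTranspose2d(256, 128, 4, 2, 1), nn.BatchNorm2d(128), nn.ReLU(True),
            nn.ConvTranspose2d(128, 64, 4, 2, 1), nn.BatchNorm2d(64), nn.ReLU(True),
            nn.ConvTranspose2d(64, 32, 4, 2, 1), nn.BatchNorm2d(32), nn.ReLU(True),
            nn.ConvTranspose2d(32, 1, 4, 2, 1), nn.Tanh(),
        )
    def forward(self, z):
        return self.net(z.view(-1, LATENT, 1, 1))

class Discriminator(nn.Module):
    """Classifies 64x64 slices as real (simulated) or generated."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 64, 4, 2, 1), nn.LeakyReLU(0.2, True),
            nn.Conv2d(64, 128, 4, 2, 1), nn.LeakyReLU(0.2, True),
            nn.Conv2d(128, 256, 4, 2, 1), nn.LeakyReLU(0.2, True),
            nn.Conv2d(256, 1, 8, 1, 0),
        )
    def forward(self, x):
        return self.net(x).view(-1)

def train_gan(G, D, loader, epochs, lr):
    """Standard non-saturating GAN updates; reused for both training stages."""
    bce = nn.BCEWithLogitsLoss()
    opt_g = torch.optim.Adam(G.parameters(), lr=lr, betas=(0.5, 0.999))
    opt_d = torch.optim.Adam(D.parameters(), lr=lr, betas=(0.5, 0.999))
    for _ in range(epochs):
        for real in loader:
            z = torch.randn(real.size(0), LATENT)
            fake = G(z)
            # Discriminator step: real simulation slices vs. generated slices.
            opt_d.zero_grad()
            loss_d = bce(D(real), torch.ones(real.size(0))) + \
                     bce(D(fake.detach()), torch.zeros(real.size(0)))
            loss_d.backward()
            opt_d.step()
            # Generator step: fool the discriminator.
            opt_g.zero_grad()
            loss_g = bce(D(fake), torch.ones(real.size(0)))
            loss_g.backward()
            opt_g.step()

if __name__ == "__main__":
    G, D = Generator(), Discriminator()
    # Stage 1: pretrain on many cheap small-scale simulation slices (random stand-ins here).
    small_scale = torch.utils.data.DataLoader(torch.randn(512, 1, 64, 64), batch_size=32)
    train_gan(G, D, small_scale, epochs=5, lr=2e-4)
    # Stage 2: few-shot transfer learning on a handful of expensive large-scale slices,
    # reusing the pretrained weights and a smaller learning rate.
    large_scale = torch.utils.data.DataLoader(torch.randn(16, 1, 64, 64), batch_size=8)
    train_gan(G, D, large_scale, epochs=5, lr=5e-5)
```

The key design point illustrated here is that the expensive large-scale simulations enter only in the short second stage, so the bulk of the training data can come from small-scale runs; the actual few-shot adaptation scheme used by the authors may differ in architecture and regularization.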