Astroconformer: The prospects of analyzing stellar light curves with transformer-based deep learning models

Impact Factor 4.7 · JCR Q1, Astronomy & Astrophysics · CAS Tier 3, Physics and Astrophysics
Jia-Shu Pan, Yuan-Sen Ting, Jie Yu
{"title":"Astroconformer: The prospects of analyzing stellar light curves with transformer-based deep learning models","authors":"Jia-Shu Pan, Yuan-Sen Ting, Jie Yu","doi":"10.1093/mnras/stae068","DOIUrl":null,"url":null,"abstract":"Stellar light curves contain valuable information about oscillations and granulation, offering insights into stars’ internal structures and evolutionary states. Traditional asteroseismic techniques, primarily focused on power spectral analysis, often overlook the crucial phase information in these light curves. Addressing this gap, recent machine learning applications, particularly those using Convolutional Neural Networks (CNNs), have made strides in inferring stellar properties from light curves. However, CNNs are limited by their localized feature extraction capabilities. In response, we introduce Astroconformer, a Transformer-based deep learning framework, specifically designed to capture long-range dependencies in stellar light curves. Our empirical analysis centers on estimating surface gravity (log g), using a dataset derived from single-quarter Kepler light curves with log g values ranging from 0.2 to 4.4. Astroconformer demonstrates superior performance, achieving a root-mean-square-error (RMSE) of 0.017 dex at log g ≈ 3 in data-rich regimes and up to 0.1 dex in sparser areas. This performance surpasses both K-nearest neighbor models and advanced CNNs. Ablation studies highlight the influence of receptive field size on model effectiveness, with larger fields correlating to improved results. Astroconformer also excels in extracting νmax with high precision. It achieves less than 2 % relative median absolute error for 90-day red giant light curves. Notably, the error remains under 3 % for 30-day light curves, whose oscillations are undetectable by a conventional pipeline in 30 % cases. Furthermore, the attention mechanisms in Astroconformer align closely with the characteristics of stellar oscillations and granulation observed in light curves.","PeriodicalId":18930,"journal":{"name":"Monthly Notices of the Royal Astronomical Society","volume":"26 1","pages":""},"PeriodicalIF":4.7000,"publicationDate":"2024-01-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Monthly Notices of the Royal Astronomical Society","FirstCategoryId":"101","ListUrlMain":"https://doi.org/10.1093/mnras/stae068","RegionNum":3,"RegionCategory":"物理与天体物理","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ASTRONOMY & ASTROPHYSICS","Score":null,"Total":0}
Citations: 0

Abstract

Stellar light curves contain valuable information about oscillations and granulation, offering insights into stars’ internal structures and evolutionary states. Traditional asteroseismic techniques, primarily focused on power spectral analysis, often overlook the crucial phase information in these light curves. Addressing this gap, recent machine learning applications, particularly those using Convolutional Neural Networks (CNNs), have made strides in inferring stellar properties from light curves. However, CNNs are limited by their localized feature extraction capabilities. In response, we introduce Astroconformer, a Transformer-based deep learning framework specifically designed to capture long-range dependencies in stellar light curves. Our empirical analysis centers on estimating surface gravity (log g), using a dataset derived from single-quarter Kepler light curves with log g values ranging from 0.2 to 4.4. Astroconformer demonstrates superior performance, achieving a root-mean-square error (RMSE) of 0.017 dex at log g ≈ 3 in data-rich regimes and up to 0.1 dex in sparser areas. This performance surpasses both K-nearest neighbor models and advanced CNNs. Ablation studies highlight the influence of receptive field size on model effectiveness, with larger fields correlating to improved results. Astroconformer also excels in extracting νmax with high precision, achieving less than 2% relative median absolute error for 90-day red giant light curves. Notably, the error remains under 3% for 30-day light curves, whose oscillations are undetectable by a conventional pipeline in 30% of cases.
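To make the core idea concrete, the sketch below shows a minimal Transformer-encoder regressor that maps a 1D flux time series to a scalar such as log g. It is an illustrative assumption-laden toy, not the authors' Astroconformer: the class name LightCurveTransformer, the patch length, model width, and depth are invented for the example, a plain nn.TransformerEncoder stands in for the paper's convolution-augmented (conformer-style) blocks, and positional encoding is omitted for brevity even though a real model would need it.

```python
# Illustrative sketch only (assumptions noted above), not the published
# Astroconformer implementation.
import torch
import torch.nn as nn


class LightCurveTransformer(nn.Module):
    def __init__(self, patch_len=128, d_model=64, n_heads=4, n_layers=4):
        super().__init__()
        self.patch_len = patch_len
        # Embed each fixed-length flux patch into a d_model-dimensional token.
        self.embed = nn.Linear(patch_len, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads,
            dim_feedforward=4 * d_model, batch_first=True)
        # Self-attention lets every token attend to every other token, which is
        # how a Transformer captures long-range dependencies across the curve.
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # NOTE: no positional encoding here; a real model would add one so the
        # encoder is not permutation-invariant over patches.
        self.head = nn.Linear(d_model, 1)  # regress a single scalar, e.g. log g

    def forward(self, flux):
        # flux: (batch, n_points); truncate to a whole number of patches.
        b, n = flux.shape
        n_patches = n // self.patch_len
        tokens = flux[:, : n_patches * self.patch_len].reshape(
            b, n_patches, self.patch_len)
        x = self.encoder(self.embed(tokens))          # (batch, n_patches, d_model)
        return self.head(x.mean(dim=1)).squeeze(-1)   # pool tokens -> scalar


if __name__ == "__main__":
    # A single-quarter Kepler long-cadence light curve has on the order of
    # a few thousand points; random data is used here just to check shapes.
    model = LightCurveTransformer()
    fake_flux = torch.randn(2, 4000)
    print(model(fake_flux).shape)  # torch.Size([2])
```

In this toy setup the model is trained as an ordinary regressor (e.g. with an MSE loss against catalog log g values); the paper's reported RMSE and relative median absolute error are evaluation metrics on such predictions, not part of the architecture itself.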
Source Journal
Monthly Notices of the Royal Astronomical Society
CiteScore: 9.10
Self-citation rate: 37.50%
Articles published per year: 3198
Review time: 3 months
Journal description: Monthly Notices of the Royal Astronomical Society is one of the world's leading primary research journals in astronomy and astrophysics, as well as one of the longest established. It publishes the results of original research in positional and dynamical astronomy, astrophysics, radio astronomy, cosmology, space research and the design of astronomical instruments.