Forecasting of Global Ionosphere Maps With Multi-Day Lead Time Using Transformer-Based Neural Networks

IF 3.7 · CAS Q2 (Earth Science)
Space Weather · Pub Date: 2024-01-30 · DOI: 10.1029/2023sw003579
Chung-Yu Shih, Cissi Ying-tsen Lin, Shu-Yu Lin, Cheng-Hung Yeh, Yu-Ming Huang, Feng-Nan Hwang, Chia-Hui Chang
Citations: 0

Abstract

Ionospheric total electron content (TEC) is a key indicator of the space environment. Geophysical forcing from above and below drives its spatial and temporal variations. For physical models to reproduce simulations that agree with observations, a full understanding of the physical and chemical principles, well-represented driving inputs, and sufficient computational power are required, which can be challenging at times. Data-driven approaches such as deep learning have therefore surged recently as a means of TEC prediction. Because the geophysical world is sequential in both time and space, this study proposes and evaluates Transformer architectures for sequence-to-sequence TEC prediction. We discuss the impact of the input time lengths chosen during training and analyze what the neural network has learned from the data sets. Our results suggest that a 12-layer, 128-hidden-unit Transformer architecture provides multi-step global TEC predictions for 48 hr with an overall root-mean-square error (RMSE) of ∼1.8 TECU. The hourly RMSE grows from 0.6 TECU to about 2.0 TECU over the prediction time frame.
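The abstract describes multi-step ("sequence-to-sequence") forecasting over a 48-hr horizon, evaluated by RMSE in TEC units (TECU). As a minimal illustrative sketch (not the authors' code), the snippet below shows how an hourly TEC series could be sliced into input/target windows for such a setup, and how RMSE in TECU is computed; the function names and the 24-hr input length in the usage note are assumptions, not details taken from the paper.

```python
import math

def make_windows(tec_series, in_len, out_len):
    """Slice an hourly TEC series into (input, target) window pairs for
    sequence-to-sequence training: each input covers in_len hours and its
    target covers the following out_len hours."""
    n = len(tec_series) - in_len - out_len + 1
    inputs = [tec_series[i : i + in_len] for i in range(n)]
    targets = [tec_series[i + in_len : i + in_len + out_len] for i in range(n)]
    return inputs, targets

def rmse_tecu(pred, truth):
    """Root-mean-square error between predicted and observed TEC, in TECU."""
    squared_errors = [(p - t) ** 2 for p, t in zip(pred, truth)]
    return math.sqrt(sum(squared_errors) / len(squared_errors))
```

For example, with a (hypothetical) 24-hr input window, `make_windows(series, 24, 48)` pairs each day of history with the next 48 hourly values, matching the paper's 48-hr prediction horizon; the abstract's varying "input time lengths" would correspond to sweeping `in_len`.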
Source journal: Space Weather · Self-citation rate: 29.70% · Articles published: 166