Transformer-based forecasting for high-frequency natural gas production data

Saifei Ma, Tiantian Zhang, Haibo Wang, Haoyu Wang, Nan Li, Haiwen Zhu, Jianjun Zhu, Jianli Wang

Energy and AI, Volume 21, Article 100535 (published 2025-06-24). DOI: 10.1016/j.egyai.2025.100535
Abstract
Accurate prediction of natural gas well production data is crucial for effective resource management and innovation, particularly amid the global transition to sustainable energy. Traditional models struggle with the high-frequency, high-dimensional datasets generated by digital transformation in the oil and gas industry. This study explores four Transformer-based models for forecasting high-frequency natural gas production data: Transformer, Informer, Autoformer, and the Patch Time Series Transformer (PatchTST). These models use self-attention mechanisms to capture long-term dependencies and to process large-scale datasets efficiently. Autoformer performs well thanks to its Seasonal Decomposition Attention mechanism, which effectively extracts trend and seasonality patterns. However, our experiments show that Autoformer is sensitive to dataset changes: its performance declines when previously trained parameters are reused instead of retraining on the new data, highlighting its reliance on dataset-specific retraining. Experimental results demonstrate that increasing the sampling frequency significantly improves prediction accuracy, reducing MAPE from 0.556 to 0.239. These models also track actual production trends consistently across extended forecast horizons. Notably, PatchTST maintains stable performance with either pretrained or retrained parameters, demonstrating superior adaptability and generalization; this makes it particularly suitable for real-world applications where frequent retraining is not feasible. Overall, the findings validate the applicability of Transformer-based models, particularly PatchTST, for dynamic and precise natural gas production forecasting, and the study offers valuable insights for advancing adaptive, data-driven resource management strategies.
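For context, the MAPE values quoted above (0.556 versus 0.239) appear to be reported as fractions rather than percentages. A minimal sketch of the metric follows, using hypothetical production readings; masking out zero-production intervals is an assumption for numerical safety, not a detail stated in the abstract.

```python
import numpy as np

def mape(actual: np.ndarray, predicted: np.ndarray) -> float:
    """Mean absolute percentage error, returned as a fraction (0.556 ~ 55.6 %)."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    # Assumption: skip zero-production points to avoid division by zero.
    mask = actual != 0
    return float(np.mean(np.abs((actual[mask] - predicted[mask]) / actual[mask])))

# Hypothetical hourly gas-rate readings and model forecasts for one well:
actual = np.array([12.1, 11.8, 12.4, 12.0])
predicted = np.array([11.5, 12.2, 12.9, 11.6])
print(mape(actual, predicted))  # average relative forecast error
```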
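The abstract attributes PatchTST's robustness to its patch-based tokenization, in which the input series is segmented into subseries-level patches before self-attention is applied. The sketch below illustrates only that segmentation step; the patch length of 16 and stride of 8 are illustrative defaults, not values taken from the paper.

```python
import torch

def make_patches(series: torch.Tensor, patch_len: int = 16, stride: int = 8) -> torch.Tensor:
    """Split a (batch, length) series into (batch, n_patches, patch_len) tokens,
    the subseries-level inputs a PatchTST-style encoder attends over."""
    # unfold yields sliding windows of length patch_len taken every stride steps
    return series.unfold(dimension=-1, size=patch_len, step=stride)

# Hypothetical high-frequency gas-rate window: 2 wells, 96 readings each
x = torch.randn(2, 96)
patches = make_patches(x)
print(patches.shape)  # torch.Size([2, 11, 16])
```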