{"title":"基于变压器的时间序列自监督预训练模型","authors":"Zhengrong Sun, Junhai Zhai, Yang Cao, Feng Zhang","doi":"10.1016/j.asoc.2025.113491","DOIUrl":null,"url":null,"abstract":"<div><div>Multivariate time series forecasting is ubiquitous in the real world. The performance of prediction model is determined by its representation ability. At present, self-supervised pre-training is the main method to improve the representation ability of prediction models. However, the periodic characteristics of time series are rarely considered in the existing pre-training models. Our experimental study shows that the periodic characteristics of time series have a great impact on the performance of self-supervised pre-training models. To address this issue, we propose a novel self-supervised prediction model, SMformer. SMformer has two distinctive features: (1) A new patch partition Module is innovatively introduced into backbone model transformer using the periodic property of time series. (2) Two pretext tasks, shuffle and mask, are design for the self-supervised pre-training of the model SMformer. We conducted extensive experiments on seven benchmark datasets, and the experimental results demonstrate that SMformer significantly outperforms prior comparison baselines.</div></div>","PeriodicalId":50737,"journal":{"name":"Applied Soft Computing","volume":"181 ","pages":"Article 113491"},"PeriodicalIF":6.6000,"publicationDate":"2025-06-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A transformer-based self-supervised pre-training model for time series prediction\",\"authors\":\"Zhengrong Sun, Junhai Zhai, Yang Cao, Feng Zhang\",\"doi\":\"10.1016/j.asoc.2025.113491\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Multivariate time series forecasting is ubiquitous in the real world. The performance of prediction model is determined by its representation ability. At present, self-supervised pre-training is the main method to improve the representation ability of prediction models. However, the periodic characteristics of time series are rarely considered in the existing pre-training models. Our experimental study shows that the periodic characteristics of time series have a great impact on the performance of self-supervised pre-training models. To address this issue, we propose a novel self-supervised prediction model, SMformer. SMformer has two distinctive features: (1) A new patch partition Module is innovatively introduced into backbone model transformer using the periodic property of time series. (2) Two pretext tasks, shuffle and mask, are design for the self-supervised pre-training of the model SMformer. 
We conducted extensive experiments on seven benchmark datasets, and the experimental results demonstrate that SMformer significantly outperforms prior comparison baselines.</div></div>\",\"PeriodicalId\":50737,\"journal\":{\"name\":\"Applied Soft Computing\",\"volume\":\"181 \",\"pages\":\"Article 113491\"},\"PeriodicalIF\":6.6000,\"publicationDate\":\"2025-06-28\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Applied Soft Computing\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S1568494625008026\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Applied Soft Computing","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1568494625008026","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
A transformer-based self-supervised pre-training model for time series prediction
Multivariate time series forecasting is ubiquitous in the real world. The performance of a prediction model is determined by its representation ability, and self-supervised pre-training is currently the main approach to improving the representation ability of prediction models. However, existing pre-training models rarely consider the periodic characteristics of time series. Our experimental study shows that these periodic characteristics have a great impact on the performance of self-supervised pre-training models. To address this issue, we propose a novel self-supervised prediction model, SMformer. SMformer has two distinctive features: (1) a new patch partition module, which exploits the periodic property of time series, is introduced into the transformer backbone; (2) two pretext tasks, shuffle and mask, are designed for the self-supervised pre-training of SMformer. We conducted extensive experiments on seven benchmark datasets, and the results demonstrate that SMformer significantly outperforms prior comparison baselines.
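The abstract only names the two ingredients (period-aware patch partition, and shuffle/mask pretext tasks), so the following is a minimal sketch of what such a pipeline could look like, not the paper's actual SMformer implementation. The FFT-based period estimate, the function names, and the corruption scheme are all illustrative assumptions.

```python
# Hedged sketch: period-aware patching plus shuffle/mask pretext tasks.
# All names and design details here are assumptions, not SMformer's actual code.
import numpy as np

def estimate_period(x: np.ndarray) -> int:
    """Estimate the dominant period of a 1-D series from the FFT amplitude peak."""
    spectrum = np.abs(np.fft.rfft(x - x.mean()))
    spectrum[0] = 0.0                      # drop the DC component
    k = int(np.argmax(spectrum))           # dominant frequency bin
    return max(1, len(x) // max(k, 1))     # period in time steps

def patch_partition(x: np.ndarray, period: int) -> np.ndarray:
    """Split a series into non-overlapping patches of one period each."""
    n_patches = len(x) // period
    return x[: n_patches * period].reshape(n_patches, period)

def shuffle_task(patches: np.ndarray, rng: np.random.Generator):
    """Pretext task 1: permute the patches; the model must recover the order."""
    perm = rng.permutation(len(patches))
    return patches[perm], perm             # shuffled input, target ordering

def mask_task(patches: np.ndarray, rng: np.random.Generator, ratio: float = 0.4):
    """Pretext task 2: zero out a fraction of patches for reconstruction."""
    mask = rng.random(len(patches)) < ratio
    corrupted = patches.copy()
    corrupted[mask] = 0.0
    return corrupted, mask                 # corrupted input, mask as target

rng = np.random.default_rng(0)
t = np.arange(240, dtype=float)
series = np.sin(2 * np.pi * t / 24) + 0.1 * rng.standard_normal(t.size)
p = estimate_period(series)                # ~24 for this synthetic series
patches = patch_partition(series, p)       # (10, 24) patch matrix
shuffled, order = shuffle_task(patches, rng)
masked, where = mask_task(patches, rng)
print(p, patches.shape, shuffled.shape, int(where.sum()))
```

Under these assumptions, the shuffled and masked patch sequences would be fed to the transformer backbone, with order recovery and patch reconstruction as the pre-training objectives before fine-tuning on forecasting.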
Journal introduction:
Applied Soft Computing is an international journal promoting an integrated view of soft computing to solve real-life problems. The focus is to publish the highest quality research on the application and convergence of Fuzzy Logic, Neural Networks, Evolutionary Computing, Rough Sets, and other similar techniques to address real-world complexities.
Applied Soft Computing is a rolling publication: articles are published as soon as the editor-in-chief has accepted them. The website is therefore continuously updated with new articles, and publication times are short.