{"title":"基于新型推理基础模型的少镜头学习的准确预测","authors":"Peng-Cheng Li, Yan-Wu Wang, Jiang-Wen Xiao","doi":"10.1016/j.inffus.2025.103370","DOIUrl":null,"url":null,"abstract":"<div><div>Accurate forecasting with limited data remains a significant challenge, especially for deep learning models that require large-scale training data to map historical data to future data. While Meta-learning (MeL) and Transfer Learning (TL) are useful, they have limitations: MeL assumes shared task structures, which may not apply to unique tasks, and TL requires domain similarity, often failing when distributions differ. Importantly, this paper reveals that future trend changes are often embedded in historical data, regardless of dataset size. However, deep learning models struggle to learn these trends from small training datasets due to their reliance on extensive historical information for mapping past to future. To address this gap, a novel inference foundation model is designed to uncover intrinsic change patterns within the data rather than relying on extensive historical information. Inspired by gene evolution, our approach decomposes historical data into subsequences (genes), selects optimal genes, and combines them into evolutionary chains based on temporal relationships. Each chain represents potential future trends. Through five generations of selection and recombination, the best gene sequence is identified for forecasting. The proposed model outperforms all state-of-the-art models across three experiments involving eight datasets. Specifically, it achieves a 27% improvement over the best-performing MeL-based and TL-based models. Furthermore, it shows an average improvement of 38% over other leading models, including Transformer-based, Multiscale-based, Linear-based, MLP-based, and Convolution-based models.</div></div>","PeriodicalId":50367,"journal":{"name":"Information Fusion","volume":"124 ","pages":"Article 103370"},"PeriodicalIF":15.5000,"publicationDate":"2025-06-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Accurate forecasting on few-shot learning with a novel inference foundation model\",\"authors\":\"Peng-Cheng Li, Yan-Wu Wang, Jiang-Wen Xiao\",\"doi\":\"10.1016/j.inffus.2025.103370\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Accurate forecasting with limited data remains a significant challenge, especially for deep learning models that require large-scale training data to map historical data to future data. While Meta-learning (MeL) and Transfer Learning (TL) are useful, they have limitations: MeL assumes shared task structures, which may not apply to unique tasks, and TL requires domain similarity, often failing when distributions differ. Importantly, this paper reveals that future trend changes are often embedded in historical data, regardless of dataset size. However, deep learning models struggle to learn these trends from small training datasets due to their reliance on extensive historical information for mapping past to future. To address this gap, a novel inference foundation model is designed to uncover intrinsic change patterns within the data rather than relying on extensive historical information. Inspired by gene evolution, our approach decomposes historical data into subsequences (genes), selects optimal genes, and combines them into evolutionary chains based on temporal relationships. Each chain represents potential future trends. 
Through five generations of selection and recombination, the best gene sequence is identified for forecasting. The proposed model outperforms all state-of-the-art models across three experiments involving eight datasets. Specifically, it achieves a 27% improvement over the best-performing MeL-based and TL-based models. Furthermore, it shows an average improvement of 38% over other leading models, including Transformer-based, Multiscale-based, Linear-based, MLP-based, and Convolution-based models.</div></div>\",\"PeriodicalId\":50367,\"journal\":{\"name\":\"Information Fusion\",\"volume\":\"124 \",\"pages\":\"Article 103370\"},\"PeriodicalIF\":15.5000,\"publicationDate\":\"2025-06-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Information Fusion\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S1566253525004439\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Information Fusion","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1566253525004439","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Accurate forecasting on few-shot learning with a novel inference foundation model
Accurate forecasting with limited data remains a significant challenge, especially for deep learning models that require large-scale training data to map historical data to future data. While Meta-learning (MeL) and Transfer Learning (TL) are useful, they have limitations: MeL assumes shared task structures, which may not apply to unique tasks, and TL requires domain similarity, often failing when distributions differ. Importantly, this paper reveals that future trend changes are often embedded in historical data, regardless of dataset size. However, deep learning models struggle to learn these trends from small training datasets due to their reliance on extensive historical information for mapping past to future. To address this gap, a novel inference foundation model is designed to uncover intrinsic change patterns within the data rather than relying on extensive historical information. Inspired by gene evolution, our approach decomposes historical data into subsequences (genes), selects optimal genes, and combines them into evolutionary chains based on temporal relationships. Each chain represents potential future trends. Through five generations of selection and recombination, the best gene sequence is identified for forecasting. The proposed model outperforms all state-of-the-art models across three experiments involving eight datasets. Specifically, it achieves a 27% improvement over the best-performing MeL-based and TL-based models. Furthermore, it shows an average improvement of 38% over other leading models, including Transformer-based, Multiscale-based, Linear-based, MLP-based, and Convolution-based models.
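The abstract only outlines the gene-evolution pipeline at a high level: decompose history into subsequence "genes", select the fittest ones, chain and recombine them over five generations, and forecast with the best resulting sequence. The short Python sketch below is an illustrative mental model of that idea, not the authors' published algorithm; the function name `evolve_forecast`, the similarity-based fitness, the join-error selection rule, and all parameter defaults are assumptions introduced here for clarity.

```python
import numpy as np


def evolve_forecast(series, gene_len=16, horizon=8, generations=5,
                    pool_size=32, seed=0):
    """Toy gene-evolution-style forecast (illustrative sketch only).

    Genes are historical subsequences of length `gene_len`; each gene's
    "continuation" is the `horizon` values that followed it in history.
    Genes similar to the most recent window are selected, their
    continuations are recombined by single-point crossover over several
    generations, and the fittest continuation is returned as the forecast.
    """
    rng = np.random.default_rng(seed)
    series = np.asarray(series, dtype=float)
    recent = series[-gene_len:]                      # latest observed pattern

    # 1. Decompose history into (gene, continuation) pairs.
    genes, continuations = [], []
    for i in range(len(series) - gene_len - horizon + 1):
        genes.append(series[i:i + gene_len])
        continuations.append(series[i + gene_len:i + gene_len + horizon])
    genes, continuations = np.array(genes), np.array(continuations)

    # 2. Fitness of a gene: similarity of its pattern to the recent window.
    fitness = -np.linalg.norm(genes - recent, axis=1)
    top = np.argsort(fitness)[-pool_size:]
    population = continuations[top]                  # candidate future trends

    # 3. Recombine and re-select candidates for a fixed number of generations.
    for _ in range(generations):
        children = []
        for _ in range(pool_size):
            a, b = population[rng.integers(pool_size, size=2)]
            cut = rng.integers(1, horizon)           # single-point crossover
            children.append(np.concatenate([a[:cut], b[cut:]]))
        population = np.vstack([population, children])
        # Survivor selection (assumed criterion): continuations whose first
        # value joins most smoothly onto the last observed value.
        join_error = np.abs(population[:, 0] - series[-1])
        population = population[np.argsort(join_error)[:pool_size]]

    # 4. The best surviving candidate is used as the forecast.
    return population[0]


if __name__ == "__main__":
    t = np.arange(400)
    history = np.sin(0.1 * t) + 0.05 * np.random.default_rng(1).standard_normal(400)
    print(evolve_forecast(history, horizon=8))
```

The sketch captures the selection-and-recombination loop described in the abstract; the paper's actual fitness functions, chain construction based on temporal relationships, and inference foundation model are more elaborate than this minimal analog-plus-crossover version.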
Journal description:
Information Fusion serves as a central platform for showcasing advances in multi-sensor, multi-source, multi-process information fusion and fosters collaboration among the diverse disciplines driving its progress. It is the leading outlet for research and development in this field, with a focus on architectures, algorithms, and applications. Papers presenting fundamental theoretical analyses, as well as those demonstrating their application to real-world problems, are welcome.