Beyond LoRA: Exploring Efficient Fine-Tuning Techniques for Time Series Foundational Models
Divij Gupta, Anubhav Bhatti, Surajsinh Parmar
arXiv:2409.11302 — arXiv - CS - Machine Learning, published 2024-09-17
Time Series Foundation Models (TSFMs) have recently garnered attention for
their ability to model complex, large-scale time series data across domains
such as retail, finance, and transportation. However, their application to
sensitive, domain-specific fields like healthcare remains challenging,
primarily due to the difficulty of fine-tuning these models for specialized,
out-of-domain tasks with scarce publicly available datasets. In this work, we
explore the use of Parameter-Efficient Fine-Tuning (PEFT) techniques to address
these limitations, focusing on healthcare applications, particularly ICU vitals
forecasting for sepsis patients. We introduce and evaluate two selective
(BitFit and LayerNorm Tuning) and two additive (VeRA and FourierFT) PEFT
techniques on multiple configurations of the Chronos TSFM for forecasting vital
signs of sepsis patients. Our comparative analysis demonstrates that some of
these PEFT methods outperform LoRA in terms of parameter efficiency and domain
adaptation, establishing state-of-the-art (SOTA) results in ICU vitals
forecasting tasks. Interestingly, FourierFT applied to the Chronos (Tiny)
variant surpasses the SOTA model while fine-tuning only 2,400 parameters
compared to the 700K parameters of the benchmark.
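The abstract contrasts selective methods (BitFit and LayerNorm tuning, which tune an existing subset of the model's weights) with additive methods (VeRA and FourierFT, which learn a small number of new parameters alongside frozen weights). The following is a minimal, framework-free sketch of how the trainable-parameter budgets of these two families compare on a toy model; all layer names, dimensions, ranks, and coefficient counts are illustrative assumptions, not values from the paper.

```python
# Toy parameter registry mimicking one transformer block plus a head.
# Sizes are illustrative, not taken from any Chronos configuration.
params = {
    "block.attn.weight": 512 * 512,
    "block.attn.bias": 512,
    "block.norm.weight": 512,   # LayerNorm gain
    "block.norm.bias": 512,
    "head.weight": 512 * 512,
    "head.bias": 512,
}

def selective_trainable(mode: str) -> set:
    """Selective PEFT: pick an existing subset of weights to tune."""
    if mode == "bitfit":      # BitFit: bias terms only
        return {n for n in params if n.endswith(".bias")}
    if mode == "layernorm":   # LayerNorm tuning: norm gains and biases only
        return {n for n in params if ".norm." in n}
    raise ValueError(mode)

def lora_cost(d_in: int, d_out: int, rank: int) -> int:
    """Additive PEFT: LoRA adds factors A (d_in x r) and B (r x d_out)."""
    return rank * (d_in + d_out)

def fourierft_cost(n_coeffs: int) -> int:
    """Additive PEFT: FourierFT learns n spectral coefficients of the
    weight update, independent of the matrix's dimensions."""
    return n_coeffs

total = sum(params.values())
bitfit = sum(params[n] for n in selective_trainable("bitfit"))
print(f"BitFit:    {bitfit} of {total} params")       # 1536 of 526336
print(f"LoRA r=8:  {lora_cost(512, 512, 8)} params")  # 8192
print(f"FourierFT: {fourierft_cost(1000)} params")    # 1000
```

The sketch shows why FourierFT can undercut LoRA so sharply: LoRA's cost scales with the matrix dimensions times the rank, whereas FourierFT's cost is a fixed coefficient budget regardless of matrix size, which is consistent with the abstract's 2,400-parameter result against a 700K-parameter benchmark.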