Scaling Continuous Kernels with Sparse Fourier Domain Learning
Clayton Harper, Luke Wood, Peter Gerstoft, Eric C. Larson
arXiv:2409.09875 (arXiv - STAT - Machine Learning), 2024-09-15
We address three key challenges in learning continuous kernel representations: computational efficiency, parameter efficiency, and spectral bias. Continuous kernels have shown significant potential, but their practical adoption is often limited by high computational and memory demands. Additionally, these methods are prone to spectral bias, which impedes their ability to capture high-frequency details. To overcome these limitations, we propose a novel approach that leverages sparse learning in the Fourier domain. Our method enables the efficient scaling of continuous kernels, drastically reduces computational and memory requirements, and mitigates spectral bias by exploiting the Gibbs phenomenon.
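The abstract does not give implementation details, but the core idea it describes, a continuous kernel parameterized by a small set of Fourier components rather than by a dense grid of weights, can be illustrated concretely. Below is a minimal, hypothetical sketch in PyTorch; the class name SparseFourierKernel, the num_components parameter, and the learnable-frequency parameterization are all illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class SparseFourierKernel(nn.Module):
    """Hypothetical sketch (not the paper's released code): a 1-D
    continuous kernel represented by a small set of learnable Fourier
    components, so parameter count scales with the number of components
    rather than with the resolution at which the kernel is sampled."""

    def __init__(self, num_components: int = 16):
        super().__init__()
        # Learnable frequencies and sine/cosine amplitudes. "Sparse"
        # here means num_components is far smaller than the number of
        # positions at which the kernel is ultimately evaluated.
        self.freqs = nn.Parameter(torch.randn(num_components))
        self.cos_amps = nn.Parameter(torch.randn(num_components) / num_components)
        self.sin_amps = nn.Parameter(torch.randn(num_components) / num_components)

    def forward(self, positions: torch.Tensor) -> torch.Tensor:
        # positions: shape (N,), continuous coordinates in, e.g., [-1, 1].
        # Evaluate the truncated Fourier series at every position.
        phase = 2.0 * torch.pi * positions[:, None] * self.freqs[None, :]
        return (phase.cos() * self.cos_amps + phase.sin() * self.sin_amps).sum(dim=-1)

# The same few parameters can be sampled at any resolution to
# materialize a discrete convolution kernel of arbitrary size.
kernel_fn = SparseFourierKernel(num_components=16)
coords = torch.linspace(-1.0, 1.0, steps=65)
weights = kernel_fn(coords)  # shape (65,): sampled kernel values
```

One property of such a truncated series is relevant to the abstract's last claim: sampling it near sharp transitions produces the characteristic overshoot of the Gibbs phenomenon, which injects high-frequency content. The abstract indicates the method exploits this effect to counteract spectral bias, though the precise mechanism is left to the full paper.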