Mapping-to-Parameter nonlinear functional regression with Iterative Local B-spline knot placement

IF 5.5 · CAS Region 2 (Computer Science) · Q1 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Chengdong Shi, Xiao-Jun Zeng, Ching-Hsun Tseng, Wei Zhao
Neurocomputing, Volume 644, Article 130403. Published 2025-05-15. DOI: 10.1016/j.neucom.2025.130403
Full text: https://www.sciencedirect.com/science/article/pii/S0925231225010756
Citations: 0

Abstract

Many real-world phenomena are inherently continuous, yet traditionally represented as finite-dimensional vectors or matrices of discrete data points. Functional data analysis offers a natural paradigm by modeling observations as continuous functions, preserving intrinsic continuity and structural dependencies, thereby better capturing real-world dynamics and their underlying truth. However, functional modeling within infinite-dimensional spaces presents significant challenges due to its infinite degrees of freedom and computational complexity. These difficulties have led most studies on functional regression to focus on linear models, with general nonlinear approaches remaining underdeveloped. This paper introduces the Mapping-to-Parameter model, a simple yet effective approach for nonlinear functional regression. The key idea is straightforward: transform nonlinear functional regression problems from infinite-dimensional function spaces to finite-dimensional parameter spaces, where standard machine learning techniques can be readily applied. This transformation is accomplished by uniformly approximating all input or output functions using a common set of B-spline basis functions of any chosen order and representing each function by its vector of basis coefficients. For optimal approximation, we develop a novel Iterative Local Placement algorithm that adaptively distributes knots according to localized function complexity while providing theoretical guarantees on approximation error bounds. The performance of the proposed knot placement algorithm is shown to be robust and efficient in both single-function approximation and multiple-function approximation contexts. 
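The abstract does not spell out the Iterative Local Placement rules, but the general idea of error-driven local knot refinement can be sketched as a greedy loop: refit, find where the residual is largest, and split that knot span. Everything below is an illustrative assumption rather than the paper's algorithm: the bisection rule, the tolerance, the toy target function, and the use of SciPy's `make_lsq_spline` as the least-squares fitter.

```python
import numpy as np
from scipy.interpolate import make_lsq_spline

def iterative_local_knots(x, y, k=3, n_init=4, tol=1e-3, max_knots=40):
    """Greedy local knot refinement (a stand-in sketch, not the paper's
    Iterative Local Placement algorithm): repeatedly fit a least-squares
    spline and bisect the knot span containing the worst-fit point,
    until the residual meets `tol` everywhere or the knot budget runs out."""
    interior = list(np.linspace(x[0], x[-1], n_init + 2)[1:-1])
    while True:
        # Clamped knot vector: (k+1)-fold endpoint knots plus the interior knots.
        t = np.concatenate([[x[0]] * (k + 1), interior, [x[-1]] * (k + 1)])
        spl = make_lsq_spline(x, y, t, k=k)
        resid = np.abs(spl(x) - y)
        if resid.max() < tol or len(interior) >= max_knots:
            return np.array(interior), spl
        # Locate the span containing the worst-fit point and split it in half.
        worst = x[np.argmax(resid)]
        spans = np.concatenate([[x[0]], interior, [x[-1]]])
        j = min(np.searchsorted(spans, worst, side="right") - 1, len(spans) - 2)
        interior.insert(j, 0.5 * (spans[j] + spans[j + 1]))
        interior.sort()

# A function whose complexity is sharply localized around x = 0.7:
# refinement should concentrate the knots there, not spread them uniformly.
x = np.linspace(0.0, 1.0, 400)
y = np.exp(-((x - 0.7) / 0.03) ** 2)
knots, spl = iterative_local_knots(x, y)
print("interior knots used:", len(knots))
print("fraction of knots near the bump:", np.mean((knots > 0.55) & (knots < 0.85)))
```

On this toy target, most inserted knots land in the narrow region around the bump, which is the qualitative behavior the abstract claims: knot density adapts to localized function complexity.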
Through several real-world applications, the effectiveness and superiority of the Mapping-to-Parameter model are demonstrated in handling both function-on-scalar regression and function-on-function regression problems, consistently outperforming state-of-the-art methods including statistical functional models, neural network models, and functional neural networks.
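The coefficient-vector representation at the heart of the Mapping-to-Parameter idea can be sketched in plain NumPy. This is a minimal illustration, not the authors' implementation: the uniform knot vector and the toy function family are assumptions (the paper would place knots adaptively with its Iterative Local Placement algorithm). Each sampled function is projected onto one shared clamped cubic B-spline basis by least squares, so every function becomes a fixed-length coefficient vector.

```python
import numpy as np

def bspline_basis(knots, degree, x):
    """Evaluate all B-spline basis functions of the given degree on a clamped
    knot vector, via the Cox-de Boor recursion. Returns a (len(x), n_basis)
    design matrix with n_basis = len(knots) - degree - 1."""
    t, x = np.asarray(knots, float), np.asarray(x, float)
    n = len(t) - degree - 1
    # Degree-0 bases: indicator of each half-open knot interval [t_i, t_{i+1}).
    B = ((t[:-1] <= x[:, None]) & (x[:, None] < t[1:])).astype(float)
    B[x == t[-1], n - 1] = 1.0  # the right endpoint belongs to the last span
    for d in range(1, degree + 1):  # raise the degree one step at a time
        nxt = np.zeros((len(x), B.shape[1] - 1))
        for i in range(B.shape[1] - 1):
            den_l = t[i + d] - t[i]
            den_r = t[i + d + 1] - t[i + 1]
            if den_l > 0:
                nxt[:, i] += (x - t[i]) / den_l * B[:, i]
            if den_r > 0:
                nxt[:, i] += (t[i + d + 1] - x) / den_r * B[:, i + 1]
        B = nxt
    return B[:, :n]

# One common clamped cubic knot vector on [0, 1] (uniform here for simplicity).
degree = 3
interior = np.linspace(0.0, 1.0, 9)[1:-1]
knots = np.concatenate([[0.0] * (degree + 1), interior, [1.0] * (degree + 1)])

x = np.linspace(0.0, 1.0, 200)
Bmat = bspline_basis(knots, degree, x)

# A toy family of functions indexed by a scalar a: f_a(x) = sin(2*pi*a*x).
a_values = np.array([0.5, 1.0, 1.5, 2.0])
Y = np.sin(2 * np.pi * a_values[:, None] * x[None, :])

# Mapping-to-Parameter step: each function becomes its least-squares
# coefficient vector in the shared basis, i.e. a fixed-length feature vector.
C, *_ = np.linalg.lstsq(Bmat, Y.T, rcond=None)
C = C.T  # one coefficient row per function

recon = C @ Bmat.T
print("coefficient matrix shape:", C.shape)
print("max reconstruction error:", np.abs(recon - Y).max())
```

Once every function is a fixed-length coefficient row, any standard regressor can map scalar or functional inputs to these rows, and its predictions are turned back into functions through the same shared basis.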
Source journal: Neurocomputing (Engineering & Technology, Computer Science: Artificial Intelligence)
CiteScore: 13.10
Self-citation rate: 10.00%
Articles per year: 1382
Review time: 70 days
Journal overview: Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. Neurocomputing theory, practice, and applications are the essential topics covered.