Consistency of Fractional Graph-Laplacian Regularization in Semisupervised Learning with Finite Labels

Impact Factor 2.2 · JCR Q1, Mathematics, Applied
Adrien Weihs, Matthew Thorpe
DOI: 10.1137/23m1559087
Citation count: 0

Abstract

SIAM Journal on Mathematical Analysis, Volume 56, Issue 4, Page 4253-4295, August 2024.
Laplace learning is a popular machine learning algorithm for finding missing labels from a small number of labeled feature vectors using the geometry of a graph. More precisely, Laplace learning is based on minimizing a graph-Dirichlet energy, equivalently a discrete Sobolev $H^1$ seminorm, constrained to taking the values of known labels on a given subset. The variational problem is asymptotically ill-posed as the number of unlabeled feature vectors goes to infinity for finite given labels, due to a lack of regularity in minimizers of the continuum Dirichlet energy in any dimension higher than one. In particular, continuum minimizers are not continuous. One solution is to consider higher-order regularization, which is the analogue of minimizing Sobolev $H^s$ seminorms. In this paper we consider the asymptotics of minimizing a graph variant of the Sobolev $H^s$ seminorm with pointwise constraints. We show that, as expected, one needs $s > d/2$, where $d$ is the dimension of the data manifold. We also show that there must be an upper bound on the connectivity of the graph; that is, highly connected graphs lead to degenerate behavior of the minimizer even when $s > d/2$.
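The objects in the abstract can be sketched numerically. The toy Python example below (all parameter choices here, such as the Gaussian kernel bandwidth and $s = 2$, are illustrative assumptions, not the paper's setup) builds a weighted proximity graph, performs Laplace learning by solving the label-constrained Dirichlet minimization as a linear system, and then evaluates the fractional seminorm $u^\top L^s u$ through the Laplacian's eigendecomposition.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n points on a circle, with two labeled points (labels 0 and 1).
n = 60
theta = np.sort(rng.uniform(0, 2 * np.pi, n))
X = np.column_stack([np.cos(theta), np.sin(theta)])

# Weighted adjacency from a Gaussian kernel; the bandwidth eps plays the
# role of the graph-connectivity parameter the paper bounds from above.
eps = 0.5
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.exp(-D2 / eps**2)
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(1)) - W  # unnormalized graph Laplacian

labeled = np.array([0, n // 2])
y = np.array([0.0, 1.0])
unlabeled = np.setdiff1d(np.arange(n), labeled)

# Laplace learning: minimize u^T L u subject to u = y on the labeled set,
# i.e. solve the harmonic-extension system on the unlabeled nodes.
u = np.zeros(n)
u[labeled] = y
u[unlabeled] = np.linalg.solve(
    L[np.ix_(unlabeled, unlabeled)], -L[np.ix_(unlabeled, labeled)] @ y
)

# Fractional regularization: replace u^T L u by u^T L^s u, computed from
# the eigendecomposition of L (s = 1 recovers the Dirichlet energy).
s = 2.0
lam, phi = np.linalg.eigh(L)
lam = np.clip(lam, 0.0, None)  # guard tiny negative eigenvalues from round-off
frac_energy = ((phi.T @ u) ** 2 * lam**s).sum()
print(frac_energy)
```

By the maximum principle for graph-harmonic functions, the interpolated values stay within the range of the given labels; the fractional energy is nonnegative since $L$ is positive semidefinite.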
Source journal metrics: CiteScore 3.30 · Self-citation rate 5.00% · Articles per year 175 · Review time 12 months
Journal description: SIAM Journal on Mathematical Analysis (SIMA) features research articles of the highest quality employing innovative analytical techniques to treat problems in the natural sciences. Every paper has content that is primarily analytical and that employs mathematical methods in such areas as partial differential equations, the calculus of variations, functional analysis, approximation theory, harmonic or wavelet analysis, or dynamical systems. Additionally, every paper relates to a model for natural phenomena in such areas as fluid mechanics, materials science, quantum mechanics, biology, mathematical physics, or to the computational analysis of such phenomena. Submission of a manuscript to a SIAM journal is representation by the author that the manuscript has not been published or submitted simultaneously for publication elsewhere. Typical papers for SIMA do not exceed 35 journal pages. Substantial deviations from this page limit require that the referees, editor, and editor-in-chief be convinced that the increased length is both required by the subject matter and justified by the quality of the paper.