Monotonic learning in the PAC framework: A new perspective

Impact Factor: 7.6 · CAS Tier 1 (Computer Science) · JCR Q1 (Computer Science, Artificial Intelligence)
Ming Li, Chenyi Zhang, Qin Li
{"title":"Monotonic learning in the PAC framework: A new perspective","authors":"Ming Li ,&nbsp;Chenyi Zhang ,&nbsp;Qin Li","doi":"10.1016/j.knosys.2025.114504","DOIUrl":null,"url":null,"abstract":"<div><div>Monotone learning describes learning processes in which expected error consistently decreases as the amount of training data increases. However, recent studies challenge this conventional wisdom, revealing significant gaps in the understanding of generalization in machine learning. Addressing these gaps is crucial for advancing the theoretical foundations of the field. In this work, we utilize Probably Approximately Correct (PAC) learning theory to construct a theoretical error distribution that approximates a learning algorithm’s actual performance. We rigorously prove that this theoretical distribution exhibits monotonicity as sample sizes increase. We identify two scenarios under which deterministic algorithms based on Empirical Risk Minimization (ERM) are monotone: (1) the hypothesis space is finite, or (2) the hypothesis space has finite VC-dimension. Experiments on three classical learning problems validate our findings by demonstrating that the monotonicity of the algorithms’ generalization error is guaranteed, as its theoretical error upper bound monotonically converges to the minimum generalization error.</div></div>","PeriodicalId":49939,"journal":{"name":"Knowledge-Based Systems","volume":"330 ","pages":"Article 114504"},"PeriodicalIF":7.6000,"publicationDate":"2025-09-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Knowledge-Based Systems","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0950705125015436","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
引用次数: 0

Abstract

Monotone learning describes learning processes in which expected error consistently decreases as the amount of training data increases. However, recent studies challenge this conventional wisdom, revealing significant gaps in the understanding of generalization in machine learning. Addressing these gaps is crucial for advancing the theoretical foundations of the field. In this work, we utilize Probably Approximately Correct (PAC) learning theory to construct a theoretical error distribution that approximates a learning algorithm's actual performance. We rigorously prove that this theoretical distribution exhibits monotonicity as sample sizes increase. We identify two scenarios under which deterministic algorithms based on Empirical Risk Minimization (ERM) are monotone: (1) the hypothesis space is finite, or (2) the hypothesis space has finite VC-dimension. Experiments on three classical learning problems validate our findings by demonstrating that the monotonicity of each algorithm's generalization error is guaranteed, as its theoretical error upper bound monotonically converges to the minimum generalization error.
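To give a concrete sense of why such theoretical upper bounds decrease with sample size, the minimal sketch below evaluates two standard PAC-style generalization bounds: one for a finite hypothesis class (Hoeffding's inequality with a union bound) and one classical VC-dimension bound. These are textbook forms shown for illustration only; the function names, constants, and sample values are assumptions and do not reproduce the paper's exact construction of the theoretical error distribution.

import numpy as np

def pac_bound_finite(m, hypothesis_count, delta=0.05):
    # Hoeffding + union bound for a finite hypothesis class:
    # generalization gap <= sqrt((ln|H| + ln(2/delta)) / (2m)).
    return np.sqrt((np.log(hypothesis_count) + np.log(2.0 / delta)) / (2.0 * m))

def pac_bound_vc(m, vc_dim, delta=0.05):
    # Classical VC-type bound (up to constants), valid for m >= vc_dim:
    # generalization gap <= sqrt((d * ln(2m/d) + ln(4/delta)) / m).
    return np.sqrt((vc_dim * np.log(2.0 * m / vc_dim) + np.log(4.0 / delta)) / m)

sample_sizes = np.array([100, 200, 400, 800, 1600, 3200])
finite_bounds = pac_bound_finite(sample_sizes, hypothesis_count=1000)
vc_bounds = pac_bound_vc(sample_sizes, vc_dim=10)

# Both bound sequences shrink as the sample size grows, so the theoretical
# error (minimum achievable risk plus the bound) decreases monotonically.
assert np.all(np.diff(finite_bounds) < 0) and np.all(np.diff(vc_bounds) < 0)
for m, fb, vb in zip(sample_sizes, finite_bounds, vc_bounds):
    print(f"m = {m:5d}   finite-H bound = {fb:.4f}   VC bound = {vb:.4f}")

Running the sketch prints both bounds strictly decreasing in m, which mirrors the paper's claim that the theoretical error upper bound converges monotonically toward the minimum generalization error in the finite-hypothesis and finite-VC-dimension settings.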
Source Journal

Knowledge-Based Systems (Engineering & Technology – Computer Science: Artificial Intelligence)
CiteScore: 14.80
Self-citation rate: 12.50%
Articles published: 1245
Average review time: 7.8 months

Journal description: Knowledge-Based Systems, an international and interdisciplinary journal in artificial intelligence, publishes original, innovative, and creative research results in the field. It focuses on knowledge-based and other artificial intelligence techniques-based systems. The journal aims to support human prediction and decision-making through data science and computation techniques, provide a balanced coverage of theory and practical study, and encourage the development and implementation of knowledge-based intelligence models, methods, systems, and software tools. Applications in business, government, education, engineering, and healthcare are emphasized.