Fat shattering, joint measurability, and PAC learnability of POVM hypothesis classes
Abram Magner, Arun Padakandla
Quantum Information Processing, vol. 23, issue 10, published 2024-10-07
DOI: 10.1007/s11128-024-04555-y
https://link.springer.com/article/10.1007/s11128-024-04555-y
Citation count: 0
Abstract
We characterize learnability for quantum measurement classes by establishing matching necessary and sufficient conditions for their probably approximately correct (PAC) learnability, along with corresponding sample complexity bounds, in the setting where the learner is given access only to prepared quantum states. We first show that the empirical risk minimization (ERM) rule proposed in previous work is not universal, nor does uniform convergence of the empirical risk characterize learnability. Moreover, we show that VC dimension generalization bounds in previous work are in many cases infinite, even for measurement classes defined on a finite-dimensional Hilbert space and even for learnable classes. To surmount the failure of the standard ERM to satisfy uniform convergence, we define a new learning rule—denoised empirical risk minimization. We show this to be a universal learning rule for both classical probabilistically observed concept classes and quantum measurement classes, and the condition for it to satisfy uniform convergence is finite fat shattering dimension of the class. The fat shattering dimension of a hypothesis class is a measure of complexity that intervenes in sample complexity bounds for regression in classical learning theory. We give sample complexity upper and lower bounds for learnability in terms of finite fat shattering dimension and approximate finite partitionability into approximately jointly measurable subsets. We link fat shattering dimension with partitionability into approximately jointly measurable subsets, leading to our matching conditions. We also show that every measurement class defined on a finite-dimensional Hilbert space is PAC learnable. We illustrate our results on several example POVM classes.
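The fat shattering dimension invoked in the abstract can be made concrete for a finite class of real-valued functions: a set of points is γ-shattered (with witness levels r_i) if every sign pattern can be realized with margin γ around the witnesses. The following is a minimal brute-force sketch of that definition for illustration only; the function class, points, and witnesses are hypothetical toy choices, not constructions from the paper.

```python
from itertools import product


def gamma_shatters(funcs, points, witnesses, gamma):
    """Check whether the finite class `funcs` gamma-shatters `points`:
    for every sign pattern b there must exist some f in the class with
    f(x_i) >= r_i + gamma when b_i = 1 and f(x_i) <= r_i - gamma when
    b_i = 0, where r_i are the witness levels."""
    for pattern in product([0, 1], repeat=len(points)):
        realized = any(
            all(
                (f(x) >= r + gamma) if b else (f(x) <= r - gamma)
                for x, r, b in zip(points, witnesses, pattern)
            )
            for f in funcs
        )
        if not realized:
            return False
    return True


# Toy class of clipped ramp functions f_t(x) = min(1, max(0, x - t)).
funcs = [lambda x, t=t: min(1.0, max(0.0, x - t))
         for t in (0.0, 0.25, 0.5, 0.75)]

# A single point x = 0.6 with witness r = 0.3 is 0.2-shattered:
# t = 0 gives f(0.6) = 0.6 >= 0.5, and t = 0.75 gives f(0.6) = 0 <= 0.1.
print(gamma_shatters(funcs, [0.6], [0.3], 0.2))  # True
```

The γ-fat shattering dimension of a class is then the largest number of points that can be γ-shattered in this sense; the paper's sample complexity bounds are stated in terms of this quantity being finite at every scale γ > 0.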
Journal description:
Quantum Information Processing is a high-impact, international journal publishing cutting-edge experimental and theoretical research in all areas of Quantum Information Science. Topics of interest include quantum cryptography and communications, entanglement and discord, quantum algorithms, quantum error correction and fault tolerance, quantum computer science, quantum imaging and sensing, and experimental platforms for quantum information. Quantum Information Processing supports and inspires research by providing a comprehensive peer review process, and broadcasting high quality results in a range of formats. These include original papers, letters, broadly focused perspectives, comprehensive review articles, book reviews, and special topical issues. The journal is particularly interested in papers detailing and demonstrating quantum information protocols for cryptography, communications, computation, and sensing.