Expressibility-induced Concentration of Quantum Neural Tangent Kernels

Li-Wei Yu, Weikang Li, Qi Ye, Zhide Lu, Zizhao Han, Dong-Ling Deng
{"title":"Expressibility-induced Concentration of Quantum Neural Tangent Kernels.","authors":"Li-Wei Yu, Weikang Li, Qi Ye, Zhide Lu, Zizhao Han, Dong-Ling Deng","doi":"10.1088/1361-6633/ad82cf","DOIUrl":null,"url":null,"abstract":"<p><p>Quantum tangent kernel methods provide an efficient approach to analyzing the performance of quantum machine learning models in the infinite-width limit, which is of crucial importance in designing appropriate circuit architectures for certain learning tasks. Recently, they have been adapted to describe the convergence rate of training errors in quantum neural networks in an analytical manner. Here, we study the connections between the expressibility and value concentration of quantum tangent kernel models. In particular, for global loss functions, we rigorously prove that high expressibility of both the global and local quantum encodings can lead to exponential concentration of quantum tangent kernel values to zero. Whereas for local loss functions, such issue of exponential concentration persists owing to the high expressibility, but can be partially mitigated. We further carry out extensive numerical simulations to support our analytical theories. Our discoveries unveil a fundamental feature of quantum neural tangent kernels, indicating that the issue of their concentration cannot be bypassed merely by transitioning to a local encoding scheme while maintaining high expressibility. This offers valuable insights for the design of wide quantum variational circuit models in practical applications.</p>","PeriodicalId":74666,"journal":{"name":"Reports on progress in physics. Physical Society (Great Britain)","volume":" ","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-10-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Reports on progress in physics. Physical Society (Great Britain)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1088/1361-6633/ad82cf","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Quantum tangent kernel methods provide an efficient approach to analyzing the performance of quantum machine learning models in the infinite-width limit, which is crucial for designing appropriate circuit architectures for given learning tasks. Recently, they have been adapted to describe, analytically, the convergence rate of training errors in quantum neural networks. Here, we study the connection between the expressibility and the value concentration of quantum tangent kernel models. In particular, for global loss functions, we rigorously prove that high expressibility of both global and local quantum encodings leads to exponential concentration of quantum tangent kernel values toward zero. For local loss functions, by contrast, the exponential concentration persists owing to the high expressibility, though it can be partially mitigated. We further carry out extensive numerical simulations to support our analytical results. Our findings reveal a fundamental feature of quantum neural tangent kernels: their concentration cannot be bypassed merely by switching to a local encoding scheme while maintaining high expressibility. This offers valuable insights for the design of wide variational quantum circuit models in practical applications.
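
The concentration effect described above can be illustrated numerically. Below is a minimal sketch, not the authors' code: it estimates the quantum neural tangent kernel K = Σ_l (∂⟨O⟩/∂θ_l)² for a hardware-efficient ansatz of per-qubit RY rotations and CNOT entanglers, with a global observable Z⊗…⊗Z and random parameters, using the parameter-shift rule for the gradients. The ansatz layout, depth, observable, and sample counts are all illustrative assumptions; the qualitative trend to look for is the mean kernel value shrinking rapidly as the qubit number n grows.

```python
# Minimal numerical sketch (not the authors' code) of QNTK concentration:
# estimate K = sum_l (d<O>/d theta_l)^2 for a random hardware-efficient
# ansatz and a global observable, and watch its mean shrink with qubit
# number n. Ansatz, depth, observable, and sample counts are assumptions.
import numpy as np

Z = np.diag([1.0, -1.0]).astype(complex)

def ry(theta):
    """Single-qubit RY(theta) = exp(-i * theta * Y / 2)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def kron_all(ops):
    """Kronecker product of a list of single-qubit operators."""
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

def cnot(n, c, t):
    """Full 2^n x 2^n CNOT with control c, target t (MSB-first indexing)."""
    dim = 2 ** n
    U = np.zeros((dim, dim), dtype=complex)
    for b in range(dim):
        bits = [(b >> (n - 1 - q)) & 1 for q in range(n)]
        if bits[c]:
            bits[t] ^= 1
        U[int("".join(map(str, bits)), 2), b] = 1.0
    return U

def entangler(n):
    """One entangling layer: a chain of CNOTs along the qubit line."""
    U = np.eye(2 ** n, dtype=complex)
    for q in range(n - 1):
        U = cnot(n, q, q + 1) @ U
    return U

def circuit_state(n, depth, thetas, ent):
    """Apply `depth` layers of per-qubit RY rotations + CNOT chain to |0...0>."""
    psi = np.zeros(2 ** n, dtype=complex)
    psi[0] = 1.0
    for d in range(depth):
        layer = kron_all([ry(thetas[d * n + q]) for q in range(n)])
        psi = ent @ (layer @ psi)
    return psi

def qntk(n, depth, rng):
    """QNTK value K = sum_l (d<O>/d theta_l)^2 at a random parameter point,
    with gradients from the parameter-shift rule for RY rotations."""
    thetas = rng.uniform(0.0, 2 * np.pi, n * depth)
    ent = entangler(n)
    O = kron_all([Z] * n)  # global observable Z (x) ... (x) Z

    def expval(t):
        psi = circuit_state(n, depth, t, ent)
        return float(np.real(psi.conj() @ O @ psi))

    K = 0.0
    for l in range(len(thetas)):
        tp, tm = thetas.copy(), thetas.copy()
        tp[l] += np.pi / 2
        tm[l] -= np.pi / 2
        K += (0.5 * (expval(tp) - expval(tm))) ** 2
    return K

rng = np.random.default_rng(0)
for n in range(2, 7):
    vals = [qntk(n, depth=4, rng=rng) for _ in range(20)]
    print(f"n = {n}: mean QNTK over 20 random draws ~ {np.mean(vals):.3e}")
```

Increasing `depth` pushes such random circuits closer to a 2-design, i.e. toward the high-expressibility regime the paper analyzes; swapping the global Z⊗…⊗Z for a single-qubit Z observable corresponds to the local-loss setting, where the abstract notes the concentration persists but is partially mitigated.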
