Complexity theoretic limitations on learning halfspaces

Amit Daniely
{"title":"学习半空间的复杂性理论限制","authors":"Amit Daniely","doi":"10.1145/2897518.2897520","DOIUrl":null,"url":null,"abstract":"We study the problem of agnostically learning halfspaces which is defined by a fixed but unknown distribution D on Q^n X {-1,1}. We define Err_H(D) as the least error of a halfspace classifier for D. A learner who can access D has to return a hypothesis whose error is small compared to Err_H(D). Using the recently developed method of Daniely, Linial and Shalev-Shwartz we prove hardness of learning results assuming that random K-XOR formulas are hard to (strongly) refute. We show that no efficient learning algorithm has non-trivial worst-case performance even under the guarantees that Err_H(D) <= eta for arbitrarily small constant eta>0, and that D is supported in the Boolean cube. Namely, even under these favorable conditions, and for every c>0, it is hard to return a hypothesis with error <= 1/2-n^{-c}. In particular, no efficient algorithm can achieve a constant approximation ratio. Under a stronger version of the assumption (where K can be poly-logarithmic in n), we can take eta = 2^{-log^{1-nu}(n)} for arbitrarily small nu>0. These results substantially improve on previously known results, that only show hardness of exact learning.","PeriodicalId":442965,"journal":{"name":"Proceedings of the forty-eighth annual ACM symposium on Theory of Computing","volume":"196 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-05-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"119","resultStr":"{\"title\":\"Complexity theoretic limitations on learning halfspaces\",\"authors\":\"Amit Daniely\",\"doi\":\"10.1145/2897518.2897520\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We study the problem of agnostically learning halfspaces which is defined by a fixed but unknown distribution D on Q^n X {-1,1}. We define Err_H(D) as the least error of a halfspace classifier for D. A learner who can access D has to return a hypothesis whose error is small compared to Err_H(D). Using the recently developed method of Daniely, Linial and Shalev-Shwartz we prove hardness of learning results assuming that random K-XOR formulas are hard to (strongly) refute. We show that no efficient learning algorithm has non-trivial worst-case performance even under the guarantees that Err_H(D) <= eta for arbitrarily small constant eta>0, and that D is supported in the Boolean cube. Namely, even under these favorable conditions, and for every c>0, it is hard to return a hypothesis with error <= 1/2-n^{-c}. In particular, no efficient algorithm can achieve a constant approximation ratio. Under a stronger version of the assumption (where K can be poly-logarithmic in n), we can take eta = 2^{-log^{1-nu}(n)} for arbitrarily small nu>0. 
These results substantially improve on previously known results, that only show hardness of exact learning.\",\"PeriodicalId\":442965,\"journal\":{\"name\":\"Proceedings of the forty-eighth annual ACM symposium on Theory of Computing\",\"volume\":\"196 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2015-05-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"119\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the forty-eighth annual ACM symposium on Theory of Computing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/2897518.2897520\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the forty-eighth annual ACM symposium on Theory of Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2897518.2897520","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 119

Abstract

We study the problem of agnostically learning halfspaces, defined by a fixed but unknown distribution D on Q^n × {-1,1}. We define Err_H(D) as the least error of a halfspace classifier for D. A learner with access to D must return a hypothesis whose error is small compared to Err_H(D). Using the recently developed method of Daniely, Linial and Shalev-Shwartz, we prove hardness-of-learning results under the assumption that random K-XOR formulas are hard to (strongly) refute. We show that no efficient learning algorithm has non-trivial worst-case performance, even under the guarantees that Err_H(D) <= eta for an arbitrarily small constant eta > 0 and that D is supported on the Boolean cube. Namely, even under these favorable conditions, and for every c > 0, it is hard to return a hypothesis with error <= 1/2 - n^{-c}. In particular, no efficient algorithm can achieve a constant approximation ratio. Under a stronger version of the assumption (where K can be poly-logarithmic in n), we can take eta = 2^{-log^{1-nu}(n)} for an arbitrarily small nu > 0. These results substantially improve on previously known results, which only showed hardness of exact learning.
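For concreteness, the following is a minimal LaTeX sketch of the standard definitions behind the abstract's notation; the class name H_n and the hypothesis label h_{w,b} are our own illustrative choices, not notation fixed by the paper.

% Halfspaces over R^n (the labels h_{w,b} and H_n are ours, for illustration).
\[
  H_n \;=\; \bigl\{\, h_{w,b} : x \mapsto \operatorname{sign}(\langle w, x \rangle + b)
            \;\big|\; w \in \mathbb{R}^n,\ b \in \mathbb{R} \,\bigr\}
\]
% The benchmark: the error of the best halfspace classifier for D.
\[
  \mathrm{Err}_H(D) \;=\; \inf_{h \in H_n} \Pr_{(x,y) \sim D}\bigl[\, h(x) \neq y \,\bigr]
\]
% The abstract's hardness statement in this notation: even when
% Err_H(D) <= eta for an arbitrarily small constant eta > 0, it is hard
% for an efficient learner to return any hypothesis h with
% Pr_{(x,y) ~ D}[h(x) != y] <= 1/2 - n^{-c}, for every constant c > 0.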