Journal of Complexity — Latest Articles

On Huber's contaminated model
IF 1.7, CAS Zone 2 (Mathematics)
Journal of Complexity Pub Date : 2023-08-01 DOI: 10.1016/j.jco.2023.101745
Weiyan Mu , Shifeng Xiong
{"title":"On Huber's contaminated model","authors":"Weiyan Mu ,&nbsp;Shifeng Xiong","doi":"10.1016/j.jco.2023.101745","DOIUrl":"https://doi.org/10.1016/j.jco.2023.101745","url":null,"abstract":"<div><p><span><span>Huber's contaminated model is a basic model for data with outliers. This paper aims at addressing several fundamental problems about this model. We first study its identifiability properties. Several theorems are presented to determine whether the model is identifiable for various situations. Based on these results, we discuss the problem of estimating the parameters with observations drawn from Huber's contaminated model. A definition of estimation consistency is introduced to handle the general case where the model may be unidentifiable. This consistency is a strong </span>robustness property. After showing that existing estimators cannot be consistent in this sense, we propose a new estimator that possesses the consistency property under mild conditions. Its adaptive version, which can simultaneously possess this consistency property and optimal </span>asymptotic efficiency, is also provided. Numerical examples show that our estimators have better overall performance than existing estimators no matter how many outliers in the data.</p></div>","PeriodicalId":50227,"journal":{"name":"Journal of Complexity","volume":"77 ","pages":"Article 101745"},"PeriodicalIF":1.7,"publicationDate":"2023-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"50200306","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
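The model in the abstract above is the two-component mixture (1 - eps) F + eps G: each observation comes from the target distribution with probability 1 - eps and from an arbitrary contaminating distribution otherwise. A minimal simulation sketch (all parameters illustrative, not taken from the paper) shows why a naive estimator fails under contamination while a classical robust estimator survives:

```python
import numpy as np

# Huber's contaminated model: each observation is drawn from the target
# distribution (here N(theta, 1)) with probability 1 - eps, and from an
# arbitrary contaminating distribution G with probability eps.
# All parameters below are illustrative, not taken from the paper.
rng = np.random.default_rng(0)
theta, eps, n = 2.0, 0.1, 10_000

clean = rng.normal(theta, 1.0, size=n)
outliers = rng.normal(50.0, 1.0, size=n)   # hypothetical contamination G
contaminated = rng.random(n) < eps
x = np.where(contaminated, outliers, clean)

# The sample mean is dragged toward the outliers, while the median
# (a classical robust estimator) stays near theta.
print("mean bias:  ", abs(x.mean() - theta))
print("median bias:", abs(np.median(x) - theta))
```

With 10% contamination at 50, the mean is biased by several units while the median stays within a fraction of a unit of theta; the paper's contribution is an estimator that is additionally consistent and asymptotically efficient, which the median is not in general.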
A continuous characterization of PSPACE using polynomial ordinary differential equations
IF 1.7, CAS Zone 2 (Mathematics)
Journal of Complexity Pub Date : 2023-08-01 DOI: 10.1016/j.jco.2023.101755
Olivier Bournez , Riccardo Gozzi , Daniel S. Graça , Amaury Pouly
{"title":"A continuous characterization of PSPACE using polynomial ordinary differential equations","authors":"Olivier Bournez ,&nbsp;Riccardo Gozzi ,&nbsp;Daniel S. Graça ,&nbsp;Amaury Pouly","doi":"10.1016/j.jco.2023.101755","DOIUrl":"https://doi.org/10.1016/j.jco.2023.101755","url":null,"abstract":"<div><p>In this paper we provide a characterization of the complexity class PSPACE by using a purely continuous model defined with polynomial ordinary differential equations.</p></div>","PeriodicalId":50227,"journal":{"name":"Journal of Complexity","volume":"77 ","pages":"Article 101755"},"PeriodicalIF":1.7,"publicationDate":"2023-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"50200238","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 2
Dmitriy Bilyk and Feng Dai are the winners of the 2023 Joseph F. Traub Prize for Achievement in Information-Based Complexity
IF 1.7, CAS Zone 2 (Mathematics)
Journal of Complexity Pub Date : 2023-08-01 DOI: 10.1016/j.jco.2023.101756
Erich Novak
{"title":"Dmitriy Bilyk and Feng Dai are the winners of the 2023 Joseph F. Traub Prize for Achievement in Information-Based Complexity","authors":"Erich Novak","doi":"10.1016/j.jco.2023.101756","DOIUrl":"https://doi.org/10.1016/j.jco.2023.101756","url":null,"abstract":"","PeriodicalId":50227,"journal":{"name":"Journal of Complexity","volume":"77 ","pages":"Article 101756"},"PeriodicalIF":1.7,"publicationDate":"2023-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"50200239","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Rates of approximation by ReLU shallow neural networks
IF 1.7, CAS Zone 2 (Mathematics)
Journal of Complexity Pub Date : 2023-07-31 DOI: 10.1016/j.jco.2023.101784
Tong Mao , Ding-Xuan Zhou
{"title":"Rates of approximation by ReLU shallow neural networks","authors":"Tong Mao ,&nbsp;Ding-Xuan Zhou","doi":"10.1016/j.jco.2023.101784","DOIUrl":"https://doi.org/10.1016/j.jco.2023.101784","url":null,"abstract":"<div><p>Neural networks activated by the rectified linear unit (ReLU) play a central role in the recent development of deep learning. The topic of approximating functions from Hölder spaces by these networks is crucial for understanding the efficiency of the induced learning algorithms. Although the topic has been well investigated in the setting of deep neural networks with many layers of hidden neurons, it is still open for shallow networks having only one hidden layer. In this paper, we provide rates of uniform approximation by these networks. We show that ReLU shallow neural networks with <em>m</em> hidden neurons can uniformly approximate functions from the Hölder space <span><math><msubsup><mrow><mi>W</mi></mrow><mrow><mo>∞</mo></mrow><mrow><mi>r</mi></mrow></msubsup><mo>(</mo><msup><mrow><mo>[</mo><mo>−</mo><mn>1</mn><mo>,</mo><mn>1</mn><mo>]</mo></mrow><mrow><mi>d</mi></mrow></msup><mo>)</mo></math></span> with rates <span><math><mi>O</mi><mo>(</mo><msup><mrow><mo>(</mo><mi>log</mi><mo>⁡</mo><mi>m</mi><mo>)</mo></mrow><mrow><mfrac><mrow><mn>1</mn></mrow><mrow><mn>2</mn></mrow></mfrac><mo>+</mo><mi>d</mi></mrow></msup><msup><mrow><mi>m</mi></mrow><mrow><mo>−</mo><mfrac><mrow><mi>r</mi></mrow><mrow><mi>d</mi></mrow></mfrac><mfrac><mrow><mi>d</mi><mo>+</mo><mn>2</mn></mrow><mrow><mi>d</mi><mo>+</mo><mn>4</mn></mrow></mfrac></mrow></msup><mo>)</mo></math></span> when <span><math><mi>r</mi><mo>&lt;</mo><mi>d</mi><mo>/</mo><mn>2</mn><mo>+</mo><mn>2</mn></math></span>. 
Such rates are very close to the optimal one <span><math><mi>O</mi><mo>(</mo><msup><mrow><mi>m</mi></mrow><mrow><mo>−</mo><mfrac><mrow><mi>r</mi></mrow><mrow><mi>d</mi></mrow></mfrac></mrow></msup><mo>)</mo></math></span> in the sense that <span><math><mfrac><mrow><mi>d</mi><mo>+</mo><mn>2</mn></mrow><mrow><mi>d</mi><mo>+</mo><mn>4</mn></mrow></mfrac></math></span> is close to 1, when the dimension <em>d</em> is large.</p></div>","PeriodicalId":50227,"journal":{"name":"Journal of Complexity","volume":"79 ","pages":"Article 101784"},"PeriodicalIF":1.7,"publicationDate":"2023-07-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"49876977","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
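As a hands-on companion to the abstract above, the sketch below fits a one-hidden-layer ReLU network f_m(x) = c_0 + sum_k c_k relu(x - b_k) in one dimension by least squares over fixed knots and checks that the uniform error shrinks as the width m grows. The construction is purely illustrative and is not the paper's proof technique:

```python
import numpy as np

# One-hidden-layer ReLU network in 1-d: f_m(x) = c_0 + sum_k c_k relu(x - b_k).
# Place the inner knots b_k on a grid and fit only the outer weights c_k by
# least squares; the uniform error over [-1, 1] decreases as m grows.
def relu_fit_error(m, f, grid_n=2001):
    x = np.linspace(-1.0, 1.0, grid_n)
    knots = np.linspace(-1.0, 1.0, m, endpoint=False)
    features = np.maximum(x[:, None] - knots[None, :], 0.0)  # relu(x - b_k)
    A = np.column_stack([np.ones_like(x), features])         # bias + features
    coef, *_ = np.linalg.lstsq(A, f(x), rcond=None)
    return np.max(np.abs(A @ coef - f(x)))                   # uniform error

# Smooth target f(x) = x^2; widths 4, 16, 64.
errs = [relu_fit_error(m, lambda x: x ** 2) for m in (4, 16, 64)]
print(errs)
```

For this fixed-knot construction the error behaves like piecewise-linear interpolation, roughly O(m^{-2}) for a C^2 target; the paper's rates concern free knots and outer weights in d dimensions, where the trade-off in the exponent r/d · (d+2)/(d+4) appears.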
Approximating smooth and sparse functions by deep neural networks: Optimal approximation rates and saturation
IF 1.7, CAS Zone 2 (Mathematics)
Journal of Complexity Pub Date : 2023-07-27 DOI: 10.1016/j.jco.2023.101783
Xia Liu
{"title":"Approximating smooth and sparse functions by deep neural networks: Optimal approximation rates and saturation","authors":"Xia Liu","doi":"10.1016/j.jco.2023.101783","DOIUrl":"https://doi.org/10.1016/j.jco.2023.101783","url":null,"abstract":"<div><p><span><span>Constructing neural networks for function approximation is a classical and longstanding topic in </span>approximation theory. In this paper, we aim at constructing </span>deep neural networks with three hidden layers using a sigmoidal activation function to approximate smooth and sparse functions. Specifically, we prove that the constructed deep nets with controllable magnitude of free parameters can reach the optimal approximation rate in approximating both smooth and sparse functions. In particular, we prove that neural networks with three hidden layers can avoid the phenomenon of saturation, i.e., the phenomenon that for some neural network architectures, the approximation rate stops improving for functions of very high smoothness.</p></div>","PeriodicalId":50227,"journal":{"name":"Journal of Complexity","volume":"79 ","pages":"Article 101783"},"PeriodicalIF":1.7,"publicationDate":"2023-07-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"49876980","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 3
Worst case tractability of linear problems in the presence of noise: Linear information
IF 1.7, CAS Zone 2 (Mathematics)
Journal of Complexity Pub Date : 2023-07-26 DOI: 10.1016/j.jco.2023.101782
Leszek Plaskota, Paweł Siedlecki
{"title":"Worst case tractability of linear problems in the presence of noise: Linear information","authors":"Leszek Plaskota,&nbsp;Paweł Siedlecki","doi":"10.1016/j.jco.2023.101782","DOIUrl":"https://doi.org/10.1016/j.jco.2023.101782","url":null,"abstract":"<div><p><span>We study the worst case tractability of multivariate linear problems defined on separable Hilbert spaces. Information about a problem instance consists of noisy evaluations of arbitrary bounded </span>linear functionals, where the noise is either deterministic or random. The cost of a single evaluation depends on its precision and is controlled by a cost function. We establish mutual interactions between tractability of a problem with noisy information, the cost function, and tractability of the same problem, but with exact information.</p></div>","PeriodicalId":50227,"journal":{"name":"Journal of Complexity","volume":"79 ","pages":"Article 101782"},"PeriodicalIF":1.7,"publicationDate":"2023-07-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"49876976","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
On the complexity of a unified convergence analysis for iterative methods
IF 1.7, CAS Zone 2 (Mathematics)
Journal of Complexity Pub Date : 2023-07-11 DOI: 10.1016/j.jco.2023.101781
Ioannis K. Argyros , Stepan Shakhno , Samundra Regmi , Halyna Yarmola
{"title":"On the complexity of a unified convergence analysis for iterative methods","authors":"Ioannis K. Argyros ,&nbsp;Stepan Shakhno ,&nbsp;Samundra Regmi ,&nbsp;Halyna Yarmola","doi":"10.1016/j.jco.2023.101781","DOIUrl":"https://doi.org/10.1016/j.jco.2023.101781","url":null,"abstract":"<div><p><span><span>A local and a semi-local convergence of general iterative methods for solving nonlinear operator equations in </span>Banach spaces is developed under </span><em>ω</em>-continuity conditions. Our approach unifies existing results and provides a new way of studying iterative methods. The main idea is to find a more accurate domain containing the iterates. No extra effort is used to obtain this. Also, the results of the numerical experiments are given that confirm obtained theoretical estimates.</p></div>","PeriodicalId":50227,"journal":{"name":"Journal of Complexity","volume":"79 ","pages":"Article 101781"},"PeriodicalIF":1.7,"publicationDate":"2023-07-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"49876979","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 1
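Newton's method is the prototypical iterative method covered by such unified local/semi-local analyses. The scalar sketch below is generic textbook material, not the paper's Banach-space setting:

```python
# Generic fixed-point-style iteration x_{n+1} = x_n - F(x_n) / F'(x_n)
# (Newton's method), the canonical instance of the iterative methods whose
# convergence such unified analyses cover. Scalar case only.
def newton(F, dF, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        step = F(x) / dF(x)   # Newton correction
        x -= step
        if abs(step) < tol:   # stop once the correction is negligible
            break
    return x

# Solve F(x) = x^2 - 2 = 0 starting from x0 = 1.5.
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.5)
print(root)
```

Local convergence theorems of the kind unified in the paper guarantee, under continuity conditions on F', a neighborhood of the solution from which such iterates converge; the paper's contribution is a sharper domain containing the iterates.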
The rate of convergence for sparse and low-rank quantile trace regression
IF 1.7, CAS Zone 2 (Mathematics)
Journal of Complexity Pub Date : 2023-06-19 DOI: 10.1016/j.jco.2023.101778
Xiangyong Tan , Ling Peng , Peiwen Xiao , Qing Liu , Xiaohui Liu
{"title":"The rate of convergence for sparse and low-rank quantile trace regression","authors":"Xiangyong Tan ,&nbsp;Ling Peng ,&nbsp;Peiwen Xiao ,&nbsp;Qing Liu ,&nbsp;Xiaohui Liu","doi":"10.1016/j.jco.2023.101778","DOIUrl":"https://doi.org/10.1016/j.jco.2023.101778","url":null,"abstract":"<div><p>Trace regression models are widely used in applications involving panel data, images, genomic microarrays, etc., where high-dimensional covariates<span> are often involved. However, the existing research involving high-dimensional covariates focuses mainly on the condition mean model. In this paper, we extend the trace regression model to the quantile trace regression model when the parameter is a matrix of simultaneously low rank and row (column) sparsity. The convergence rate of the penalized estimator is derived under mild conditions. Simulations, as well as a real data application, are also carried out for illustration.</span></p></div>","PeriodicalId":50227,"journal":{"name":"Journal of Complexity","volume":"79 ","pages":"Article 101778"},"PeriodicalIF":1.7,"publicationDate":"2023-06-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"49876978","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
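In trace regression the response is y_i = <X_i, Theta> + eps_i with <X, Theta> = trace(X^T Theta); the quantile version replaces squared error with the quantile ("pinball") loss. A minimal sketch of such an objective, using a nuclear-norm penalty as one standard low-rank surrogate (an illustrative choice, not necessarily the paper's exact penalty), with hypothetical dimensions:

```python
import numpy as np

# Pinball (quantile) loss at level tau: tau * u for u >= 0, (tau - 1) * u for u < 0.
def pinball(u, tau=0.5):
    return np.maximum(tau * u, (tau - 1.0) * u)

# Penalized quantile trace regression objective (sketch, no optimizer):
# mean pinball loss of the residuals y_i - <X_i, Theta> plus a
# nuclear-norm penalty encouraging low rank.
def objective(Theta, Xs, y, tau=0.5, lam=0.1):
    fits = np.einsum('nij,ij->n', Xs, Theta)   # trace inner products <X_i, Theta>
    return pinball(y - fits, tau).mean() + lam * np.linalg.norm(Theta, ord='nuc')

# Synthetic rank-1 example with hypothetical sizes.
rng = np.random.default_rng(0)
Xs = rng.normal(size=(50, 5, 5))
Theta_true = np.outer(rng.normal(size=5), rng.normal(size=5))  # rank-1 parameter
y = np.einsum('nij,ij->n', Xs, Theta_true) + 0.1 * rng.normal(size=50)
print(objective(Theta_true, Xs, y), objective(np.zeros((5, 5)), Xs, y))
```

The true rank-1 parameter achieves a much smaller objective than the zero matrix; the paper's analysis concerns how fast a minimizer of such a penalized objective converges to Theta under simultaneous low-rank and sparsity structure.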
Optimal recovery and volume estimates
IF 1.7, CAS Zone 2 (Mathematics)
Journal of Complexity Pub Date : 2023-06-17 DOI: 10.1016/j.jco.2023.101780
Alexander Kushpel
{"title":"Optimal recovery and volume estimates","authors":"Alexander Kushpel","doi":"10.1016/j.jco.2023.101780","DOIUrl":"https://doi.org/10.1016/j.jco.2023.101780","url":null,"abstract":"<div><p>We study volumes of sections of convex origin-symmetric bodies in <span><math><msup><mrow><mi>R</mi></mrow><mrow><mi>n</mi></mrow></msup></math></span><span> induced by orthonormal systems on probability spaces. The approach is based on volume estimates of John-Löwner ellipsoids and expectations of norms induced by the respective systems. The estimates obtained allow us to establish lower bounds for the radii of sections which gives lower bounds for Gelfand widths (or linear cowidths). As an application we offer a new method of evaluation of Gelfand and Kolmogorov widths of multiplier operators. In particular, we establish sharp orders of widths of standard Sobolev classes </span><span><math><msubsup><mrow><mi>W</mi></mrow><mrow><mi>p</mi></mrow><mrow><mi>γ</mi></mrow></msubsup></math></span>, <span><math><mi>γ</mi><mo>&gt;</mo><mn>0</mn></math></span> in <span><math><msub><mrow><mi>L</mi></mrow><mrow><mi>q</mi></mrow></msub></math></span> on two-point homogeneous spaces in the difficult case, i.e. if <span><math><mn>1</mn><mo>&lt;</mo><mi>q</mi><mo>≤</mo><mi>p</mi><mo>≤</mo><mo>∞</mo></math></span>.</p></div>","PeriodicalId":50227,"journal":{"name":"Journal of Complexity","volume":"79 ","pages":"Article 101780"},"PeriodicalIF":1.7,"publicationDate":"2023-06-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"49877027","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0