Journal of Machine Learning Research: Latest Articles

Effect-Invariant Mechanisms for Policy Generalization
IF 4.3 · CAS Zone 3 · Computer Science
Journal of Machine Learning Research · Pub Date: 2024-01-01
Sorawit Saengkyongam, Niklas Pfister, Predrag Klasnja, Susan Murphy, Jonas Peters
Abstract: Policy learning is an important component of many real-world learning systems. A major challenge in policy learning is how to adapt efficiently to unseen environments or tasks. Recently, it has been suggested to exploit invariant conditional distributions to learn models that generalize better to unseen environments. However, assuming invariance of entire conditional distributions (which we call full invariance) may be too strong an assumption in practice. In this paper, we introduce a relaxation of full invariance called effect-invariance (e-invariance for short) and prove that it is sufficient, under suitable assumptions, for zero-shot policy generalization. We also discuss an extension that exploits e-invariance when we have a small sample from the test environment, enabling few-shot policy generalization. Our work does not assume an underlying causal graph or that the data are generated by a structural causal model; instead, we develop testing procedures to test e-invariance directly from data. We present empirical results using simulated data and a mobile health intervention dataset to demonstrate the effectiveness of our approach.
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11286230/pdf/
Citations: 0
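As a toy illustration of the e-invariance idea (a sketch under an assumed linear model, not the authors' testing procedure): the two environments below share the same conditional effect of a binary action on the outcome, even though their outcome baselines, and hence the full conditional distributions, differ. The simulation setup and all function names are hypothetical.

```python
# A minimal, hypothetical sketch of e-invariance: the conditional *effect*
# of an action is stable across environments even when the full conditional
# distribution of the outcome is not. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def simulate(env_shift, n=5000):
    """Simulate one environment: the baseline shifts, the effect stays fixed."""
    x = rng.normal(size=n)
    a = rng.integers(0, 2, size=n)          # binary action
    effect = 2.0 * x                        # e-invariant effect of a on y
    y = env_shift + x + a * effect + rng.normal(size=n)
    return x, a, y

def effect_estimate(x, a, y):
    """Regress y on [1, x, a, a*x]; return the coefficients of (a, a*x)."""
    X = np.column_stack([np.ones_like(x), x, a, a * x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[2:]

# Two environments with different baselines (full invariance fails)...
est1 = effect_estimate(*simulate(env_shift=0.0))
est2 = effect_estimate(*simulate(env_shift=3.0))
# ...but the estimated effect of the action agrees across environments.
print(est1, est2)   # both close to (0.0, 2.0)
```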
Nonparametric Regression for 3D Point Cloud Learning
IF 4.3 · CAS Zone 3 · Computer Science
Journal of Machine Learning Research · Pub Date: 2024-01-01
Xinyi Li, Shan Yu, Yueying Wang, Guannan Wang, Li Wang, Ming-Jun Lai
Abstract: In recent years, there has been an exponential increase in the amount of point cloud data collected, often with irregular shapes, in various areas. Motivated by the importance of solid modeling for point clouds, we develop a novel and efficient smoothing tool based on multivariate splines over triangulations to extract the underlying signal and build a 3D solid model from the point cloud. The proposed method can denoise or deblur the point cloud effectively, provide a multi-resolution reconstruction of the actual signal, and handle sparse and irregularly distributed point clouds to recover the underlying trajectory. In addition, our method provides a natural way of reducing the numerosity of the data. We establish theoretical guarantees for the proposed method, including the convergence rate and asymptotic normality of the estimator, and show that the convergence rate achieves the optimal nonparametric rate. We also introduce a bootstrap method to quantify the uncertainty of the estimators. Through extensive simulation studies and a real data example, we demonstrate the superiority of the proposed method over traditional smoothing methods in terms of estimation accuracy and efficiency of data reduction.
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11465206/pdf/
Citations: 0
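The spline-over-triangulation estimator itself is beyond a short snippet, so the sketch below substitutes a generic k-nearest-neighbor smoother to illustrate the basic task the paper addresses: nonparametric denoising of an irregular 3D point cloud. The sphere example and all parameters are illustrative assumptions, not the paper's method.

```python
# Generic point-cloud denoising: replace each point by the local average of
# its k nearest neighbors. A simple stand-in smoother, NOT the paper's
# multivariate-spline-over-triangulation estimator.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)

# Noisy samples from a unit sphere (an irregular 3D point cloud).
n = 2000
u = rng.normal(size=(n, 3))
clean = u / np.linalg.norm(u, axis=1, keepdims=True)
noisy = clean + 0.05 * rng.normal(size=(n, 3))

def knn_smooth(points, k=20):
    """Denoise by averaging each point with its k nearest neighbors."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    return points[idx].mean(axis=1)

smoothed = knn_smooth(noisy)
# The smoothed cloud lies closer to the underlying surface (radius 1).
err = lambda p: np.abs(np.linalg.norm(p, axis=1) - 1.0).mean()
print(f"noisy error {err(noisy):.4f} -> smoothed error {err(smoothed):.4f}")
```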
Convergence for nonconvex ADMM, with applications to CT imaging
IF 6.0 · CAS Zone 3 · Computer Science
Journal of Machine Learning Research · Pub Date: 2024-01-01
Rina Foygel Barber, Emil Y Sidky
Abstract: The alternating direction method of multipliers (ADMM) algorithm is a powerful and flexible tool for complex optimization problems of the form min{ f(x) + g(y) : Ax + By = c }. ADMM exhibits robust empirical performance across a range of challenging settings, including nonsmoothness and nonconvexity of the objective functions f and g, and provides a simple and natural approach to the inverse problem of image reconstruction for computed tomography (CT) imaging. From the theoretical point of view, existing results for convergence in the nonconvex setting generally assume smoothness in at least one of the component functions in the objective. In this work, our new theoretical results provide convergence guarantees under a restricted strong convexity assumption without requiring smoothness or differentiability, while still allowing differentiable terms to be treated approximately if needed. We validate these theoretical results empirically, with a simulated example where both f and g are nondifferentiable (and thus outside the scope of existing theory), as well as a simulated CT image reconstruction problem.
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11155492/pdf/
Citations: 0
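For readers unfamiliar with the setup, here is textbook scaled-form ADMM for the stated problem class, instantiated on the lasso with A = I, B = -I, c = 0 (f a smooth least-squares term, g the nondifferentiable l1 penalty). This is a generic sketch, not the paper's algorithm or its restricted-strong-convexity analysis; the problem sizes are arbitrary.

```python
# Scaled-form ADMM for min f(x) + g(y) s.t. x - y = 0, on the lasso:
# f(x) = 0.5*||Dx - b||^2,  g(y) = lam*||y||_1.
import numpy as np

rng = np.random.default_rng(2)
m, n, lam, rho = 50, 100, 0.1, 1.0
D = rng.normal(size=(m, n))
b = rng.normal(size=m)

# Pre-factor the x-update: (D^T D + rho I) x = D^T b + rho (y - u).
L = np.linalg.cholesky(D.T @ D + rho * np.eye(n))
Dtb = D.T @ b

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = y = u = np.zeros(n)
for _ in range(200):
    x = np.linalg.solve(L.T, np.linalg.solve(L, Dtb + rho * (y - u)))
    y = soft_threshold(x + u, lam / rho)   # prox step on the l1 term
    u = u + x - y                          # scaled dual update
print("nonzeros in solution:", np.count_nonzero(np.abs(y) > 1e-8))
```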
Batch Normalization Preconditioning for Stochastic Gradient Langevin Dynamics
IF 6.0 · CAS Zone 3 · Computer Science
Journal of Machine Learning Research · Pub Date: 2023-06-01 · DOI: 10.4208/jml.220726a
Susanne Lange, Wei Deng, Q. Ye, Guang Lin
Citations: 2
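No abstract is available for this entry, so the sketch below shows only the standard, unpreconditioned SGLD update that preconditioning schemes such as the one in the title modify: a half-step along a minibatch gradient of the negative log posterior plus Gaussian noise of matching scale. The Gaussian-mean model and all constants are illustrative assumptions.

```python
# Vanilla SGLD (Welling & Teh style), NOT the paper's BN-preconditioned
# variant: theta <- theta - (eps/2) * grad U(theta) + sqrt(eps) * N(0, 1),
# with grad U estimated from a minibatch. Toy posterior over a Gaussian mean.
import numpy as np

rng = np.random.default_rng(3)
data = rng.normal(loc=1.5, size=1000)       # observations with unknown mean
N, batch = len(data), 32

def grad_U(theta, minibatch):
    """Stochastic gradient of the negative log posterior (standard-normal
    prior, unit-variance Gaussian likelihood), rescaled for the minibatch."""
    prior = theta                           # -d/dtheta log N(theta; 0, 1)
    lik = (N / len(minibatch)) * np.sum(theta - minibatch)
    return prior + lik

theta, eps, samples = 0.0, 1e-4, []
for _ in range(5000):
    mb = rng.choice(data, size=batch, replace=False)
    theta += -0.5 * eps * grad_U(theta, mb) + np.sqrt(eps) * rng.normal()
    samples.append(theta)
print("posterior mean estimate:", np.mean(samples[1000:]))  # ~ 1.5
```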
A Local Convergence Theory for the Stochastic Gradient Descent Method in Non-Convex Optimization with Non-Isolated Local Minima
CAS Zone 3 · Computer Science
Journal of Machine Learning Research · Pub Date: 2023-06-01 · DOI: 10.4208/jml.230106
Taehee Ko, Xiantao Li
Abstract: Non-convex loss functions arise frequently in modern machine learning, and for the theoretical analysis of stochastic optimization methods, the presence of non-isolated minima presents a unique challenge that has remained under-explored. In this paper, we study the local convergence of the stochastic gradient descent method to non-isolated global minima. Under mild assumptions, we estimate the probability that the iterates stay near the minima by adopting the notion of stochastic stability. After establishing such stability, we present lower bounds on the complexity in terms of various error criteria for a given error tolerance ε and a failure probability γ.
Citations: 0
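A minimal picture of the setting studied here: the loss f(w) = (|w|^2 - 1)^2 is non-convex and its global minima form a whole circle (non-isolated minima), yet SGD iterates with decaying step sizes approach that circle. This is a generic demonstration of the phenomenon, not the paper's stability analysis; the loss and all constants are assumptions.

```python
# SGD on f(w) = (|w|^2 - 1)^2, whose global minima are the non-isolated
# set {w : |w| = 1}. Iterates converge to the manifold, not to a point.
import numpy as np

rng = np.random.default_rng(4)

def stochastic_grad(w, noise=0.1):
    """Exact gradient 4(|w|^2 - 1) w plus additive Gaussian noise."""
    return 4.0 * (w @ w - 1.0) * w + noise * rng.normal(size=w.shape)

w = np.array([2.0, 0.5])
for t in range(1, 20001):
    w -= (0.1 / np.sqrt(t)) * stochastic_grad(w)   # decaying step size
print("distance to the minimizing circle:", abs(np.linalg.norm(w) - 1.0))
```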
Efficient Anti-Symmetrization of a Neural Network Layer by Taming the Sign Problem
CAS Zone 3 · Computer Science
Journal of Machine Learning Research · Pub Date: 2023-06-01 · DOI: 10.4208/jml.230703
Nilin Abrahamsen, Lin Lin
Abstract: Explicit antisymmetrization of a neural network is a potential candidate for a universal function approximator for generic antisymmetric functions, which are ubiquitous in quantum physics. However, this procedure is a priori factorially costly to implement, making it impractical for large numbers of particles. The strategy also suffers from a sign problem: due to near-exact cancellation of positive and negative contributions, the magnitude of the antisymmetrized function may be significantly smaller than before antisymmetrization. We show that the antisymmetric projection of a two-layer neural network can be evaluated efficiently, opening the door to using a generic antisymmetric layer as a building block in antisymmetric neural network ansatzes. This approximation is effective when the sign problem is controlled, and we show that this property depends crucially on the choice of activation function under standard Xavier/He initialization methods. As a consequence, using a smooth activation function requires re-scaling of the neural network weights compared to standard initializations.
Citations: 0
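To see why explicit antisymmetrization is factorially costly, the brute-force construction below sums signed permutations of the inputs, which is exactly what the paper's efficient evaluation avoids; note that the output magnitude can be far smaller than f itself, which is the sign problem. The toy function f is an illustrative assumption.

```python
# Brute-force antisymmetrization of a function of n particles:
# (Af)(x) = (1/n!) * sum over permutations sigma of sign(sigma) * f(x[sigma]).
# The sum has n! terms, hence the factorial cost.
import itertools
import math
import numpy as np

def antisymmetrize(f, x):
    """Evaluate the antisymmetric projection of f at x by explicit summation."""
    n = len(x)
    total = 0.0
    for perm in itertools.permutations(range(n)):
        # Parity via inversion count (O(n^2) per permutation, fine for small n).
        inversions = sum(
            1 for i in range(n) for j in range(i + 1, n) if perm[i] > perm[j]
        )
        sign = -1.0 if inversions % 2 else 1.0
        total += sign * f(x[list(perm)])
    return total / math.factorial(n)

# Toy "network" of 4 particles in 1D; its antisymmetrized value can be much
# smaller in magnitude than f itself (the sign problem).
f = lambda x: np.tanh(x @ np.arange(1.0, len(x) + 1))
x = np.array([0.3, -1.2, 0.7, 0.5])
x_swapped = x[[1, 0, 2, 3]]
# Swapping two particles flips the sign exactly (antisymmetry check).
print(antisymmetrize(f, x), antisymmetrize(f, x_swapped))
```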
A Brief Survey on the Approximation Theory for Sequence Modelling
CAS Zone 3 · Computer Science
Journal of Machine Learning Research · Pub Date: 2023-06-01 · DOI: 10.4208/jml.221221
Haotian Jiang, Qianxiao Li, Zhong Li, Shida Wang
Citations: 0
Reinforcement Learning with Function Approximation: From Linear to Nonlinear
CAS Zone 3 · Computer Science
Journal of Machine Learning Research · Pub Date: 2023-06-01 · DOI: 10.4208/jml.230105
Jihao Long, Jiequn Han
Citations: 0
Why Self-Attention is Natural for Sequence-to-Sequence Problems? A Perspective from Symmetries
CAS Zone 3 · Computer Science
Journal of Machine Learning Research · Pub Date: 2023-06-01 · DOI: 10.4208/jml.221206
Chao Ma, Lexing Ying
Citations: 0
Selective inference for k-means clustering
IF 4.3 · CAS Zone 3 · Computer Science
Journal of Machine Learning Research · Pub Date: 2023-05-01
Yiqun T Chen, Daniela M Witten
Abstract: We consider the problem of testing for a difference in means between clusters of observations identified via k-means clustering. In this setting, classical hypothesis tests lead to an inflated Type I error rate. In recent work, Gao et al. (2022) considered a related problem in the context of hierarchical clustering. Unfortunately, their solution is highly tailored to the context of hierarchical clustering and thus cannot be applied in the setting of k-means clustering. In this paper, we propose a p-value that conditions on all of the intermediate clustering assignments in the k-means algorithm. We show that the p-value controls the selective Type I error for a test of the difference in means between a pair of clusters obtained using k-means clustering in finite samples, and can be efficiently computed. We apply our proposal to hand-written digits data and to single-cell RNA-sequencing data.
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10805457/pdf/
Citations: 0
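The following simulation illustrates the problem the paper solves, not its selective p-value: under a global null with no true clusters, running k-means and then naively t-testing the difference in means between the discovered clusters rejects far more often than the nominal 5%. The sample sizes and the use of scikit-learn's KMeans are illustrative choices.

```python
# Why naive post-clustering tests are invalid: clusters found by k-means
# were chosen to look different, so a classical t-test between them has an
# inflated Type I error even when the data contain no real clusters.
import numpy as np
from scipy import stats
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)

pvals = []
for _ in range(200):
    # Global null: all observations come from a single Gaussian.
    X = rng.normal(size=(100, 2))
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    # Naive two-sample t-test on the first coordinate between the clusters.
    p = stats.ttest_ind(X[labels == 0, 0], X[labels == 1, 0]).pvalue
    pvals.append(p)

# A valid level-0.05 test would reject ~5% of the time; the naive test
# rejects far more often.
print("naive rejection rate at 0.05:", np.mean(np.array(pvals) < 0.05))
```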