J. Mach. Learn. Res. Latest Articles

Learning theory of randomized Kaczmarz algorithm
Junhong Lin, Ding-Xuan Zhou
J. Mach. Learn. Res., 2015, pp. 3341-3365. DOI: 10.5555/2789272.2912105
Abstract: A relaxed randomized Kaczmarz algorithm is investigated in a least squares regression setting by a learning theory approach. When the sampling values are accurate and the regression function (conditional means) is linear, such an algorithm has been well studied in the community of non-uniform sampling. In this paper, we are mainly interested in the different case of either noisy random measurements or a nonlinear regression function. In this case, we show that relaxation is needed. A necessary and sufficient condition on the sequence of relaxation parameters or step sizes for the convergence of the algorithm in expectation is presented. Moreover, polynomial rates of convergence, both in expectation and in probability, are provided explicitly. As a result, the almost sure convergence of the algorithm is proved by applying the Borel-Cantelli Lemma.
Citations: 30
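The relaxed scheme the abstract describes can be sketched with a polynomially decaying step-size sequence. The decay exponent 0.75, the problem sizes, and the noise level below are illustrative choices, not the paper's.

```python
import numpy as np

def relaxed_randomized_kaczmarz(A, b, n_iters=5000, seed=0):
    """Relaxed randomized Kaczmarz: rows are sampled with probability
    proportional to their squared norm, and a decaying relaxation
    parameter eta_t tempers the effect of noise in b."""
    rng = np.random.default_rng(seed)
    m, d = A.shape
    row_norms = np.sum(A**2, axis=1)
    probs = row_norms / row_norms.sum()
    x = np.zeros(d)
    for t in range(1, n_iters + 1):
        i = rng.choice(m, p=probs)
        eta = 1.0 / t**0.75  # polynomially decaying step size (illustrative)
        residual = b[i] - A[i] @ x
        x = x + eta * residual / row_norms[i] * A[i]
    return x

# Noisy overdetermined system: b = A x* + noise.
rng = np.random.default_rng(1)
x_star = np.array([1.0, -2.0, 0.5])
A = rng.normal(size=(200, 3))
b = A @ x_star + 0.01 * rng.normal(size=200)
x_hat = relaxed_randomized_kaczmarz(A, b)
```

With accurate measurements no relaxation is needed; the decaying step sizes matter precisely in the noisy case the paper studies.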
CEKA: a tool for mining the wisdom of crowds
J. Zhang, V. Sheng, B. Nicholson, Xindong Wu
J. Mach. Learn. Res., 2015, pp. 2853-2858. DOI: 10.5555/2789272.2912090
Abstract: CEKA is a software package for developers and researchers to mine the wisdom of crowds. It makes the entire knowledge discovery procedure much easier, including analyzing qualities of workers, simulating labeling behaviors, inferring true class labels of instances, filtering and correcting mislabeled instances (noise), building learning models and evaluating them. It integrates a set of state-of-the-art inference algorithms, a set of general noise handling algorithms, and abundant functions for model training and evaluation. CEKA is written in Java with core classes being compatible with the well-known machine learning tool WEKA, which makes the utilization of the functions in WEKA much easier.
Citations: 29
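CEKA itself is a Java package built on WEKA, so the snippet below is not its API; it is a language-neutral illustration (in Python) of the simplest true-label inference strategy such tools bundle, plain majority voting over noisy worker labels.

```python
from collections import Counter

def majority_vote(labels_per_instance):
    """Infer one label per instance from multiple noisy worker labels.
    Majority voting is the baseline; EM-style methods (e.g. Dawid-Skene)
    additionally model per-worker quality, as CEKA's inference
    algorithms do."""
    inferred = {}
    for instance, labels in labels_per_instance.items():
        counts = Counter(labels)
        inferred[instance] = counts.most_common(1)[0][0]
    return inferred

# Hypothetical crowd labels for three documents.
crowd = {
    "doc1": ["spam", "spam", "ham"],
    "doc2": ["ham", "ham", "ham"],
    "doc3": ["spam", "ham", "spam"],
}
truth = majority_vote(crowd)
```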
Decision boundary for discrete Bayesian network classifiers
Gherardo Varando, C. Bielza, P. Larrañaga
J. Mach. Learn. Res., 2015, pp. 2725-2749. DOI: 10.5555/2789272.2912086
Abstract: Bayesian network classifiers are a powerful machine learning tool. In order to evaluate the expressive power of these models, we compute families of polynomials that sign-represent decision functions induced by Bayesian network classifiers. We prove that those families are linear combinations of products of Lagrange basis polynomials. In absence of V-structures in the predictor sub-graph, we are also able to prove that this family of polynomials does indeed characterize the specific classifier considered. We then use this representation to bound the number of decision functions representable by Bayesian network classifiers with a given structure.
Citations: 22
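The sign-representation result can be checked by hand on the smallest case, a naive Bayes classifier over binary features (a V-structure-free predictor sub-graph): its log-odds is a linear combination of products of the Lagrange basis polynomials {1 - x_i, x_i}. All probabilities below are made up for illustration.

```python
import math
from itertools import product

# Toy naive Bayes over two binary features: class prior and
# per-feature conditionals P(x_i = 1 | c).
prior = {0: 0.6, 1: 0.4}
p_x1 = {0: [0.2, 0.7], 1: [0.8, 0.3]}  # p_x1[c][i] = P(x_i = 1 | c)

def nb_predict(x):
    """Direct naive Bayes classification via log-posterior scores."""
    scores = {}
    for c in (0, 1):
        s = math.log(prior[c])
        for i, xi in enumerate(x):
            p = p_x1[c][i]
            s += math.log(p if xi == 1 else 1 - p)
        scores[c] = s
    return 1 if scores[1] > scores[0] else 0

def decision_polynomial(x):
    """The same decision function written as a polynomial in the
    Lagrange basis on {0, 1}: L_0(x) = 1 - x, L_1(x) = x."""
    val = math.log(prior[1]) - math.log(prior[0])
    for i, xi in enumerate(x):
        val += (1 - xi) * (math.log(1 - p_x1[1][i]) - math.log(1 - p_x1[0][i]))
        val += xi * (math.log(p_x1[1][i]) - math.log(p_x1[0][i]))
    return val

# The polynomial sign-represents the classifier on every input.
agree = all(nb_predict(x) == (1 if decision_polynomial(x) > 0 else 0)
            for x in product((0, 1), repeat=2))
```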
Geometric intuition and algorithms for Eν-SVM
Á. Jiménez, A. Takeda, J. Lázaro
J. Mach. Learn. Res., 2015, pp. 323-369. DOI: 10.5555/2789272.2789283
Abstract: In this work we address the Eν-SVM model proposed by Perez-Cruz et al. as an extension of the traditional ν-support vector classification model (ν-SVM). Through an enhancement of the range of admissible values for the regularization parameter ν, the Eν-SVM has been shown to be able to produce a wider variety of decision functions, giving rise to a better adaptability to the data. However, while a clear and intuitive geometric interpretation can be given for the ν-SVM model as a nearest-point problem in reduced convex hulls (RCH-NPP), no previous work has been made in developing such intuition for the Eν-SVM model. In this paper we show how Eν-SVM can be reformulated as a geometrical problem that generalizes RCH-NPP, providing new insights into this model. Under this novel point of view, we propose the RapMinos algorithm, able to solve Eν-SVM more efficiently than the current methods. Furthermore, we show how RapMinos is able to address the Eν-SVM model for any choice of regularization norm ℓp, p ≥ 1, seamlessly, which further extends the SVM model flexibility beyond the usual Eν-SVM models.
Citations: 5
PAC optimal MDP planning with application to invasive species management
Majid Alkaee Taleghan, Thomas G. Dietterich, Mark Crowley, K. Hall, H. Albers
J. Mach. Learn. Res., 2015, pp. 3877-3903. DOI: 10.5555/2789272.2912119
Abstract: In a simulator-defined MDP, the Markovian dynamics and rewards are provided in the form of a simulator from which samples can be drawn. This paper studies MDP planning algorithms that attempt to minimize the number of simulator calls before terminating and outputting a policy that is approximately optimal with high probability. The paper introduces two heuristics for efficient exploration and an improved confidence interval that enables earlier termination with probabilistic guarantees. We prove that the heuristics and the confidence interval are sound and produce with high probability an approximately optimal policy in polynomial time. Experiments on two benchmark problems and two instances of an invasive species management problem show that the improved confidence intervals and the new search heuristics yield reductions of between 8% and 47% in the number of simulator calls required to reach near-optimal policies.
Citations: 22
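The paper's confidence intervals are sharper than the textbook bound, but the basic ingredient can be sketched as follows: draw enough simulator samples that a Hoeffding interval of half-width ε holds with probability at least 1 - δ. The ε, δ, and toy simulator below are illustrative, not the paper's.

```python
import math
import random

def hoeffding_sample_mean(simulate, epsilon=0.05, delta=0.05):
    """Estimate the mean of a [0, 1]-valued stochastic simulator with
    a Hoeffding guarantee: with probability >= 1 - delta the estimate
    is within epsilon of the true mean. The paper's improved intervals
    terminate earlier than this worst-case sample count."""
    # Hoeffding: n >= log(2/delta) / (2 * epsilon^2) samples suffice.
    n = math.ceil(math.log(2 / delta) / (2 * epsilon**2))
    total = sum(simulate() for _ in range(n))
    return total / n, n

# Toy simulator: a Bernoulli(0.3) reward, standing in for one
# (state, action) transition outcome.
random.seed(0)
mean_hat, n_used = hoeffding_sample_mean(lambda: random.random() < 0.3)
```

Minimizing such per-estimate sample counts across all (state, action) pairs is exactly what drives the simulator-call reductions the experiments report.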
Learning using privileged information: similarity control and knowledge transfer
V. Vapnik, R. Izmailov
J. Mach. Learn. Res., 2015, pp. 2023-2049. DOI: 10.5555/2789272.2886814
Abstract: This paper describes a new paradigm of machine learning in which an Intelligent Teacher is involved. During the training stage, the Intelligent Teacher provides the Student with information that contains, along with the classification of each example, additional privileged information (for example, an explanation) of this example. The paper describes two mechanisms that can be used for significantly accelerating the speed of the Student's learning using privileged information: (1) correction of the Student's concepts of similarity between examples, and (2) direct Teacher-Student knowledge transfer.
Citations: 344
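Mechanism (2), direct Teacher-Student knowledge transfer, can be loosely sketched in a distillation style: a teacher fits the privileged representation and the student mimics the teacher's soft scores using only the regular features. This is a sketch of the idea, not Vapnik and Izmailov's actual formulation; the data-generating process and models below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Privileged setting: at training time each example has regular
# features x AND a privileged feature x_star; at test time only x.
n = 200
x = rng.normal(size=(n, 2))
x_star = x @ np.array([1.0, -1.0]) + 0.1 * rng.normal(size=n)
y = (x_star > 0).astype(float)

# Teacher: scores examples from the privileged representation.
teacher_scores = 1 / (1 + np.exp(-5 * x_star))

# Student (knowledge transfer): regresses on the teacher's soft
# scores, not on the hard labels, using only the regular features.
X = np.hstack([x, np.ones((n, 1))])
w_transfer = np.linalg.lstsq(X, teacher_scores, rcond=None)[0]

acc_transfer = float(np.mean((X @ w_transfer > 0.5) == (y == 1)))
```

The soft scores carry the teacher's graded confidence, which is the extra signal hard labels alone do not provide.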
Predicting a switching sequence of graph labelings
M. Herbster, Stephen Pasteris, M. Pontil
J. Mach. Learn. Res., 2015, pp. 2003-2022. DOI: 10.5555/2789272.2886813
Abstract: We study the problem of predicting online the labeling of a graph. We consider a novel setting for this problem in which, in addition to observing vertices and labels on the graph, we also observe a sequence of just vertices on a second graph. A latent labeling of the second graph selects one of K labelings to be active on the first graph. We propose a polynomial time algorithm for online prediction in this setting and derive a mistake bound for the algorithm. The bound is controlled by the geometric cut of the observed and latent labelings, as well as the resistance diameters of the graphs. When specialized to multitask prediction and online switching problems the bound gives new and sharper results under certain conditions.
Citations: 15
Absent data generating classifier for imbalanced class sizes
Arash Pourhabib, B. Mallick, Yu Ding
J. Mach. Learn. Res., 2015, pp. 2695-2724. DOI: 10.5555/2789272.2912085
Abstract: We propose an algorithm for two-class classification problems when the training data are imbalanced. This means the number of training instances in one of the classes is so low that the conventional classification algorithms become ineffective in detecting the minority class. We present a modification of the kernel Fisher discriminant analysis such that the imbalanced nature of the problem is explicitly addressed in the new algorithm formulation. The new algorithm exploits the properties of the existing minority points to learn the effects of other minority data points, had they actually existed. The algorithm proceeds iteratively by employing the learned properties and conditional sampling in such a way that it generates sufficient artificial data points for the minority set, thus enhancing the detection probability of the minority class. Implementing the proposed method on a number of simulated and real data sets, we show that our proposed method performs competitively compared to a set of alternative state-of-the-art imbalanced classification algorithms.
Citations: 24
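The paper generates artificial minority points by conditional sampling guided by kernel Fisher discriminant structure. The sketch below keeps only the generic idea, perturbing existing minority points to densify the minority class; the plain Gaussian jitter is an assumption for illustration, not their learned sampling scheme.

```python
import random

def generate_absent_minority(minority, n_new, scale=0.1, seed=0):
    """Create artificial minority-class points by jittering existing
    ones. The paper instead learns where absent minority points would
    lie from the kernel Fisher discriminant formulation; Gaussian
    jitter here only illustrates the densification step."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        base = rng.choice(minority)
        synthetic.append(tuple(v + rng.gauss(0, scale) for v in base))
    return synthetic

# Three observed minority points in 2-D (hypothetical data).
minority = [(0.9, 1.1), (1.0, 0.8), (1.2, 1.0)]
new_points = generate_absent_minority(minority, n_new=20)
```

After augmentation, a standard classifier sees a far less skewed class ratio, which is what raises the minority detection probability.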
pyGPs: a Python library for Gaussian process regression and classification
Marion Neumann, Shan Huang, D. Marthaler, K. Kersting
J. Mach. Learn. Res., 2015, pp. 2611-2616. DOI: 10.5555/2789272.2912082
Abstract: We introduce pyGPs, an object-oriented implementation of Gaussian processes (GPs) for machine learning. The library provides a wide range of functionalities reaching from simple GP specification via mean and covariance and GP inference to more complex implementations of hyperparameter optimization, sparse approximations, and graph based learning. Using Python we focus on usability for both "users" and "researchers". Our main goal is to offer a user-friendly and flexible implementation of GPs for machine learning.
Citations: 24
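Rather than guess at pyGPs' object interface, here is the underlying computation such a library wraps behind its model objects: exact GP regression with an RBF kernel, written directly in NumPy. The kernel hyperparameters and data are illustrative.

```python
import numpy as np

def gp_regression(x_train, y_train, x_test, length=1.0, signal=1.0, noise=0.1):
    """Exact GP posterior mean and pointwise std for 1-D inputs under
    an RBF (squared-exponential) covariance -- the core inference that
    GP libraries expose through a model object."""
    def k(a, b):
        d = a[:, None] - b[None, :]
        return signal**2 * np.exp(-0.5 * (d / length) ** 2)

    K = k(x_train, x_train) + noise**2 * np.eye(len(x_train))
    Ks = k(x_test, x_train)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks @ alpha
    cov = k(x_test, x_test) - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.sqrt(np.clip(np.diag(cov), 0.0, None))

# Fit a noisy sine and predict at two new points.
x = np.linspace(0, 2 * np.pi, 20)
y = np.sin(x)
x_new = np.array([np.pi / 2, np.pi])
mean, std = gp_regression(x, y, x_new)
```

Hyperparameter optimization, as pyGPs provides, would tune `length`, `signal`, and `noise` by maximizing the marginal likelihood instead of fixing them as above.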
SAMOA: scalable advanced massive online analysis
G. D. F. Morales, A. Bifet
J. Mach. Learn. Res., 2015, pp. 149-153. DOI: 10.5555/2789272.2789277
Abstract: SAMOA (Scalable Advanced Massive Online Analysis) is a platform for mining big data streams. It provides a collection of distributed streaming algorithms for the most common data mining and machine learning tasks such as classification, clustering, and regression, as well as programming abstractions to develop new algorithms. It features a pluggable architecture that allows it to run on several distributed stream processing engines such as Storm, S4, and Samza. SAMOA is written in Java, is open source, and is available at http://samoa-project.net under the Apache Software License version 2.0.
Citations: 177
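SAMOA itself is a distributed Java platform, but a single-machine Python sketch can still show the constraint its streaming algorithms obey: each instance is processed exactly once and then discarded, with no random access to the full data set. The perceptron learner and synthetic stream below are illustrative, not part of SAMOA.

```python
import random

def online_perceptron(stream, dim, lr=0.1):
    """One-pass online learning over a stream: constant memory,
    mistake-driven updates, every instance seen once. Platforms like
    SAMOA distribute learners of this shape across engines such as
    Storm or Samza."""
    w = [0.0] * dim
    b = 0.0
    for x, label in stream:  # label in {-1, +1}
        margin = sum(wi * xi for wi, xi in zip(w, x)) + b
        if label * margin <= 0:  # update only on a mistake
            w = [wi + lr * label * xi for wi, xi in zip(w, x)]
            b += lr * label
    return w, b

# Synthetic linearly separable stream: label = sign(x0 + x1).
rng = random.Random(0)
def stream(n=2000):
    for _ in range(n):
        x = (rng.uniform(-1, 1), rng.uniform(-1, 1))
        yield x, 1 if x[0] + x[1] > 0 else -1

w, b = online_perceptron(stream(), dim=2)
```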