Found. Trends Mach. Learn.: Latest Articles
Tensor Regression
Found. Trends Mach. Learn. | Pub Date: 2023-08-22 | DOI: 10.1561/2200000087
Jiani Liu, Ce Zhu, Zhen Long, Yipeng Liu
Abstract: Regression analysis is a key area of interest in data analysis and machine learning, devoted to exploring the dependencies between variables, often represented as vectors. The emergence of high-dimensional data in fields such as neuroimaging, computer vision, climatology, and social networks has challenged traditional data representations. Tensors, as high-dimensional extensions of vectors, are natural representations of such data. In this book, the authors provide a systematic study and analysis of recent tensor-based regression models and their applications. It groups and illustrates the existing tensor-based regression methods and covers the basics, core ideas, and theoretical characteristics of most of them. In addition, readers can learn how to use existing tensor-based regression methods to solve specific regression tasks with multiway data, which datasets can be selected, and which software packages are available to start related work as soon as possible. Tensor Regression is the first thorough overview of the fundamentals, motivations, popular algorithms, strategies for efficient implementation, related applications, available datasets, and software resources for tensor-based regression analysis. It is essential reading for all students, researchers, and practitioners working on high-dimensional data.
Citations: 13
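The core idea of low-rank tensor regression can be illustrated in its simplest instance: a rank-1 matrix-valued coefficient fitted by alternating least squares, so the number of parameters grows as p + q rather than p * q. The sketch below is an illustrative assumption, not code from the book; the synthetic data, dimensions, and variable names are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, q = 500, 8, 6                     # samples; covariate dimensions (hypothetical)

# True rank-1 coefficient matrix B = u v^T, used to generate synthetic data
u_true, v_true = rng.normal(size=p), rng.normal(size=q)
X = rng.normal(size=(n, p, q))          # matrix-valued covariates X_i
y = np.einsum('ipq,p,q->i', X, u_true, v_true) + 0.01 * rng.normal(size=n)

# Alternating least squares: with one factor fixed, the model is linear
# in the other, so each half-step is an ordinary least-squares solve.
u, v = rng.normal(size=p), rng.normal(size=q)
for _ in range(30):
    Zu = np.einsum('ipq,q->ip', X, v)   # features X_i v for updating u
    u, *_ = np.linalg.lstsq(Zu, y, rcond=None)
    Zv = np.einsum('ipq,p->iq', X, u)   # features X_i^T u for updating v
    v, *_ = np.linalg.lstsq(Zv, y, rcond=None)

B_hat = np.outer(u, v)
rel_err = np.linalg.norm(B_hat - np.outer(u_true, v_true)) / np.linalg.norm(np.outer(u_true, v_true))
```

The same alternating structure extends to higher-order CP or Tucker coefficient tensors, which is the regime the book surveys.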
Tutorial on Amortized Optimization
Found. Trends Mach. Learn. | Pub Date: 2022-02-01 | DOI: 10.1561/9781638282099
Brandon Amos
Abstract: Optimization is a ubiquitous modeling tool and is often deployed in settings which repeatedly solve similar instances of the same problem. Amortized optimization methods use learning to predict the solutions to problems in these settings, exploiting the shared structure between similar problem instances. These methods have been crucial in variational inference and reinforcement learning and are capable of solving optimization problems many orders of magnitude faster than traditional optimization methods that do not use amortization. This tutorial presents an introduction to the amortized optimization foundations behind these advancements and overviews their applications in variational inference, sparse coding, gradient-based meta-learning, control, reinforcement learning, convex optimization, optimal transport, and deep equilibrium networks. The source code for this tutorial is available at https://github.com/facebookresearch/amortized-optimization-tutorial.
Citations: 7
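The idea of amortization can be sketched in a few lines: instead of solving each instance of min_x 0.5 x'Ax - b'x separately, train a model x = Wb to minimize the *average objective* over sampled instances, so that new instances are "solved" with a single matrix-vector product. This is a minimal illustration on assumed synthetic data, not code from the tutorial (whose actual source lives at the GitHub link above).

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5
# A fixed positive-definite problem family: f_b(x) = 0.5 x^T A x - b^T x,
# whose exact minimizer is A^{-1} b. (Hypothetical synthetic setup.)
M = rng.normal(size=(d, d))
A = M @ M.T + d * np.eye(d)

# Amortized model x = W b, trained by gradient descent on the mean
# objective over sampled instances b -- no per-instance solver calls.
W = np.zeros((d, d))
lr = 0.1 / np.linalg.norm(A, 2)
for _ in range(2000):
    B = rng.normal(size=(d, 32))        # a batch of problem instances b
    X = W @ B
    grad = (A @ X - B) @ B.T / 32       # gradient of the mean objective in W
    W -= lr * grad

b_test = rng.normal(size=d)
x_pred = W @ b_test                     # amortized "solve": one matvec
x_star = np.linalg.solve(A, b_test)     # exact solve, for comparison
```

Here the amortized model is exact in the limit (W converges to A^{-1}); with nonlinear objective families one replaces Wb by a neural network, which is the setting the tutorial develops.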
Machine Learning for Automated Theorem Proving: Learning to Solve SAT and QSAT
Found. Trends Mach. Learn. | Pub Date: 2021-10-26 | DOI: 10.1561/2200000081
S. Holden
Citations: 11
A Unifying Tutorial on Approximate Message Passing
Found. Trends Mach. Learn. | Pub Date: 2021-05-05 | DOI: 10.1561/2200000092
Oliver Y. Feng, R. Venkataramanan, Cynthia Rush, R. Samworth
Abstract: Over the last decade or so, Approximate Message Passing (AMP) algorithms have become extremely popular in various structured high-dimensional statistical problems. The fact that the origins of these techniques can be traced back to notions of belief propagation in the statistical physics literature lends a certain mystique to the area for many statisticians. Our goal in this work is to present the main ideas of AMP from a statistical perspective and to illustrate the power and flexibility of the AMP framework. Along the way, we strengthen and unify many of the results in the existing literature.
Citations: 44
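A minimal AMP iteration for sparse linear regression pairs a soft-thresholding denoiser with the Onsager correction term, the feature that distinguishes AMP from plain iterative thresholding and keeps the effective noise Gaussian. The sketch below is a textbook-style illustration under an assumed iid Gaussian design; the threshold rule and all parameters are illustrative choices, not taken from the paper.

```python
import numpy as np

def soft(x, t):
    """Soft-thresholding: the proximal operator of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

rng = np.random.default_rng(1)
n, p, k = 250, 500, 10                      # measurements, dimension, sparsity
A = rng.normal(size=(n, p)) / np.sqrt(n)    # iid Gaussian design (AMP's home turf)
x0 = np.zeros(p)
x0[rng.choice(p, k, replace=False)] = 3.0 * rng.normal(size=k)
y = A @ x0 + 0.01 * rng.normal(size=n)

x, z = np.zeros(p), y.copy()
alpha = 2.0                                 # threshold tuning parameter (illustrative)
for _ in range(30):
    r = x + A.T @ z                         # effective observation: x0 + Gaussian noise
    tau = alpha * np.linalg.norm(z) / np.sqrt(n)   # threshold set from residual size
    x_new = soft(r, tau)
    # Onsager correction: (z/n) times the divergence of the denoiser,
    # which for soft thresholding is the number of surviving coordinates.
    z = y - A @ x_new + (z / n) * np.count_nonzero(x_new)
    x = x_new
```

Dropping the Onsager term turns this into iterative soft thresholding, which converges noticeably more slowly and whose iterates no longer admit the Gaussian state-evolution description that the tutorial develops.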
Reinforcement Learning, Bit by Bit
Found. Trends Mach. Learn. | Pub Date: 2021-03-06 | DOI: 10.1561/2200000097
Xiuyuan Lu, Benjamin Van Roy, V. Dwaracherla, M. Ibrahimi, Ian Osband, Zheng Wen
Abstract: Reinforcement learning agents have demonstrated remarkable achievements in simulated environments. Data efficiency poses an impediment to carrying this success over to real environments. The design of data-efficient agents calls for a deeper understanding of information acquisition and representation. We discuss concepts and regret analysis that together offer principled guidance. This line of thinking sheds light on questions of what information to seek, how to seek that information, and what information to retain. To illustrate concepts, we design simple agents that build on them and present computational results that highlight data efficiency.
Citations: 51
Data Analytics on Graphs Part I: Graphs and Spectra on Graphs
Found. Trends Mach. Learn. | Pub Date: 2020-12-30 | DOI: 10.1561/2200000078-1 | Vol. 13, No. 1, pp. 1–157
L. Stanković, D. Mandic, M. Daković, M. Brajović, Bruno Scalzo, Shengxi Li, A. Constantinides
Abstract: The area of Data Analytics on graphs promises a paradigm shift, as we approach information processing of new classes of data which are typically acquired on irregular but structured domains (such as social networks and various ad-hoc sensor networks). Yet, despite the long history of Graph Theory, current approaches tend to focus on aspects of optimisation of graphs themselves rather than on eliciting strategies relevant to the objective application of the graph paradigm, such as detection, estimation, statistical and probabilistic inference, clustering, and separation of signals and data acquired on graphs. In order to bridge this gap, we first revisit graph topologies from a Data Analytics point of view, to establish a taxonomy of graph networks through a linear algebraic formalism of graph topology (vertices, connections, directivity). This serves as a basis for spectral […]
Citations: 23
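The linear-algebraic formalism the monograph builds on starts from the adjacency matrix, the degree matrix, and the graph Laplacian, whose spectrum already encodes basic topology (for instance, the multiplicity of the zero eigenvalue equals the number of connected components). A small self-contained illustration; the example graph is hypothetical.

```python
import numpy as np

# Adjacency matrix of a small undirected graph with two connected
# components: a triangle on vertices {0,1,2} and an edge {3,4}.
A = np.zeros((5, 5))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4)]:
    A[i, j] = A[j, i] = 1.0

D = np.diag(A.sum(axis=1))        # degree matrix
L = D - A                         # combinatorial graph Laplacian
evals, evecs = np.linalg.eigh(L)  # Laplacian spectrum (real, non-negative)

# Zero-eigenvalue multiplicity counts the connected components.
n_components = int(np.sum(np.isclose(evals, 0.0)))
```

For this graph the spectrum is {0, 0, 2, 3, 3}: the two zeros reflect the two components, and the eigenvectors of L are the graph Fourier basis used throughout the series.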
Data Analytics on Graphs Part III: Machine Learning on Graphs, from Graph Topology to Applications
Found. Trends Mach. Learn. | Pub Date: 2020-12-30 | DOI: 10.1561/2200000078-3
L. Stanković, D. Mandic, M. Daković, M. Brajović, Bruno Scalzo, Shengxi Li, A. Constantinides
Citations: 34
Data Analytics on Graphs Part II: Signals on Graphs
Found. Trends Mach. Learn. | Pub Date: 2020-12-30 | DOI: 10.1561/2200000078-2
L. Stanković, D. Mandic, M. Daković, M. Brajović, Bruno Scalzo, Shengxi Li, A. Constantinides
Citations: 24
Spectral Methods for Data Science: A Statistical Perspective
Found. Trends Mach. Learn. | Pub Date: 2020-12-15 | DOI: 10.1561/2200000079
Yuxin Chen, Yuejie Chi, Jianqing Fan, Cong Ma
Abstract: Spectral methods have emerged as a simple yet surprisingly effective approach for extracting information from massive, noisy, and incomplete data. In a nutshell, spectral methods refer to a collection of algorithms built upon the eigenvalues (resp. singular values) and eigenvectors (resp. singular vectors) of some properly designed matrices constructed from data. A diverse array of applications have been found in machine learning, data science, and signal processing. Due to their simplicity and effectiveness, spectral methods are not only used as a stand-alone estimator, but also frequently employed to initialize other more sophisticated algorithms to improve performance. While the studies of spectral methods can be traced back to classical matrix perturbation theory and methods of moments, the past decade has witnessed tremendous theoretical advances in demystifying their efficacy through the lens of statistical modeling, with the aid of non-asymptotic random matrix theory. This monograph aims to present a systematic, comprehensive, yet accessible introduction to spectral methods from a modern statistical perspective, highlighting their algorithmic implications in diverse large-scale applications. In particular, our exposition gravitates around several central questions that span various applications: how to characterize the sample efficiency of spectral methods in reaching a target level of statistical accuracy, and how to assess their stability in the face of random noise, missing data, and adversarial corruptions? In addition to conventional $\ell_2$ perturbation analysis, we present a systematic $\ell_\infty$ and $\ell_{2,\infty}$ perturbation theory for eigenspaces and singular subspaces, which has only recently become available owing to a powerful "leave-one-out" analysis framework.
Citations: 111
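The basic mechanism, a leading eigenvector of a data matrix serving as an estimator whose accuracy degrades gracefully with noise, can be sketched in the rank-1 "signal plus Wigner noise" model. This toy setup and all its parameters are illustrative assumptions, not an excerpt from the monograph.

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam = 400, 8.0                       # dimension and signal strength (hypothetical)

u = rng.normal(size=n)
u /= np.linalg.norm(u)                  # planted unit signal vector

# Symmetric Gaussian noise scaled so its operator norm is O(1)
W = rng.normal(size=(n, n))
W = (W + W.T) / np.sqrt(2 * n)

M = lam * np.outer(u, u) + W            # observed matrix: low-rank signal + noise

evals, evecs = np.linalg.eigh(M)
u_hat = evecs[:, -1]                    # spectral estimator: leading eigenvector

# Alignment |<u_hat, u>| approaches 1 as lam grows above the noise level
alignment = abs(u_hat @ u)
```

Quantifying exactly how `alignment` depends on the signal-to-noise ratio, and what happens entrywise rather than in $\ell_2$, is the kind of question the monograph's perturbation theory answers.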
Divided Differences, Falling Factorials, and Discrete Splines: Another Look at Trend Filtering and Related Problems
Found. Trends Mach. Learn. | Pub Date: 2020-03-09 | DOI: 10.1561/2200000099
R. Tibshirani
Abstract: This paper serves as a postscript of sorts to Tibshirani (2014) and Wang et al. (2014), who developed continuous-time formulations and properties of trend filtering, a discrete-time smoothing tool proposed (independently) by Steidl et al. (2006) and Kim et al. (2009). The central object of study is the falling factorial basis, as it was called in those works. Its span turns out to be a space of piecewise polynomials that has a classical place in spline theory, called discrete splines (Mangasarian and Schumaker, 1971, 1973; Schumaker, 2007). At the time of Tibshirani (2014) and Wang et al. (2014), we were not fully aware of these connections. The current paper attempts to rectify this by making these connections explicit, reviewing (and making use of) some of the important existing work on discrete splines, and contributing several new perspectives and new results on discrete splines along the way.
Citations: 13
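The discrete difference operators at the heart of trend filtering are easy to construct explicitly: the k-th order operator is a composition of first differences, and a piecewise polynomial of degree k-1 has sparse k-th differences, which is exactly the sparsity the l1 trend-filtering penalty exploits. A small sketch on a unit-spaced grid; the signal and knot location are hypothetical.

```python
import numpy as np

def diff_matrix(n, k):
    """k-th order discrete difference operator on a length-n unit grid,
    built by composing first-difference matrices."""
    D = np.eye(n)
    for _ in range(k):
        m = D.shape[0]
        D = (np.eye(m - 1, m, 1) - np.eye(m - 1, m)) @ D  # one first difference
    return D

# A continuous piecewise-linear signal with a single knot at t = 10
t = np.arange(20, dtype=float)
x = np.where(t < 10, t, 2 * t - 10)

D2 = diff_matrix(20, 2)
d = D2 @ x
# Second differences vanish away from the knot: ||D2 x||_0 = 1 here,
# so the trend-filtering penalty ||D2 x||_1 is minimized by such signals.
```

Trend filtering itself solves min_x 0.5||y - x||_2^2 + lam * ||D2 x||_1; the paper's contribution is relating the null space and factorizations of such operators to the falling factorial basis and discrete splines.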