Iterative Collaborative Filtering for Sparse Noisy Tensor Estimation

D. Shah, C. Yu
{"title":"Iterative Collaborative Filtering for Sparse Noisy Tensor Estimation","authors":"D. Shah, C. Yu","doi":"10.1109/ALLERTON.2019.8919933","DOIUrl":null,"url":null,"abstract":"We consider the task of tensor estimation, i.e. estimating a low-rank 3-order $n \\times n \\times n$ tensor from noisy observations of randomly chosen entries in the sparse regime. In the context of matrix (2-order tensor) estimation, a variety of algorithms have been proposed and analyzed in the literature including the popular collaborative filtering algorithm that is extremely well utilized in practice. However, in the context of tensor estimation, there is limited progress. No natural extensions of collaborative filtering are known beyond “flattening” the tensor into a matrix and applying standard collaborative filtering. As the main contribution of this work, we introduce a generalization of the collaborative filtering algorithm for the setting of tensor estimation and argue that it achieves sample complexity that (nearly) matches the conjectured lower bound on the sample complexity. Interestingly, our generalization uses the matrix obtained from the “flattened” tensor to compute similarity as in the classical collaborative filtering but by defining a novel “graph” using it. The algorithm recovers the tensor with mean-squared-error (MSE) decaying to 0 as long as each entry is observed independently with probability $p= \\Omega(n^{-3/2+\\epsilon})$ for any arbitrarily small $\\epsilon > 0$. It turns out that $p = \\Omega(n^{-3/2})$ is the conjectured lower bound as well as “connectivity threshold” of graph considered to compute similarity in our algorithm.","PeriodicalId":120479,"journal":{"name":"2019 57th Annual Allerton Conference on Communication, Control, and Computing (Allerton)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"14","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 57th Annual Allerton Conference on Communication, Control, and Computing (Allerton)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ALLERTON.2019.8919933","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 14

Abstract

We consider the task of tensor estimation, i.e., estimating a low-rank 3-order $n \times n \times n$ tensor from noisy observations of randomly chosen entries in the sparse regime. In the context of matrix (2-order tensor) estimation, a variety of algorithms have been proposed and analyzed in the literature, including the popular collaborative filtering algorithm that is widely used in practice. In the context of tensor estimation, however, progress has been limited: no natural extensions of collaborative filtering are known beyond “flattening” the tensor into a matrix and applying standard collaborative filtering. As the main contribution of this work, we introduce a generalization of the collaborative filtering algorithm to the setting of tensor estimation and argue that it achieves sample complexity that (nearly) matches the conjectured lower bound. Interestingly, our generalization uses the matrix obtained from the “flattened” tensor to compute similarity, as in classical collaborative filtering, but does so by defining a novel “graph” on that matrix. The algorithm recovers the tensor with mean-squared error (MSE) decaying to 0 as long as each entry is observed independently with probability $p = \Omega(n^{-3/2+\epsilon})$ for any arbitrarily small $\epsilon > 0$. It turns out that $p = \Omega(n^{-3/2})$ is both the conjectured lower bound and the “connectivity threshold” of the graph our algorithm uses to compute similarity.
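The following is a minimal, illustrative sketch (not the paper's exact algorithm) of this flatten-then-compare idea in Python/NumPy: each mode-1 slice of the tensor is vectorized into a row of an $n \times n^2$ matrix, row similarities are computed from commonly observed coordinates, and missing entries of a slice are filled in by averaging observations from its most similar slices. The function name `estimate_tensor` and the parameters `num_nbrs` and `sim_threshold` are hypothetical, and the inner-product similarity below is a crude stand-in for the graph-based similarity developed in the paper.

```python
import numpy as np

def estimate_tensor(T_obs, mask, num_nbrs=10, sim_threshold=0.0):
    """
    Minimal collaborative-filtering-style sketch for tensor estimation.
    T_obs: (n, n, n) array with observed noisy entries (0 elsewhere).
    mask:  (n, n, n) boolean array, True where an entry was observed.
    num_nbrs and sim_threshold are hypothetical tuning parameters.
    """
    n = T_obs.shape[0]
    # Flatten along the first mode: row i is the vectorized slice T[i, :, :].
    M = T_obs.reshape(n, n * n)
    W = mask.reshape(n, n * n)

    # Crude pairwise similarity between rows: average product over commonly
    # observed coordinates (a stand-in for the paper's graph-based similarity).
    sim = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            common = W[i] & W[j]
            if common.any():
                sim[i, j] = (M[i, common] * M[j, common]).mean()

    # Estimate each slice by averaging observed entries of its most similar slices.
    T_hat = np.zeros_like(T_obs, dtype=float)
    for i in range(n):
        nbrs = [j for j in np.argsort(sim[i])[::-1][:num_nbrs]
                if sim[i, j] > sim_threshold]
        if not nbrs:
            nbrs = [i]
        counts = W[nbrs].sum(axis=0)            # observations per coordinate
        sums = M[nbrs].sum(axis=0)               # unobserved entries are 0
        row_est = np.divide(sums, counts,
                            out=np.zeros(n * n), where=counts > 0)
        T_hat[i] = row_est.reshape(n, n)
    return T_hat
```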