Turbo-SMT: Accelerating Coupled Sparse Matrix-Tensor Factorizations by 200×.

Evangelos E Papalexakis, Christos Faloutsos, Tom M Mitchell, Partha Pratim Talukdar, Nicholas D Sidiropoulos, Brian Murphy
{"title":"Turbo-SMT: Accelerating Coupled Sparse Matrix-Tensor Factorizations by 200×.","authors":"Evangelos E Papalexakis,&nbsp;Christos Faloutsos,&nbsp;Tom M Mitchell,&nbsp;Partha Pratim Talukdar,&nbsp;Nicholas D Sidiropoulos,&nbsp;Brian Murphy","doi":"10.1137/1.9781611973440.14","DOIUrl":null,"url":null,"abstract":"<p><p>How can we correlate the neural activity in the human brain as it responds to typed words, with properties of these terms (like 'edible', 'fits in hand')? In short, we want to find latent variables, that jointly explain both the brain activity, as well as the behavioral responses. This is one of many settings of the <i>Coupled Matrix-Tensor Factorization</i> (CMTF) problem. Can we accelerate <i>any</i> CMTF solver, so that it runs within a few minutes instead of tens of hours to a day, while maintaining good accuracy? We introduce TURBO-SMT, a meta-method capable of doing exactly that: it boosts the performance of <i>any</i> CMTF algorithm, by up to <i>200</i>×, along with an up to <i>65 fold</i> increase in sparsity, with comparable accuracy to the baseline. We apply TURBO-SMT to BRAINQ, a dataset consisting of a (nouns, brain voxels, human subjects) tensor and a (nouns, properties) matrix, with coupling along the nouns dimension. TURBO-SMT is able to find meaningful latent variables, as well as to predict brain activity with competitive accuracy.</p>","PeriodicalId":74533,"journal":{"name":"Proceedings of the ... SIAM International Conference on Data Mining. SIAM International Conference on Data Mining","volume":"2014 ","pages":"118-126"},"PeriodicalIF":0.0000,"publicationDate":"2014-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1137/1.9781611973440.14","citationCount":"58","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the ... SIAM International Conference on Data Mining. SIAM International Conference on Data Mining","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1137/1.9781611973440.14","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 58

Abstract

How can we correlate the neural activity in the human brain, as it responds to typed words, with properties of these terms (like 'edible', 'fits in hand')? In short, we want to find latent variables that jointly explain both the brain activity and the behavioral responses. This is one of many settings of the Coupled Matrix-Tensor Factorization (CMTF) problem. Can we accelerate any CMTF solver so that it runs within a few minutes instead of tens of hours to a day, while maintaining good accuracy? We introduce TURBO-SMT, a meta-method capable of doing exactly that: it boosts the performance of any CMTF algorithm by up to 200×, along with an up to 65-fold increase in sparsity, with accuracy comparable to the baseline. We apply TURBO-SMT to BRAINQ, a dataset consisting of a (nouns, brain voxels, human subjects) tensor and a (nouns, properties) matrix, coupled along the nouns dimension. TURBO-SMT is able to find meaningful latent variables, as well as to predict brain activity with competitive accuracy.
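
For context, the coupling described above corresponds to the usual CMTF formulation; the sketch below uses illustrative notation (the tensor X, matrix Y, factor matrices A, B, C, D, and rank R are our labels, not taken from the paper). The (nouns × voxels × subjects) tensor X is approximated by a rank-R CP decomposition and the (nouns × properties) matrix Y by a low-rank factorization, with the noun-mode factor A shared between the two:

\[
\min_{\mathbf{A},\mathbf{B},\mathbf{C},\mathbf{D}}\;
\Big\|\underline{\mathbf{X}} - \sum_{r=1}^{R}\mathbf{a}_r \circ \mathbf{b}_r \circ \mathbf{c}_r\Big\|_F^2
\;+\;
\big\|\mathbf{Y} - \mathbf{A}\mathbf{D}^{\top}\big\|_F^2 ,
\]

where ∘ denotes the vector outer product and a_r, b_r, c_r, d_r are the r-th columns of A (nouns), B (voxels), C (subjects), and D (properties). The rows of A are the shared latent variables the abstract refers to; a solver for an objective of this form is the kind of CMTF algorithm that TURBO-SMT accelerates.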

