Robust priors for regularized regression

Sebastian Bobadilla-Suarez, Matt Jones, Bradley C. Love
Cognitive Psychology, Volume 132, Article 101444 (published 2022-02-01)
DOI: 10.1016/j.cogpsych.2021.101444
Open access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8903146/pdf/
Impact factor: 3.0 · JCR Q1 (Psychology) · Citations: 1

Abstract

Induction benefits from useful priors. Penalized regression approaches, like ridge regression, shrink weights toward zero but zero association is usually not a sensible prior. Inspired by simple and robust decision heuristics humans use, we constructed non-zero priors for penalized regression models that provide robust and interpretable solutions across several tasks. Our approach enables estimates from a constrained model to serve as a prior for a more general model, yielding a principled way to interpolate between models of differing complexity. We successfully applied this approach to a number of decision and classification problems, as well as analyzing simulated brain imaging data. Models with robust priors had excellent worst-case performance. Solutions followed from the form of the heuristic that was used to derive the prior. These new algorithms can serve applications in data analysis and machine learning, as well as help in understanding how people transition from novice to expert performance.
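The core idea in the abstract, shrinking regression weights toward a non-zero prior derived from a constrained model rather than toward zero, can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the function name, the toy data, and the choice of an equal-weights heuristic as the constrained model are assumptions for the example.

```python
import numpy as np

def ridge_nonzero_prior(X, y, w0, lam):
    """Ridge regression that shrinks weights toward a prior w0 instead of zero.

    Minimizes ||y - X w||^2 + lam * ||w - w0||^2, whose closed-form
    solution is w = (X'X + lam*I)^{-1} (X'y + lam*w0).
    """
    n_features = X.shape[1]
    A = X.T @ X + lam * np.eye(n_features)
    b = X.T @ y + lam * w0
    return np.linalg.solve(A, b)

# Toy data: the true weights are all positive and roughly equal,
# so an equal-weighting heuristic is a sensible constrained model.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))
w_true = np.array([1.0, 0.9, 1.1, 1.0])
y = X @ w_true + rng.normal(scale=0.5, size=50)

# Constrained model: fit a single shared coefficient for the summed
# cues, then use it as an equal-weights prior for the general model.
shared = np.linalg.lstsq(X.sum(axis=1, keepdims=True), y, rcond=None)[0]
w0 = np.full(4, shared[0])

# Under heavy regularization, the conventional zero prior drags all
# weights toward 0, while the heuristic prior keeps them near the
# shared value estimated by the constrained model.
w_zero = ridge_nonzero_prior(X, y, np.zeros(4), lam=100.0)
w_heur = ridge_nonzero_prior(X, y, w0, lam=100.0)
```

Varying `lam` interpolates between the two models, as the abstract describes: at large `lam` the solution collapses onto the constrained (heuristic) model, and at `lam = 0` it reduces to ordinary least squares.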


Source journal: Cognitive Psychology (Medicine – Psychology)
CiteScore: 5.40
Self-citation rate: 3.80%
Articles per year: 29
Review time: 50 days
Journal description: Cognitive Psychology is concerned with advances in the study of attention, memory, language processing, perception, problem solving, and thinking. Cognitive Psychology specializes in extensive articles that have a major impact on cognitive theory and provide new theoretical advances. Research areas include: artificial intelligence, developmental psychology, linguistics, neurophysiology, and social psychology.