Bayesian regularization in multiple-indicators multiple-causes models.

Lijin Zhang, Xinya Liang
Psychological Methods, 2024-08-01 (Epub 2023-07-27), pp. 679-703. DOI: 10.1037/met0000594
Impact Factor 7.6 · JCR Q1, Psychology, Multidisciplinary (CAS Region 1, Psychology)

Abstract

Integrating regularization methods into structural equation modeling is gaining increasing popularity. The purpose of regularization is to improve variable selection, model estimation, and prediction accuracy. In this study, we aim to: (a) compare Bayesian regularization methods for exploring covariate effects in multiple-indicators multiple-causes models, (b) examine the sensitivity of results to hyperparameter settings of penalty priors, and (c) investigate prediction accuracy through cross-validation. The Bayesian regularization methods examined included: ridge, lasso, adaptive lasso, spike-and-slab prior (SSP) and its variants, and horseshoe and its variants. Sparse solutions were developed for the structural coefficient matrix that contained only a small portion of nonzero path coefficients characterizing the effects of selected covariates on the latent variable. Results from the simulation study showed that compared to diffuse priors, penalty priors were advantageous in handling small sample sizes and collinearity among covariates. Priors with only the global penalty (ridge and lasso) yielded higher model convergence rates and power, whereas priors with both the global and local penalties (horseshoe and SSP) provided more accurate parameter estimates for medium and large covariate effects. The horseshoe and SSP improved accuracy in predicting factor scores, while achieving more parsimonious models. (PsycInfo Database Record (c) 2024 APA, all rights reserved).
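The global-versus-local penalty distinction in the abstract can be made concrete with a small sketch. The prior forms below follow the standard definitions of these penalty priors; the hyperparameter values (`tau`, `pi`, the spike/slab scales) are illustrative assumptions, not the settings examined in the study.

```python
import numpy as np
from scipy import stats

# Hypothetical sketch (not the authors' code): log-densities of the penalty
# priors compared in the study, evaluated for one structural coefficient beta.

def ridge_logpdf(beta, tau=1.0):
    # Ridge: normal prior -> a single global (quadratic) penalty.
    return stats.norm.logpdf(beta, scale=tau)

def lasso_logpdf(beta, tau=1.0):
    # Lasso: Laplace (double-exponential) prior -> a single global L1 penalty.
    return stats.laplace.logpdf(beta, scale=tau)

def ssp_logpdf(beta, pi=0.5, spike=0.01, slab=1.0):
    # Spike-and-slab: mixture of a narrow "spike" normal (near-zero effects)
    # and a wide "slab" normal (nonzero effects); pi is the spike weight.
    return np.logaddexp(np.log(pi) + stats.norm.logpdf(beta, scale=spike),
                        np.log(1.0 - pi) + stats.norm.logpdf(beta, scale=slab))

def horseshoe_logpdf(beta, tau=1.0, n_mc=200_000, seed=0):
    # Horseshoe: beta | lambda ~ N(0, (lambda*tau)^2), lambda ~ HalfCauchy(0, 1).
    # The local scale lambda gives each coefficient its own penalty; the
    # marginal has no closed form, so integrate lambda out by Monte Carlo.
    rng = np.random.default_rng(seed)
    lam = np.abs(rng.standard_cauchy(n_mc))
    return np.log(stats.norm.pdf(beta, scale=lam * tau).mean())

# Heavier tails mean less shrinkage of large coefficients, consistent with
# the finding that priors with local penalties (horseshoe, SSP) estimate
# medium and large covariate effects more accurately.
print(horseshoe_logpdf(5.0) > lasso_logpdf(5.0) > ridge_logpdf(5.0))  # True
```

A global-only prior applies the same shrinkage profile to every coefficient, while the local scale in the horseshoe (or the spike/slab mixture in SSP) lets small effects be shrunk hard toward zero and large effects escape shrinkage.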

Source journal: Psychological Methods (CiteScore 13.10; self-citation rate 7.10%; 159 articles published per year)
Journal description: Psychological Methods is devoted to the development and dissemination of methods for collecting, analyzing, understanding, and interpreting psychological data. Its purpose is the dissemination of innovations in research design, measurement, methodology, and quantitative and qualitative analysis to the psychological community; its further purpose is to promote effective communication about related substantive and methodological issues. The audience is expected to be diverse and to include those who develop new procedures, those who are responsible for undergraduate and graduate training in design, measurement, and statistics, as well as those who employ those procedures in research.