Modified Regularization for High-dimensional Data Decomposition

Sheng Chai, W. Feng, Hossam S. Hassanein
{"title":"Modified Regularization for High-dimensional Data Decomposition","authors":"Sheng Chai, W. Feng, Hossam S. Hassanein","doi":"10.1109/WI-IAT55865.2022.00113","DOIUrl":null,"url":null,"abstract":"With the increased dimensionality of datasets, high-dimensional data decomposition models have become essential data analysis tools. However, the decomposition method usually suffers from the overfitting problem and, consequently, cannot achieve state-of-the-art performance. This motivates the introduction of various regularization terms. The commonly applied Ridge regression has limited applicability for the asperity dataset and reduces performance for sparse data, while the Lasso regression has higher efficiency in the sparse dataset. To address this challenge, we propose a modified regularization term designed by integrating both the Lasso and Ridge regressions. The different roles of these two regressions are analyzed. By adjusting the weights of the regression in the regularization term, the existing decomposition method can be applied to the dataset with different degrees of sparsity. The experiments show that the modified regularization term yields consistent improvement in the performance of existing benchmarks.","PeriodicalId":345445,"journal":{"name":"2022 IEEE/WIC/ACM International Joint Conference on Web Intelligence and Intelligent Agent Technology (WI-IAT)","volume":"27 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE/WIC/ACM International Joint Conference on Web Intelligence and Intelligent Agent Technology (WI-IAT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/WI-IAT55865.2022.00113","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

With the increased dimensionality of datasets, high-dimensional data decomposition models have become essential data analysis tools. However, decomposition methods usually suffer from overfitting and, consequently, cannot achieve state-of-the-art performance. This motivates the introduction of various regularization terms. The commonly applied Ridge regression has limited applicability to asperity datasets and reduces performance on sparse data, while Lasso regression is more efficient on sparse datasets. To address this challenge, we propose a modified regularization term that integrates the Lasso and Ridge regressions. The different roles of these two regressions are analyzed. By adjusting the weights of the two regressions in the regularization term, the existing decomposition method can be applied to datasets with different degrees of sparsity. Experiments show that the modified regularization term yields consistent improvements in the performance of existing benchmark methods.
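
As a concrete illustration of the idea, the sketch below applies a weighted combination of an L1 (Lasso-style) and an L2 (Ridge-style) penalty to a simple low-rank matrix decomposition, so that a single weight moves the regularizer between the sparse and non-sparse regimes the abstract contrasts. This is not the authors' implementation; the factorization model and the parameter names (rank, lam, alpha, lr, iters) are illustrative assumptions.

```python
# A minimal sketch, assuming a low-rank factorization X ~ U V^T as the
# decomposition model; NOT the paper's method, parameters are illustrative.
import numpy as np


def decompose(X, rank=5, lam=0.1, alpha=0.5, lr=1e-3, iters=2000, seed=0):
    """Factor X (m x n) into U (m x rank) and V (n x rank).

    Regularizer: lam * (alpha * ||.||_1 + (1 - alpha) * ||.||_2^2),
    so alpha = 1 gives a pure Lasso penalty and alpha = 0 pure Ridge.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = 0.1 * rng.standard_normal((m, rank))
    V = 0.1 * rng.standard_normal((n, rank))
    for _ in range(iters):
        R = U @ V.T - X  # reconstruction residual
        # Gradient of 0.5 * ||R||_F^2 plus the Ridge term; the L1 term
        # is handled with its subgradient, sign(.).
        gU = R @ V + lam * (2.0 * (1.0 - alpha) * U + alpha * np.sign(U))
        gV = R.T @ U + lam * (2.0 * (1.0 - alpha) * V + alpha * np.sign(V))
        U -= lr * gU
        V -= lr * gV
    return U, V


if __name__ == "__main__":
    # Usage: a small synthetic low-rank matrix with additive noise.
    rng = np.random.default_rng(1)
    X = rng.standard_normal((100, 5)) @ rng.standard_normal((5, 80))
    X += 0.01 * rng.standard_normal(X.shape)
    U, V = decompose(X, rank=5, alpha=0.7)
    print("relative error:", np.linalg.norm(U @ V.T - X) / np.linalg.norm(X))
```

Under these assumptions, alpha = 1 recovers a pure Lasso penalty, alpha = 0 a pure Ridge penalty, and intermediate values trade the two off for datasets with intermediate degrees of sparsity.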