Dimensionality Reduction Procedure for Bigdata in Machine Learning Techniques

K. U. Kiran, D. Srikanth, P. Nair, S. Hasane Ahammad, K. Saikumar
{"title":"Dimensionality Reduction Procedure for Bigdata in Machine Learning Techniques","authors":"K. U. Kiran, D. Srikanth, P. Nair, S. Hasane Ahammad, K. Saikumar","doi":"10.1109/ICCMC53470.2022.9754014","DOIUrl":null,"url":null,"abstract":"In the present field of software applications, the prominently employed parameters for parameters control are the kinds of models such as cloud computing, machine learning, and big data analytics. So, in the current scenario, these are in high demand and are on-line with the trends for future decades as well. Nevertheless, as mentioned earlier, these models can access very low data and process speed. It is well known that the storage equipment’s for day-to-day monitoring serves at a higher cost and has hardware complexity, further leading towards rapid increment in dimensionality. Therefore, for the higher rate of dimensional data, the optimization approach of any variety would consume time to a greater extent. The concern issues are mostly related to the dimensionality with high data space instead of the low data space. A dimensional dropped approach is proposed in this paper in combinational with the Logistic regression (L.R.) version. The proposed technique is well known and applicable for the problems of clustering and dimension reduction. 
The size of the dimensional data to the LRML method has diminished, and the efficiency achieved at the rate of 95.5% and the reduction ratio is 34.89%.","PeriodicalId":345346,"journal":{"name":"2022 6th International Conference on Computing Methodologies and Communication (ICCMC)","volume":"128 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-03-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 6th International Conference on Computing Methodologies and Communication (ICCMC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCMC53470.2022.9754014","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 3

Abstract

In today's software landscape, the models most prominently employed for parameter control are cloud computing, machine learning, and big data analytics. These models are in high demand and aligned with the trends of the coming decades. Nevertheless, they often suffer from low data-access and processing speeds. Storage equipment for day-to-day monitoring is costly and hardware-complex, and it further contributes to a rapid increase in dimensionality. For high-dimensional data, an optimization approach of any variety therefore consumes considerably more time. The concerns relate mostly to high-dimensional rather than low-dimensional data spaces. This paper proposes a dimensionality-reduction approach combined with logistic regression (L.R.). The proposed technique is applicable to clustering and dimension-reduction problems. With the LRML method, the size of the dimensional data is diminished: an efficiency of 95.5% is achieved at a reduction ratio of 34.89%.
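The abstract does not specify the LRML algorithm's internals, but the general pattern it describes, reducing the dimensionality of the data first and then fitting a logistic regression classifier on the reduced representation, can be sketched as follows. This is a minimal illustration under assumed choices (PCA as the reduction step, gradient-descent logistic regression, synthetic data), not the authors' implementation; the 95.5% efficiency and 34.89% reduction ratio reported in the paper apply to their method and data, not to this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic high-dimensional data: a 2-D latent signal embedded in 50 dims.
n, d, k = 400, 50, 5
latent = rng.normal(size=(n, 2))
y = (latent[:, 0] + latent[:, 1] > 0).astype(float)
X = latent @ rng.normal(size=(2, d)) + 0.1 * rng.normal(size=(n, d))

# --- Step 1: dimensionality reduction via PCA (SVD of centered data) ---
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:k].T                # project onto the top-k principal components

# --- Step 2: logistic regression by gradient descent on the reduced data ---
w = np.zeros(k)
b = 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(Z @ w + b)))   # sigmoid predictions
    w -= 0.5 * (Z.T @ (p - y) / n)           # gradient step on weights
    b -= 0.5 * (p - y).mean()                # gradient step on bias

accuracy = ((p > 0.5) == y).mean()
reduction_ratio = 1 - k / d
print(f"accuracy: {accuracy:.1%}, reduction ratio: {reduction_ratio:.1%}")
```

Because the classifier is trained on k components rather than all d features, both storage and per-iteration cost shrink by the reduction ratio, which is the trade-off the paper quantifies.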