Algorithms of the Möbius function by random forests and neural networks

IF 8.6 · CAS Zone 2 (Computer Science) · JCR Q1 (COMPUTER SCIENCE, THEORY & METHODS)
Huan Qin, Yangbo Ye
{"title":"随机森林和神经网络的莫比乌斯函数算法","authors":"Huan Qin, Yangbo Ye","doi":"10.1186/s40537-024-00889-7","DOIUrl":null,"url":null,"abstract":"<p>The Möbius function <span>\\(\\mu (n)\\)</span> is known for containing limited information on the prime factorization of <i>n</i>. Its known algorithms, however, are all based on factorization and hence are exponentially slow on <span>\\(\\log n\\)</span>. Consequently, a faster algorithm of <span>\\(\\mu (n)\\)</span> could potentially lead to a fast algorithm of prime factorization which in turn would throw doubt upon the security of most public-key cryptosystems. This research introduces novel approaches to compute <span>\\(\\mu (n)\\)</span> using random forests and neural networks, harnessing the additive properties of <span>\\(\\mu (n)\\)</span>. The machine learning models are trained on a substantial dataset with 317,284 observations (80%), comprising five feature variables, including values of <i>n</i> within the range of <span>\\(4\\times 10^9\\)</span>. We implement the Random Forest with Random Inputs (RFRI) and Feedforward Neural Network (FNN) architectures. The RFRI model achieves a predictive accuracy of 0.9493, a recall of 0.5865, and a precision of 0.6626. On the other hand, the FNN model attains a predictive accuracy of 0.7871, a recall of 0.9477, and a precision of 0.2784. These results strongly support the effectiveness and validity of the proposed algorithms.</p>","PeriodicalId":15158,"journal":{"name":"Journal of Big Data","volume":"93 1","pages":""},"PeriodicalIF":8.6000,"publicationDate":"2024-02-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Algorithms of the Möbius function by random forests and neural networks\",\"authors\":\"Huan Qin, Yangbo Ye\",\"doi\":\"10.1186/s40537-024-00889-7\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>The Möbius function <span>\\\\(\\\\mu (n)\\\\)</span> is known for containing limited information on the prime factorization of <i>n</i>. Its known algorithms, however, are all based on factorization and hence are exponentially slow on <span>\\\\(\\\\log n\\\\)</span>. Consequently, a faster algorithm of <span>\\\\(\\\\mu (n)\\\\)</span> could potentially lead to a fast algorithm of prime factorization which in turn would throw doubt upon the security of most public-key cryptosystems. This research introduces novel approaches to compute <span>\\\\(\\\\mu (n)\\\\)</span> using random forests and neural networks, harnessing the additive properties of <span>\\\\(\\\\mu (n)\\\\)</span>. The machine learning models are trained on a substantial dataset with 317,284 observations (80%), comprising five feature variables, including values of <i>n</i> within the range of <span>\\\\(4\\\\times 10^9\\\\)</span>. We implement the Random Forest with Random Inputs (RFRI) and Feedforward Neural Network (FNN) architectures. The RFRI model achieves a predictive accuracy of 0.9493, a recall of 0.5865, and a precision of 0.6626. On the other hand, the FNN model attains a predictive accuracy of 0.7871, a recall of 0.9477, and a precision of 0.2784. 
These results strongly support the effectiveness and validity of the proposed algorithms.</p>\",\"PeriodicalId\":15158,\"journal\":{\"name\":\"Journal of Big Data\",\"volume\":\"93 1\",\"pages\":\"\"},\"PeriodicalIF\":8.6000,\"publicationDate\":\"2024-02-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Big Data\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.1186/s40537-024-00889-7\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, THEORY & METHODS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Big Data","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1186/s40537-024-00889-7","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, THEORY & METHODS","Score":null,"Total":0}
引用次数: 0

Abstract

The Möbius function \(\mu (n)\) is known for containing limited information on the prime factorization of n. Its known algorithms, however, are all based on factorization and hence are exponentially slow in \(\log n\). Consequently, a faster algorithm for \(\mu (n)\) could potentially lead to a fast algorithm for prime factorization, which in turn would throw doubt upon the security of most public-key cryptosystems. This research introduces novel approaches to computing \(\mu (n)\) using random forests and neural networks, harnessing the additive properties of \(\mu (n)\). The machine learning models are trained on a substantial dataset of 317,284 observations (80% of the data), comprising five feature variables, with values of n up to \(4\times 10^9\). We implement the Random Forest with Random Inputs (RFRI) and Feedforward Neural Network (FNN) architectures. The RFRI model achieves a predictive accuracy of 0.9493, a recall of 0.5865, and a precision of 0.6626; the FNN model attains a predictive accuracy of 0.7871, a recall of 0.9477, and a precision of 0.2784. These results strongly support the effectiveness and validity of the proposed algorithms.
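
The abstract does not spell out the feature set or training pipeline, so the following is only a minimal sketch of the general idea: label a range of n with the slow factorization-based \(\mu (n)\), extract simple numeric features, and fit a random forest that predicts \(\mu (n) \in \{-1, 0, 1\}\). The feature function below is a hypothetical placeholder (residues of n modulo small primes), not the paper's five feature variables, and sympy and scikit-learn are assumed to be available.

```python
# Minimal sketch (not the authors' pipeline): label a range of n with the slow
# factorization-based mu(n), build illustrative features, and fit a random forest.
import numpy as np
from sympy import factorint                      # factorization-based baseline
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def mobius(n: int) -> int:
    """Classical mu(n) via prime factorization: 0 if n is not squarefree,
    otherwise (-1) raised to the number of distinct prime factors."""
    factors = factorint(n)
    if any(e > 1 for e in factors.values()):
        return 0
    return -1 if len(factors) % 2 else 1

def features(n: int) -> list:
    """Hypothetical feature vector; the paper's actual five features differ."""
    return [n % p for p in (2, 3, 5, 7, 11)]

# Label a modest training range with the slow baseline (the paper goes up to 4e9).
ns = range(2, 100_000)
X = np.array([features(n) for n in ns])
y = np.array([mobius(n) for n in ns])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# max_features="sqrt" gives the per-split random feature subset of RFRI-style forests.
clf = RandomForestClassifier(n_estimators=100, max_features="sqrt", random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

In this setup, the reported accuracy, recall, and precision would come from evaluating the trained classifier on the held-out 20% of labeled values; the per-split random feature subset (max_features below the total feature count) is what makes the forest "random inputs" in the Breiman sense.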

Source journal

Journal of Big Data (Computer Science - Information Systems)
CiteScore: 17.80
Self-citation rate: 3.70%
Articles per year: 105
Review time: 13 weeks

Journal description: The Journal of Big Data publishes high-quality, scholarly research papers, methodologies, and case studies covering a broad spectrum of topics, from big data analytics to data-intensive computing and all applications of big data research. It addresses challenges facing big data today and in the future, including data capture and storage, search, sharing, analytics, technologies, visualization, architectures, data mining, machine learning, cloud computing, distributed systems, and scalable storage. The journal serves as a seminal source of innovative material for academic researchers and practitioners alike.