Authors: Huan Qin, Yangbo Ye
Journal: Journal of Big Data, vol. 93, no. 1
DOI: 10.1186/s40537-024-00889-7
Published: 2024-02-21 (Journal Article)
Impact factor: 8.6; JCR Q1, Computer Science, Theory & Methods
Algorithms of the Möbius function by random forests and neural networks
The Möbius function \(\mu (n)\) is known for containing limited information on the prime factorization of n. Its known algorithms, however, are all based on factorization and hence are exponentially slow in \(\log n\). Consequently, a faster algorithm for \(\mu (n)\) could potentially lead to a fast algorithm for prime factorization, which in turn would cast doubt on the security of most public-key cryptosystems. This research introduces novel approaches to computing \(\mu (n)\) using random forests and neural networks, harnessing the additive properties of \(\mu (n)\). The machine learning models are trained on a substantial dataset of 317,284 observations (80% of the data), comprising five feature variables, with values of n up to \(4\times 10^9\). We implement the Random Forest with Random Inputs (RFRI) and Feedforward Neural Network (FNN) architectures. The RFRI model achieves a predictive accuracy of 0.9493, a recall of 0.5865, and a precision of 0.6626. The FNN model, by contrast, attains a predictive accuracy of 0.7871, a recall of 0.9477, and a precision of 0.2784. These results strongly support the effectiveness and validity of the proposed algorithms.
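For context, the classical factorization-based computation that the abstract contrasts against can be sketched as follows. This is a minimal trial-division implementation of the standard definition of \(\mu (n)\) (0 if n has a squared prime factor, otherwise \((-1)^k\) for n a product of k distinct primes); it is not the paper's method, and its running time grows exponentially in \(\log n\), which is exactly the bottleneck the machine-learning models aim to sidestep.

```python
def mobius(n: int) -> int:
    """Möbius function via trial division.

    mu(n) = 0 if any prime divides n more than once,
    otherwise (-1)^k where k is the number of distinct prime factors.
    """
    if n < 1:
        raise ValueError("n must be a positive integer")
    result = 1
    p = 2
    while p * p <= n:
        if n % p == 0:
            n //= p
            if n % p == 0:   # squared prime factor => mu(n) = 0
                return 0
            result = -result  # one more distinct prime factor
        p += 1
    if n > 1:                 # a single remaining prime factor > sqrt(original n)
        result = -result
    return result

# Examples: mu(1) = 1, mu(2) = -1, mu(6) = mu(2*3) = 1,
#           mu(12) = mu(2^2 * 3) = 0, mu(30) = mu(2*3*5) = -1
```

Trial division up to \(\sqrt{n}\) suffices here because at most one prime factor of a squarefree part can exceed \(\sqrt{n}\); for n near \(4\times 10^9\), as in the paper's dataset, this is still fast, but the cost is exponential in the bit length of n in general.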
Journal introduction:
The Journal of Big Data publishes high-quality, scholarly research papers, methodologies, and case studies covering a broad spectrum of topics, from big data analytics to data-intensive computing and all applications of big data research. It addresses challenges facing big data today and in the future, including data capture and storage, search, sharing, analytics, technologies, visualization, architectures, data mining, machine learning, cloud computing, distributed systems, and scalable storage. The journal serves as a seminal source of innovative material for academic researchers and practitioners alike.