Separation and optimization of encryption and erasure coding in decentralized storage systems
Marcell Szabó, Ákos Recse, Róbert Szabó, Dávid Balla, Markosz Maliosz
Future Generation Computer Systems - The International Journal of eScience, Volume 167, Article 107739 (published 2025-02-04). DOI: 10.1016/j.future.2025.107739
https://www.sciencedirect.com/science/article/pii/S0167739X25000342
Citations: 0
Abstract
Entering the cloud storage market requires a high upfront investment; as a result, it is dominated by a few players with existing capacity. Decentralized cloud storage solutions can disrupt the status quo by allowing businesses and individuals to sell their unused storage capacity, reducing the need for large upfront investments in service infrastructure. We show that network operators providing such a service can significantly decrease the traffic volume carried on the transport network, which is essential when serving mobile users, while maintaining high data security by implementing our proposed solution: leveraging controlled replication inside the core network. Upon data upload, encryption and erasure coding are separated, with the latter moved inside the network, enabling arbitrary replication of storable data pieces without straining the access network. We present simulation results showing that the proposed method reduces traffic by 20% compared to the out-of-the-box solution. Moreover, we elaborate on optimal multi-proxy placement and optimal storage node selection in complex ISP networks, where deep data penetration is desired, by presenting ILP optimization methods and results that achieve minimal overall network load and maximum data security.
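To make the separation concrete, the following Python sketch walks through the upload path the abstract describes: the client encrypts locally and sends a single ciphertext over the access network, and a proxy inside the core network then erasure-codes the ciphertext and replicates the resulting pieces across storage nodes, so extra copies never touch the access link. The cipher (Fernet/AES), the single-parity XOR code, the round-robin placement, and all names below are illustrative assumptions, not the paper's algorithms or parameters; the paper derives placements from ILP optimization and does not prescribe specific primitives.

```python
# Minimal sketch of the upload path, under the assumptions stated above.
from functools import reduce

from cryptography.fernet import Fernet  # pip install cryptography


def client_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Client side: encrypt before upload, so the ciphertext crosses the
    access network exactly once and the in-network proxy never sees plaintext."""
    return Fernet(key).encrypt(plaintext)


def proxy_erasure_code(ciphertext: bytes, k: int) -> list[bytes]:
    """Proxy side, inside the core network: split the ciphertext into k
    equally sized data pieces plus one XOR parity piece. A real deployment
    would use a Reed-Solomon (k, m) code; single XOR parity just keeps the
    sketch dependency-free. The original ciphertext length must be stored
    alongside the pieces so padding can be stripped on reconstruction."""
    piece_len = -(-len(ciphertext) // k)                 # ceiling division
    padded = ciphertext.ljust(piece_len * k, b"\x00")
    pieces = [padded[i * piece_len:(i + 1) * piece_len] for i in range(k)]
    parity = reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), pieces)
    return pieces + [parity]


def proxy_replicate(pieces: list[bytes], nodes: list[str], copies: int) -> dict[int, list[str]]:
    """Controlled replication inside the core: assign each piece to `copies`
    distinct storage nodes (round-robin here purely for illustration; the
    paper chooses placements via ILP to minimize network load)."""
    return {
        i: [nodes[(i + j) % len(nodes)] for j in range(copies)]
        for i in range(len(pieces))
    }


if __name__ == "__main__":
    key = Fernet.generate_key()
    ciphertext = client_encrypt(b"example payload", key)     # one pass over the access network
    pieces = proxy_erasure_code(ciphertext, k=4)              # done by the in-network proxy
    plan = proxy_replicate(pieces, ["sn1", "sn2", "sn3", "sn4", "sn5"], copies=2)
    print(plan)
```

The point of keeping proxy_erasure_code and proxy_replicate behind the access link is that the replication factor can grow without a corresponding increase in upstream traffic from the client, which is the traffic reduction the abstract quantifies.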
Journal Introduction:
Computing infrastructures and systems are constantly evolving, resulting in increasingly complex and collaborative scientific applications. To cope with these advancements, there is a growing need for collaborative tools that can effectively map, control, and execute these applications.
Furthermore, with the explosion of Big Data, there is a requirement for innovative methods and infrastructures to collect, analyze, and derive meaningful insights from the vast amount of data generated. This necessitates the integration of computational and storage capabilities, databases, sensors, and human collaboration.
Future Generation Computer Systems aims to pioneer advancements in distributed systems, collaborative environments, high-performance computing, and Big Data analytics. It strives to stay at the forefront of developments in grids, clouds, and the Internet of Things (IoT) to effectively address the challenges posed by these wide-area, fully distributed sensing and computing systems.