Jabin Prakash J, Ramesh K, Saravanan K, Lakshmi Prabha G
International Journal of Network Management, vol. 33, no. 6, published 2023-08-31. DOI: 10.1002/nem.2249 (https://onlinelibrary.wiley.com/doi/10.1002/nem.2249). Impact factor 1.5, JCR Q3 (Computer Science, Information Systems).
Blockchain-based data deduplication using novel content-defined chunking algorithm in cloud environment
The cloud environment is inherently dynamic, as large numbers of users are added within a short time, and managing such user profiles and their associated data is genuinely difficult. Meanwhile, cloud data grow at two to three times their original volume on average, making storage space management and data integrity maintenance a mandatory, yet risky, task. The main approaches to these data maintenance challenges in a cloud context are deduplication and data protection. To manage storage space, identical copies of the same data can be found and removed from the cloud, reducing the amount of storage space needed; data deduplication therefore considerably reduces duplicate copies in cloud storage. Here, a decentralized-ledger public blockchain network is introduced to protect the integrity of data stored in the cloud. This research proposes data deduplication using the speedy content-defined chunking (SpeedyCDC) algorithm on a public blockchain. Many individuals and businesses outsource sensitive data to remote cloud servers because doing so largely eliminates the burden of managing software and infrastructure. However, ownership and control of users' data become separated once the data are outsourced to cloud storage providers (CSPs) and kept on a remote cloud, so users find it difficult to verify the integrity of their sensitive data. Analysis using Geospatial Information System (GIS) datasets revealed that throughput increased by 5%–6% over that of the FastCDC technique, while integrity was preserved because a blockchain network secured the data.
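The SpeedyCDC algorithm itself is not specified in this abstract, but content-defined chunking in general (the family that FastCDC belongs to) splits a byte stream at positions where a rolling hash matches a bit pattern, so that identical content produces identical chunks regardless of where it sits in the file. The sketch below is a generic Gear-hash CDC, not the authors' algorithm; the mask and size parameters are illustrative assumptions.

```python
import random

# 256-entry table of pseudo-random 64-bit values for the Gear rolling hash
# (seeded only so the example is reproducible).
rng = random.Random(42)
GEAR = [rng.getrandbits(64) for _ in range(256)]

MASK = 0x0000D90313530000    # illustrative mask; fewer set bits -> smaller avg chunks
MIN_SIZE, MAX_SIZE = 2048, 65536  # bound chunk sizes, as FastCDC-style schemes do

def cdc_chunks(data: bytes) -> list[bytes]:
    """Split data at content-defined boundaries using a Gear rolling hash."""
    chunks = []
    start = 0
    while start < len(data):
        h = 0
        end = min(start + MAX_SIZE, len(data))
        cut = end  # fall back to a forced cut at MAX_SIZE
        for i in range(start, end):
            # Gear update: shift the state and mix in a table entry per byte.
            h = ((h << 1) + GEAR[data[i]]) & 0xFFFFFFFFFFFFFFFF
            if i - start + 1 >= MIN_SIZE and (h & MASK) == 0:
                cut = i + 1  # boundary found: hash matched the mask
                break
        chunks.append(data[start:cut])
        start = cut
    return chunks
```

Deduplication then stores each chunk once, keyed by its cryptographic hash; in the scheme described above, those chunk hashes would also be anchored on the blockchain so clients can later verify integrity against the ledger.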
Journal description:
Modern computer networks and communication systems are increasing in size, scope, and heterogeneity. The promise of a single end-to-end technology has not been realized and likely never will be. The decreasing cost of bandwidth is extending the possible applications of computer networks and communication systems to entirely new domains. Problems in integrating heterogeneous wired and wireless technologies, ensuring security and quality of service, and reliably operating large-scale systems, including cloud computing, have all emerged as important topics. The one constant is the need for network management. Challenges in network management have never been greater than they are today. The International Journal of Network Management is the forum for researchers, developers, and practitioners in network management to present their work to an international audience. The journal is dedicated to the dissemination of information that will enable improved management, operation, and maintenance of computer networks and communication systems. The journal is peer reviewed and publishes original papers (both theoretical and experimental) by leading researchers, practitioners, and consultants from universities, research laboratories, and companies around the world. Issues with thematic or guest-edited special topics typically occur several times per year. Topic areas for the journal are largely defined by the taxonomy for network and service management developed by IFIP WG6.6, together with IEEE-CNOM, the IRTF-NMRG, and the Emanics Network of Excellence.