FogDedupe: A Fog-Centric Deduplication Approach Using Multi-Key Homomorphic Encryption Technique
M. Yoosuf, C. Muralidharan, S. Shitharth, Mohammed Alghamdi, Mohammed Maray, O. Rabie
{"title":"FogDedupe:使用多密钥同态加密技术的以雾为中心的重复数据删除方法","authors":"M. Yoosuf, C. Muralidharan, S. Shitharth, Mohammed Alghamdi, Mohammed Maray, O. Rabie","doi":"10.1155/2022/6759875","DOIUrl":null,"url":null,"abstract":"The advancements in communication technologies and a rapid increase in the usage of IoT devices have resulted in an increased data generation rate. Storing, managing, and processing large quantities of unstructured data generated by IoT devices remain a huge challenge to cloud service providers (CSP). To reduce the storage overhead, CSPs implement deduplication algorithms on the cloud storage servers. It identifies and eliminates the redundant data blocks. However, implementing post-progress deduplication schemes does not address the bandwidth issues. Also, existing convergent key-based deduplication schemes are highly vulnerable to confirmation of file attacks (CFA) and can leak confidential information. To overcome these issues, FogDedupe, a fog-centric deduplication framework, is proposed. It performs source-level deduplication on the fog nodes to reduce the bandwidth usage and post-progress deduplication to improve the cloud storage efficiency. To perform source-level deduplication, a distributed index table is created and maintained in the fog nodes, and post-progress deduplication is performed using a multi-key homomorphic encryption technique. To evaluate the proposed FogDedupe framework, a testbed environment is created using the open-source Eucalyptus v.4.2.0 software and fog project v1.5.9 package. The proposed scheme tightens the security against CFA attacks and improves the storage overhead by 27% and reduces the deduplication latency by 12%.","PeriodicalId":14776,"journal":{"name":"J. Sensors","volume":"52 1","pages":"1-16"},"PeriodicalIF":0.0000,"publicationDate":"2022-08-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"FogDedupe: A Fog-Centric Deduplication Approach Using Multi-Key Homomorphic Encryption Technique\",\"authors\":\"M. Yoosuf, C. Muralidharan, S. Shitharth, Mohammed Alghamdi, Mohammed Maray, O. Rabie\",\"doi\":\"10.1155/2022/6759875\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The advancements in communication technologies and a rapid increase in the usage of IoT devices have resulted in an increased data generation rate. Storing, managing, and processing large quantities of unstructured data generated by IoT devices remain a huge challenge to cloud service providers (CSP). To reduce the storage overhead, CSPs implement deduplication algorithms on the cloud storage servers. It identifies and eliminates the redundant data blocks. However, implementing post-progress deduplication schemes does not address the bandwidth issues. Also, existing convergent key-based deduplication schemes are highly vulnerable to confirmation of file attacks (CFA) and can leak confidential information. To overcome these issues, FogDedupe, a fog-centric deduplication framework, is proposed. It performs source-level deduplication on the fog nodes to reduce the bandwidth usage and post-progress deduplication to improve the cloud storage efficiency. To perform source-level deduplication, a distributed index table is created and maintained in the fog nodes, and post-progress deduplication is performed using a multi-key homomorphic encryption technique. 
To evaluate the proposed FogDedupe framework, a testbed environment is created using the open-source Eucalyptus v.4.2.0 software and fog project v1.5.9 package. The proposed scheme tightens the security against CFA attacks and improves the storage overhead by 27% and reduces the deduplication latency by 12%.\",\"PeriodicalId\":14776,\"journal\":{\"name\":\"J. Sensors\",\"volume\":\"52 1\",\"pages\":\"1-16\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-08-25\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"J. Sensors\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1155/2022/6759875\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"J. Sensors","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1155/2022/6759875","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 4
Abstract
Advances in communication technologies and the rapid growth in the use of IoT devices have sharply increased the rate of data generation. Storing, managing, and processing the large volumes of unstructured data produced by IoT devices remains a major challenge for cloud service providers (CSPs). To reduce storage overhead, CSPs run deduplication algorithms on their cloud storage servers to identify and eliminate redundant data blocks. However, post-process deduplication alone does not address bandwidth consumption, and existing convergent-key deduplication schemes are highly vulnerable to confirmation-of-file attacks (CFA) and can leak confidential information. To overcome these issues, FogDedupe, a fog-centric deduplication framework, is proposed. It performs source-level deduplication on the fog nodes to reduce bandwidth usage and post-process deduplication to improve cloud storage efficiency. For source-level deduplication, a distributed index table is created and maintained on the fog nodes; post-process deduplication is performed using a multi-key homomorphic encryption technique. To evaluate the proposed FogDedupe framework, a testbed environment was created using the open-source Eucalyptus v4.2.0 software and the fog project v1.5.9 package. The proposed scheme strengthens security against CFA, reduces storage overhead by 27%, and reduces deduplication latency by 12%.
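The source-level step described in the abstract can be pictured with a minimal sketch, assuming fixed-size chunking and an in-memory dictionary standing in for the distributed index table held on fog nodes. The class name FogIndex, the block size, and the cloud:// references are hypothetical illustrations, and the multi-key homomorphic encryption used for post-process deduplication is not modeled here; this is not the authors' implementation.

```python
import hashlib

BLOCK_SIZE = 4 * 1024  # assumed fixed-size chunking for the sketch


class FogIndex:
    """In-memory stand-in for the distributed index table kept on fog nodes."""

    def __init__(self):
        self._index = {}  # block fingerprint -> storage reference

    @staticmethod
    def fingerprint(block: bytes) -> str:
        # Content-defined fingerprint used to detect redundant blocks.
        return hashlib.sha256(block).hexdigest()

    def deduplicate(self, data: bytes):
        """Split data into blocks; return (new_blocks, references).

        Only new_blocks need to travel upstream to the cloud, which is where
        the bandwidth saving of source-level deduplication comes from.
        """
        new_blocks, references = [], []
        for offset in range(0, len(data), BLOCK_SIZE):
            block = data[offset:offset + BLOCK_SIZE]
            fp = self.fingerprint(block)
            if fp not in self._index:
                self._index[fp] = f"cloud://blocks/{fp}"  # hypothetical reference
                new_blocks.append(block)
            references.append(self._index[fp])
        return new_blocks, references


if __name__ == "__main__":
    index = FogIndex()
    first, _ = index.deduplicate(b"sensor reading " * 1000)
    second, _ = index.deduplicate(b"sensor reading " * 1000)  # fully redundant upload
    print(len(first), len(second))  # the second upload sends no new blocks
```

In the paper's design the index is distributed across fog nodes and consulted before any block leaves the edge, so redundant data is filtered out before it consumes upstream bandwidth; the cloud-side post-process deduplication over encrypted blocks is a separate step not shown above.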