{"title":"Provably secure authentication protocol for traffic exchanges in unmanned aerial vehicles","authors":"Vincent Omollo Nyangaresi","doi":"10.1016/j.hcc.2023.100154","DOIUrl":"10.1016/j.hcc.2023.100154","url":null,"abstract":"<div><p>Unmanned aerial vehicles offer services such as military reconnaissance in potentially adversarial controlled regions. In addition, they have been deployed in civilian critical infrastructure monitoring. In this environment, real-time and massive data is exchanged between the aerial vehicles and the ground control stations. Depending on the mission of these aerial vehicles, some of the collected and transmitted data is sensitive and private. Therefore, many security protocols have been presented to offer privacy and security protection. However, majority of these schemes fail to consider attack vectors such as side-channeling, de-synchronization and known secret session temporary information leakages. This last attack can be launched upon adversarial physical capture of these drones. In addition, some of these protocols deploy computationally intensive asymmetric cryptographic primitives that result in high overheads. In this paper, an authentication protocol based on lightweight quadratic residues and hash functions is developed. Its formal security analysis is executed using the widely deployed random oracle model. In addition, informal security analysis is carried out to show its robustness under the Dolev–Yao (DY) and Canetti–Krawczyk (CK) threat models. In terms of operational efficiency, it is shown to have relatively lower execution time, communication costs, and incurs the least storage costs among other related protocols. Specifically, the proposed protocol provides a 25% improvement in supported security and privacy features and a 6.52% reduction in storage costs. In overall, the proposed methodology offers strong security and privacy protection at lower execution time, storage and communication overheads.</p></div>","PeriodicalId":100605,"journal":{"name":"High-Confidence Computing","volume":"3 4","pages":"Article 100154"},"PeriodicalIF":0.0,"publicationDate":"2023-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2667295223000521/pdfft?md5=905b3445e9516ad8c201c868fb43d5f4&pid=1-s2.0-S2667295223000521-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135346685","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Knowledge-based recommendation with contrastive learning","authors":"Yang He , Xu Zheng , Rui Xu , Ling Tian","doi":"10.1016/j.hcc.2023.100151","DOIUrl":"https://doi.org/10.1016/j.hcc.2023.100151","url":null,"abstract":"<div><p>Knowledge Graphs (KGs) have been incorporated as external information into recommendation systems to ensure the high-confidence system. Recently, Contrastive Learning (CL) framework has been widely used in knowledge-based recommendation, owing to the ability to mitigate data sparsity and it considers the expandable computing of the system. However, existing CL-based methods still have the following shortcomings in dealing with the introduced knowledge: (1) For the knowledge view generation, they only perform simple data augmentation operations on KGs, resulting in the introduction of noise and irrelevant information, and the loss of essential information. (2) For the knowledge view encoder, they simply add the edge information into some GNN models, without considering the relations between edges and entities. Therefore, this paper proposes a Knowledge-based Recommendation with Contrastive Learning (KRCL) framework, which generates dual views from user–item interaction graph and KG. Specifically, through data enhancement technology, KRCL introduces historical interaction information, background knowledge and item–item semantic information. Then, a novel relation-aware GNN model is proposed to encode the knowledge view. Finally, through the designed contrastive loss, the representations of the same item in different views are closer to each other. Compared with various recommendation methods on benchmark datasets, KRCL has shown significant improvement in different scenarios.</p></div>","PeriodicalId":100605,"journal":{"name":"High-Confidence Computing","volume":"3 4","pages":"Article 100151"},"PeriodicalIF":0.0,"publicationDate":"2023-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"50193403","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"DBT-PDP: Provable data possession with outsourced data batch transfer based on blockchain","authors":"Chengming Yi , Hua Zhang , Weiming Sun , Jun Ding","doi":"10.1016/j.hcc.2023.100152","DOIUrl":"10.1016/j.hcc.2023.100152","url":null,"abstract":"<div><p>In the scenario of large-scale data ownership transactions, existing data integrity auditing schemes are faced with security risks from malicious third-party auditors and are inefficient in both calculation and communication, which greatly affects their practicability. This paper proposes a data integrity audit scheme based on blockchain where data ownership can be traded in batches. A data tag structure which supports data ownership batch transaction is adopted in our scheme. The update process of data tag does not involve the unique information of each data, so that any user can complete ownership transactions of multiple data in a single transaction through a single transaction auxiliary information. At the same time, smart contract is introduced into our scheme to perform data integrity audit belongs to third-party auditors, therefore our scheme can free from potential security risks of malicious third-party auditors. Safety analysis shows that our scheme is proved to be safe under the stochastic prediction model and k-CEIDH hypothesis. Compared with similar schemes, the experiment shows that communication overhead and computing time of data ownership transaction in our scheme is lower. Meanwhile, the communication overhead and computing time of our scheme is similar to that of similar schemes in data integrity audit.</p></div>","PeriodicalId":100605,"journal":{"name":"High-Confidence Computing","volume":"4 2","pages":"Article 100152"},"PeriodicalIF":0.0,"publicationDate":"2023-09-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2667295223000508/pdfft?md5=4d6ff0d17b47c02265a60a329233f6b6&pid=1-s2.0-S2667295223000508-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135298556","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An effective digital audio watermarking using a deep convolutional neural network with a search location optimization algorithm for improvement in Robustness and Imperceptibility","authors":"Abhijit J. Patil , Ramesh Shelke","doi":"10.1016/j.hcc.2023.100153","DOIUrl":"10.1016/j.hcc.2023.100153","url":null,"abstract":"<div><p>Watermarking is the advanced technology utilized to secure digital data by integrating ownership or copyright protection. Most of the traditional extracting processes in audio watermarking have some restrictions due to low reliability to various attacks. Hence, a deep learning-based audio watermarking system is proposed in this research to overcome the restriction in the traditional methods. The implication of the research relies on enhancing the performance of the watermarking system using the Discrete Wavelet Transform (DWT) and the optimized deep learning technique. The selection of optimal embedding location is the research contribution that is carried out by the deep convolutional neural network (DCNN). The hyperparameter tuning is performed by the so-called search location optimization, which minimizes the errors in the classifier. The experimental result reveals that the proposed digital audio watermarking system provides better robustness and performance in terms of Bit Error Rate (BER), Mean Square Error (MSE), and Signal-to-noise ratio. The BER, MSE, and SNR of the proposed audio watermarking model without the noise are 0.082, 0.099, and 45.363 respectively, which is found to be better performance than the existing watermarking models.</p></div>","PeriodicalId":100605,"journal":{"name":"High-Confidence Computing","volume":"3 4","pages":"Article 100153"},"PeriodicalIF":0.0,"publicationDate":"2023-09-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S266729522300051X/pdfft?md5=eb13242a3c2d8236c8e27be37944fea5&pid=1-s2.0-S266729522300051X-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135298725","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"SUDM-SP: A method for discovering trajectory similar users based on semantic privacy","authors":"Weiqi Zhang , Guisheng Yin , Bingyi Xie","doi":"10.1016/j.hcc.2023.100146","DOIUrl":"https://doi.org/10.1016/j.hcc.2023.100146","url":null,"abstract":"<div><p>With intelligent terminal devices’ widespread adoption and global positioning systems’ advancement, Location-based Social Networking Services (LbSNs) have gained considerable attention. The recommendation mechanism, which revolves around identifying similar users, holds significant importance in LbSNs. In order to enhance user experience, LbSNs heavily rely on accurate data. By mining and analyzing users who exhibit similar behavioral patterns to the target user, LbSNs can offer personalized services that cater to individual preferences. However, trajectory data, a form encompassing various sensitive attributes, pose privacy concerns. Unauthorized disclosure of users’ precise trajectory information can have severe consequences, potentially impacting their daily lives. Thus, this paper proposes the Similar User Discovery Method based on Semantic Privacy (SUDM-SP) for trajectory analysis. The approach involves employing a model that generates noise trajectories, maximizing expected noise to preserve the privacy of the original trajectories. Similar users are then identified based on the published noise trajectory data. SUDM-SP consists of two key components. Firstly, a puppet noise location, exhibiting the highest semantic expectation with the original location, is generated to derive noise-suppressed trajectory data. Secondly, a mechanism based on semantic and geographical distance is employed to cluster highly similar users into communities, facilitating the discovery of noise trajectory similarity among users. Through trials conducted using real datasets, the effectiveness of SUDM-SP, as a recommendation service ensuring user privacy protection is substantiated.</p></div>","PeriodicalId":100605,"journal":{"name":"High-Confidence Computing","volume":"3 3","pages":"Article 100146"},"PeriodicalIF":0.0,"publicationDate":"2023-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"50191192","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Research on medical data storage and sharing model based on blockchain","authors":"Jian Zhao , Wenqian Qiang , Zisong Zhao , Tianbo An , Zhejun Kuang , Dawei Xu , Lijuan Shi","doi":"10.1016/j.hcc.2023.100133","DOIUrl":"https://doi.org/10.1016/j.hcc.2023.100133","url":null,"abstract":"<div><p>With the process of medical informatization, medical diagnosis results are recorded and shared in the form of electronic data in the computer. However, the security of medical data storage cannot be effectively protected and the unsafe sharing of medical data among different institutions is still a hidden danger that cannot be underestimated. To solve the above problems, a secure storage and sharing model of private data based on blockchain technology and homomorphic encryption is constructed. Based on the idea of blockchain decentralization, the model maintains a reliable medical alliance chain system to ensure the safe transmission of data between different institutions; A privacy data encryption and computing protocol based on homomorphic encryption is constructed to ensure the safe transmission of medical data; Using its complete anonymity to ensure the Blockchain of medical data and patient identity privacy; A strict transaction control management mechanism of medical data based on Intelligent contract automatic execution of preset instructions is proposed. After security verification, compared with the traditional medical big data storage and sharing mode, the model has better security and sharing.</p></div>","PeriodicalId":100605,"journal":{"name":"High-Confidence Computing","volume":"3 3","pages":"Article 100133"},"PeriodicalIF":0.0,"publicationDate":"2023-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"50191193","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Consensus algorithm for medical data storage and sharing based on master–slave multi-chain of alliance chain","authors":"Yixian Zhang , Feng Zhao","doi":"10.1016/j.hcc.2023.100122","DOIUrl":"https://doi.org/10.1016/j.hcc.2023.100122","url":null,"abstract":"<div><p>The safe storage and sharing of medical data have promoted the development of the public medical field. At the same time, blockchain technology guarantees the safe storage and sharing of medical data. However, the consensus algorithm in the current medical blockchain cannot meet the requirements of low delay and high throughput in the large-scale network, and the identity of the primary node is exposed and vulnerable to attack. Therefore, this paper proposes an efficient consensus algorithm for medical data storage and sharing based on a master–slave multi-chain of alliance chain (ECA_MDSS). Firstly, institutional nodes in the healthcare alliance chain are clustered according to geographical location and medical system structure to form a multi-zones network. The system adopts master–slave multi-chain architecture to ensure security, and each zone processes transactions in parallel to improve consensus efficiency. Secondly, the aggregation signature is used to improve the practical Byzantine fault-tolerant (PBFT) consensus to reduce the communication interaction of consensus in each zone. Finally, an efficient ring signature is used to ensure the anonymity and privacy of the primary node in each zone and to prevent adaptive attacks. Meanwhile, a trust model is introduced to evaluate the trust degree of the node to reduce the evil done by malicious nodes. The experimental results show that ECA_ MDSS can effectively reduce communication overhead and consensus delay, improve transaction throughput, and enhance system scalability.</p></div>","PeriodicalId":100605,"journal":{"name":"High-Confidence Computing","volume":"3 3","pages":"Article 100122"},"PeriodicalIF":0.0,"publicationDate":"2023-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"50191221","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Spider monkey optimization based resource allocation and scheduling in fog computing environment","authors":"Shahid Sultan Hajam, Shabir Ahmad Sofi","doi":"10.1016/j.hcc.2023.100149","DOIUrl":"https://doi.org/10.1016/j.hcc.2023.100149","url":null,"abstract":"<div><p>Spider monkey optimization (SMO) is a quite popular and recent swarm intelligence algorithm for numerical optimization. SMO is Fission-Fusion social structure based algorithm inspired by spider monkey’s behavior. The algorithm proves to be very efficient in solving various constrained and unconstrained optimization problems. This paper presents the application of SMO in fog computing. We propose a heuristic initialization based spider monkey optimization algorithm for resource allocation and scheduling in a fog computing network. The algorithm minimizes the total cost (service time and monetary cost) of tasks by choosing the optimal fog nodes. Longest job fastest processor (LJFP), shortest job fastest processor (SJFP), and minimum completion time (MCT) based initialization of SMO are proposed and compared with each other. The performance is compared based on the parameters of average cost, average service time, average monetary cost, and the average cost per schedule. The results demonstrate the efficacy of MCT-SMO as compared to other heuristic initialization based SMO algorithms and Particle Swarm Optimization (PSO).</p></div>","PeriodicalId":100605,"journal":{"name":"High-Confidence Computing","volume":"3 3","pages":"Article 100149"},"PeriodicalIF":0.0,"publicationDate":"2023-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"50191223","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Trustworthy decentralized collaborative learning for edge intelligence: A survey","authors":"Dongxiao Yu , Zhenzhen Xie , Yuan Yuan , Shuzhen Chen , Jing Qiao , Yangyang Wang , Yong Yu , Yifei Zou , Xiao Zhang","doi":"10.1016/j.hcc.2023.100150","DOIUrl":"https://doi.org/10.1016/j.hcc.2023.100150","url":null,"abstract":"<div><p>Edge intelligence is an emerging technology that enables artificial intelligence on connected systems and devices in close proximity to the data sources. decentralized collaborative learning (DCL) is a novel edge intelligence technique that allows distributed clients to cooperatively train a global learning model without revealing their data. DCL has a wide range of applications in various domains, such as smart city and autonomous driving. However, DCL faces significant challenges in ensuring its trustworthiness, as data isolation and privacy issues make DCL systems vulnerable to adversarial attacks that aim to breach system confidentiality, undermine learning reliability or violate data privacy. Therefore, it is crucial to design DCL in a trustworthy manner, with a focus on security, robustness, and privacy. In this survey, we present a comprehensive review of existing efforts for designing trustworthy DCL systems from the three key aformentioned aspects: security, robustness, and privacy. We analyze the threats that affect the trustworthiness of DCL across different scenarios and assess specific technical solutions for achieving each aspect of trustworthy DCL (TDCL). Finally, we highlight open challenges and future directions for advancing TDCL research and practice.</p></div>","PeriodicalId":100605,"journal":{"name":"High-Confidence Computing","volume":"3 3","pages":"Article 100150"},"PeriodicalIF":0.0,"publicationDate":"2023-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"50191158","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An energy-efficient resource allocation strategy in massive MIMO-enabled vehicular edge computing networks","authors":"Yibin Xie , Lei Shi , Zhenchun Wei , Juan Xu , Yang Zhang","doi":"10.1016/j.hcc.2023.100130","DOIUrl":"https://doi.org/10.1016/j.hcc.2023.100130","url":null,"abstract":"<div><p>The vehicular edge computing (VEC) is a new paradigm that allows vehicles to offload computational tasks to base stations (BSs) with edge servers for computing. In general, the VEC paradigm uses the 5G for wireless communications, where the massive multi-input multi-output (MIMO) technique will be used. However, considering in the VEC environment with many vehicles, the energy consumption of BS may be very large. In this paper, we study the energy optimization problem for the massive MIMO-based VEC network. Aiming at reducing the relevant BS energy consumption, we first propose a joint optimization problem of computation resource allocation, beam allocation and vehicle grouping scheme. Since the original problem is hard to be solved directly, we try to split the original problem into two subproblems and then design a heuristic algorithm to solve them. Simulation results show that our proposed algorithm efficiently reduces the BS energy consumption compared to other schemes.</p></div>","PeriodicalId":100605,"journal":{"name":"High-Confidence Computing","volume":"3 3","pages":"Article 100130"},"PeriodicalIF":0.0,"publicationDate":"2023-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"50191218","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}