{"title":"Self-Supervised Learning Framework With Under-Balanced Loss Optimization for Point of Care MRI Image Reconstruction in 6G-Driven Edge Networks","authors":"Yang Liu","doi":"10.1002/itl2.70035","DOIUrl":"https://doi.org/10.1002/itl2.70035","url":null,"abstract":"<div>\u0000 \u0000 <p>Self-supervised learning frameworks in the 6G-driven edge networks provide powerful instant MRI image diagnostic capabilities for the process of point of care. Although many deep learning self-supervised frameworks are used to train-related models to improve magnetic resonance imaging (MRI) image reconstruction, there is still room for improvement in model training convergence acceleration and MRI image reconstruction quality. To address the above issues, first, this article proposes a self-supervised learning framework, which combines the real-time computing power of the edge network driven by 6G networks to accelerate the training convergence of the MRI image reconstruction model and improve the quality of the reconstructed image. Second, the proposed framework innovatively introduces an under-balanced loss optimization structure and applies heterogeneous loss functions at different positions of the model. Finally, this article proposes AttentionFISTA-Net, which integrates the convolutional attention module into FISTA-Net to enhance the MRI image reconstruction effect. Experimental results on the IXI dataset compared with the traditional self-supervised network show that the proposed model performs better in the under-sampled dataset with acceleration rates of 4 and 8, respectively. 
The peak signal-to-noise ratio (PSNR) metric improves by 0.021 and 0.61, respectively, and the structural similarity index measure (SSIM) metric improves by 0.5 × 10<sup>−3</sup> and 8.2 × 10<sup>−3</sup>, respectively.</p>\u0000 </div>","PeriodicalId":100725,"journal":{"name":"Internet Technology Letters","volume":"8 4","pages":""},"PeriodicalIF":0.9,"publicationDate":"2025-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144190990","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Enhanced Task Scheduling in Cloud Computing Using the ESRNN Algorithm: A Performance-Driven Approach","authors":"Chintureena Thingom, Martin Margala, S. Siva Shankar, Prasun Chakrabarti, R. G. Vidhya","doi":"10.1002/itl2.70037","DOIUrl":"https://doi.org/10.1002/itl2.70037","url":null,"abstract":"<div>\u0000 \u0000 <p>One key component of cloud computing that significantly affects performance is task scheduling. Quality of Service (QoS) in networking and the burgeoning information processing economy have spurred interest in the dynamic task scheduling challenge worldwide. Task scheduling is a complicated topic that has been categorized as NP-hard. It is also more difficult to strike a balance and reap the rewards of every facet of cloud computing because the majority of activities in complex environments are frequently managed using dynamic online task scheduling. This manuscript presents an approach for task scheduling in cloud computing technology for enhancing efficiency in enterprise environments. The proposed algorithm is an Effective Residual Self-Attention Recurrent Neural Network (ESRNN) Algorithm. The proposed method's primary goal is to improve system reliability, enhance efficiency, and provide better load standard deviation, make span, and throughput. The issue of managing Directed Acyclic Graph (DAG) jobs in a cloud computing context is resolved by the ESRNN algorithm. The performance of the proposed technique is implemented in the MATLAB platform and is compared with various existing approaches. The proposed approach determines better outcomes compared to existing methods such as Data aware, First-Come-First-Served (FCFS) and Round Robin (RR). In the existing method, throughput is 1, 1.2, 1.4 s, and then the proposed method throughput is 1.6 s. 
Based on the results, it can be concluded that the proposed approach has higher throughput compared to existing techniques.</p>\u0000 </div>","PeriodicalId":100725,"journal":{"name":"Internet Technology Letters","volume":"8 4","pages":""},"PeriodicalIF":0.9,"publicationDate":"2025-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144191090","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Joint Optimization of 6G Slicing and AR/VR for Teaching Quality Enhancement in Virtual Applications","authors":"Hongzhi Wei","doi":"10.1002/itl2.70036","DOIUrl":"https://doi.org/10.1002/itl2.70036","url":null,"abstract":"<div>\u0000 \u0000 <p>6G has promoted the connection of education, while VR/AR has visualized the education view. Now, with the advent of the educational informational 2.0 era, it is imperative for educators to embrace these novel opportunities brought by 6G and AR/VR. This is crucial to advance the level of educational informatization and enhance teaching outcomes. In this work, we propose a joint strategy based on 6G slicing and VR/AR technologies to enhance the teaching quality in virtual applications. In particular, by using the machine learning-based 6G slicing, we build the real-time and personalized AR/VR environment, in which the policy-driven AR/VR laboratory architecture aligns national strategies with academic transformation. Besides, we also introduce adaptive features to enable the dynamic AR/VR adjustment with intelligent feedback. Via comprehensive experiments and analysis, the paper shows that the transformative impact of 6G and VR/AR technology on virtual experiment teaching in higher education achieves higher performance than the state-of-the-art methods. It also suggests innovative teaching practices and directions for multi-node teaching models. Furthermore, we also enhance the personalization and adaptability of these teaching models, in which we analyze student learning patterns, identify areas of strength and weakness, and recommend tailored learning paths. 
The results indicate that a high teaching score is achieved, with the study experience improved by more than 70% on average.</p>\u0000 </div>","PeriodicalId":100725,"journal":{"name":"Internet Technology Letters","volume":"8 4","pages":""},"PeriodicalIF":0.9,"publicationDate":"2025-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144190991","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Evolutionary Intelligent Large Model-Driven Heterogeneous Traffic Analysis in Wireless Communication Networks","authors":"Huanhuan Liu, Rui Sun, Yu Ji, Mengzhu Lu","doi":"10.1002/itl2.70024","DOIUrl":"https://doi.org/10.1002/itl2.70024","url":null,"abstract":"<div>\u0000 \u0000 <p>Traditional traffic analysis methods, which often assume homogeneous patterns, struggle to capture the intricate relationships between different traffic types and their temporal dependencies. This limitation necessitates advanced approaches that can effectively handle the multi-dimensional nature of modern network traffic while maintaining computational efficiency. This paper proposes evolutionary large model gray wolf optimization (GWO) for heterogeneous traffic networks (ELGWO-HTN), a novel framework that combines evolutionary computation with intelligent large models for heterogeneous traffic analysis. The approach integrates enhanced GWO with adaptive parameter tuning mechanisms, enabling efficient navigation of high-dimensional parameter spaces. Through experimentation in real wireless network environments, ELGWO-HTN demonstrates superior performance across multiple metrics, achieving 94.2% prediction accuracy for video streaming and 93.5% for data traffic, while reducing training time by 41.7% compared to baseline methods.</p>\u0000 </div>","PeriodicalId":100725,"journal":{"name":"Internet Technology Letters","volume":"8 4","pages":""},"PeriodicalIF":0.9,"publicationDate":"2025-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144190988","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Association Problem Between Multiple Jammers and Multiple Wardens in Covert Communication Networks With TCIPC","authors":"Fanfeng Shi, Xinfeng Zhu","doi":"10.1002/itl2.70049","DOIUrl":"https://doi.org/10.1002/itl2.70049","url":null,"abstract":"<div>\u0000 \u0000 <p>Network survivability refers to the capability of the networks to maintain a guaranteed quality of service (QoS) in the presence of malicious attacks and node failures. The covert wireless communication (CWC) technique can improve the survivability of wireless Internet of Things (IoT) networks by hiding the desired signals in background noises. In this paper, the CWC scheme for the multiple jammers and multiple wardens networks is investigated, where the truncated channel inversion power control (TCIPC) is applied. The outage probability is utilized to evaluate the communication effectiveness, and the detection error probability (DEP) is used to evaluate the covertness. Then, the association problem between the multiple jammers and the multiple wardens is studied. The optimal association problem is formulated as a min-max integer programming problem to minimize the outage probability between Alice and Bob, subject to the DEP constraints, which are hard to solve. 
It is demonstrated that the genetic algorithm is a valid method to solve the proposed min-max integer programming problem, and the performances are analyzed.</p>\u0000 </div>","PeriodicalId":100725,"journal":{"name":"Internet Technology Letters","volume":"8 4","pages":""},"PeriodicalIF":0.9,"publicationDate":"2025-05-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144179258","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Nadam-Swarm Based Adaptive Routing Protocol Using Graph Equivariant Network for Seamless Data Transmission in 5G-Connected Wireless Sensor Networks","authors":"Smita Bhore, Narambunathan Arunachalam Natraj, V. Suresh, M. S. Mohamed Mallick, Sunil Lavadiya","doi":"10.1002/itl2.70048","DOIUrl":"https://doi.org/10.1002/itl2.70048","url":null,"abstract":"<div>\u0000 \u0000 <p>Wireless Sensor Networks (WSNs) have transformed data transmission methodologies by merging with 5G technology to provide ultra-reliable, low-latency, and energy-efficient data transfers. Nonetheless, owing to the intricacies involved in attaining dynamic network topologies, constrained resource management, and scalability, there is a want for improved routing methodologies to optimize 5G-enabled wireless sensor networks. This study introduces the “Nadam-Swarm based Adaptive Routing Protocol using Graph Equivariant Network for Seamless Data Transmission in 5G-Connected Wireless Sensor Networks” (NR-GE-BiSO) as a proficient solution for efficient data transmission. The protocol utilizes a multi-tiered approach: the Nadam-based Random Search Algorithm (NR-SA) dynamically allocates clustering head nodes to balance the load depending on the residual energy and traffic density of the nodes inside the network. Graph Equivariant Quantum Neural Networks (GE-QNN) provide a Wireless Sensor Network (WSN) structural graph to identify optimal routing pathways based on variations within the WSN, facilitating effective data delivery with minimal power consumption. The Bipolar Swarm Optimizer (BiSO) enhanced the routing process by determining the shortest, most energy-efficient routes with minimal latency and energy expenditure. Simulation results validate the efficacy of NR-GE-BiSO, achieving metrics: 99.92% throughput and a 99.88% packet delivery ratio with 99.01% reduction of routing overhead outperforming the existing methods. 
The findings indicate that the protocol facilitates energy-efficient, scalable, and reliable communication. By integrating 5G capabilities with advanced routing algorithms, NR-GE-BiSO achieves a heightened degree of wireless sensor network efficiency, enabling innovative applications in smart cities, industrial IoT, and environmental monitoring.</p>\u0000 </div>","PeriodicalId":100725,"journal":{"name":"Internet Technology Letters","volume":"8 4","pages":""},"PeriodicalIF":0.9,"publicationDate":"2025-05-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144171208","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Tiny Machine Learning for Real-Time Anomaly Detection of Self-Media Public Opinion in Edge-Cloud-Cooperation Campus Networks","authors":"Shiqi Li","doi":"10.1002/itl2.70038","DOIUrl":"https://doi.org/10.1002/itl2.70038","url":null,"abstract":"<div>\u0000 \u0000 <p>Real-time anomaly detection in self-media public opinion requires lightweight solutions to address the latency and multimodal complexity challenges of campus network ecosystems. This article proposes a tiny machine learning framework for edge-cloud-cooperation campus networks, enabling efficient detection of opinion anomalies through distributed computation. The architecture combines edge-native micro-model compression with cloud-assisted federated verification, achieving three key innovations: (1) On-device micro-graph neural networks (GNNs) deployed at edge nodes for low-latency pattern recognition in terahertz multimedia streams; (2) a dual-phase anomaly engine leveraging contrastive semantic alignment and adaptive influence analysis to capture cross-modal inconsistencies; (3) dynamic knowledge distillation that reduces model footprints to 8 MB while preserving 91% precision and 87% recall on a 120,000 post dataset from 15 universities. Experimental results demonstrate 120 ms average inference latency with 68% lower computation overhead than centralized baselines, accelerating emergency response by 3.25× through edge-cloud task partitioning. 
The framework maintains 74% energy efficiency in continuous operation, proving the viability of tiny machine learning paradigms for intelligent campus governance without relying on next-generation communication standards.</p>\u0000 </div>","PeriodicalId":100725,"journal":{"name":"Internet Technology Letters","volume":"8 4","pages":""},"PeriodicalIF":0.9,"publicationDate":"2025-05-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144148563","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Multiscale Transformer Framework for Optimizing Educational Resource Transmission in Preschool Wireless Networks","authors":"Junqing Fan","doi":"10.1002/itl2.70043","DOIUrl":"https://doi.org/10.1002/itl2.70043","url":null,"abstract":"<div>\u0000 \u0000 <p>This paper proposes EduTransNet, a novel framework combining cross-scale window attention mechanisms with separable spatio-temporal attention to enhance network transmission efficiency and resource utilization in preschool wireless networks. The framework jointly models temporal dependencies and long-range network node relationships, incorporating multiresolution optimization strategies for adaptive resource allocation. Experimental results on the EdNet dataset, containing over 131 million student interactions, demonstrate that EduTransNet achieves significant improvements with a PSNR of 37.13 dB and SSIM of 0.978, surpassing existing methods by 2.3 dB and 0.008, respectively. The framework shows particular strength in handling dynamic educational content delivery scenarios with multiple concurrent users while maintaining a low latency of 160 ms.</p>\u0000 </div>","PeriodicalId":100725,"journal":{"name":"Internet Technology Letters","volume":"8 4","pages":""},"PeriodicalIF":0.9,"publicationDate":"2025-05-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144118152","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Deep Learning Approach for Malware Detection in IoT Binaries Using Spatial and Temporal Patterns","authors":"M. Nandish, Jalesh Kumar","doi":"10.1002/itl2.70032","DOIUrl":"https://doi.org/10.1002/itl2.70032","url":null,"abstract":"<div>\u0000 \u0000 <p>The proliferation of malware in the Internet of Things (IoT) environment poses significant challenges to IoT security due to the heterogeneity and resource constraints of IoT devices. Traditional malware detection methods, which rely heavily on individual features, static analysis, and raw byte sequences, suffer from performance limitations and are not effective against evolving threats. The proposed work introduces a novel deep learning-based malware detection model that integrates Convolutional Neural Networks (CNNs) and Gated Recurrent Units (GRUs) to learn spatial and temporal representations from binary features. CNN extracts spatial patterns from static binary representations, while GRU extracts sequential dependencies in dynamic binary features, enabling the detection of complex malware behaviors. To further enhance detection efficiency, a feature selection mechanism is incorporated to identify the most relevant spatial–temporal features, reducing training time while maintaining high detection accuracy. The proposed approach effectively combines static and dynamic feature representations to train a robust classifier capable of detecting sophisticated malware patterns. Experimental evaluations on an IoT malware dataset demonstrate the efficacy of the proposed model, achieving an average detection accuracy of 99.33%, significantly outperforming traditional methods. 
The results also show the model's robustness against obfuscation techniques, with a substantial reduction in the false positive rate (FPR).</p>\u0000 </div>","PeriodicalId":100725,"journal":{"name":"Internet Technology Letters","volume":"8 4","pages":""},"PeriodicalIF":0.9,"publicationDate":"2025-05-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144118151","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Secure and Trusted Communication Solution for Web 3.0 Based on Edge Intelligence","authors":"Geetanjali Rathee, Anissa Cheriguene, Chaker Abdelaziz Kerrache, Carlos T. Calafate","doi":"10.1002/itl2.70007","DOIUrl":"https://doi.org/10.1002/itl2.70007","url":null,"abstract":"<div>\u0000 \u0000 <p>The rise of AI has positioned edge computing as a pivotal domain for deploying machine learning technologies, fostering agile processing, and enhancing network robustness and decision-making capabilities. This paper addresses the underexplored aspects of DDoS and phishing attacks, and precise decision-making at network edge devices within blockchain-based frameworks. The contribution lies in proposing an incentive-based security mechanism to divert intruders from genuine routes. Legitimate devices conducting accurate decision-making are rewarded, enticing their participation in identifying false devices. A honeypot intrusion detection system attracts false devices, and real-time trust computation monitors communication devices. This approach is analyzed under security threats and network delays, demonstrating its efficacy compared to existing methods in safeguarding edge computing environments.</p>\u0000 </div>","PeriodicalId":100725,"journal":{"name":"Internet Technology Letters","volume":"8 4","pages":""},"PeriodicalIF":0.9,"publicationDate":"2025-05-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144125959","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}