{"title":"A Machine Learning-Based Approach for Improving TCP Congestion Detection Mechanism in IoTs","authors":"Madeha Arif, Usman Qamar, Amreen Riaz","doi":"10.1109/FIT57066.2022.00035","DOIUrl":null,"url":null,"abstract":"TCP provides suboptimal performance when it comes to wireless or mobile networks. End-to-end connectivity with reliability is a big challenge in IoTs that have restricted memory and processor resources. Mainly, TCP was prepared for only wired networks and its performance will be ruined if we applied it on wireless and ad-hoc networks. IoTs have several issues related to TCP that need to be addressed and have been addressed in past. This paper addresses multiple issues that IoT enables an application to face during data transmissions with mobile nodes. Many researchers have proposed approaches based on certain algorithms and machine-learning techniques that have been summarized in this paper. A new algorithm has also been proposed that focuses on the differentiation of the data loss as congestion loss or random loss in a TCP-driven network transmission using an unsupervised machine learning approach. The proposed algorithm is both memory and computation efficient. It is self-evolving and adaptive as well.","PeriodicalId":102958,"journal":{"name":"2022 International Conference on Frontiers of Information Technology (FIT)","volume":"10 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 International Conference on Frontiers of Information Technology (FIT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/FIT57066.2022.00035","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
TCP delivers suboptimal performance over wireless and mobile networks. Providing reliable end-to-end connectivity is a major challenge in IoT devices, which have limited memory and processing resources. TCP was designed primarily for wired networks, and its performance degrades significantly when applied to wireless and ad-hoc networks. IoT systems face several TCP-related issues that need to be addressed, some of which have been tackled in past work. This paper examines multiple issues that IoT-enabled applications face during data transmission with mobile nodes. Many researchers have proposed approaches based on specific algorithms and machine-learning techniques, and these are summarized in this paper. A new algorithm is also proposed that differentiates data loss as congestion loss or random loss in a TCP-driven network transmission using an unsupervised machine learning approach. The proposed algorithm is both memory- and computation-efficient, as well as self-evolving and adaptive.
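To illustrate the kind of unsupervised loss differentiation described above, the following is a minimal sketch that clusters per-loss RTT statistics with k-means and labels the high-RTT cluster as congestion loss. The feature choice (RTT at loss time and RTT variance), the two-cluster setup, and all function names are illustrative assumptions, not the authors' published algorithm.

```python
# Sketch: unsupervised congestion-vs-random loss labeling via k-means.
# Assumed features and thresholds are illustrative, not from the paper.
import numpy as np
from sklearn.cluster import KMeans


def label_losses(loss_features: np.ndarray) -> np.ndarray:
    """Cluster loss events into two groups and label each loss.

    loss_features: array of shape (n_losses, 2) with columns
                   [rtt_at_loss_ms, rtt_variance_ms2].
    Returns an array of strings: 'congestion' or 'random'.
    """
    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(loss_features)
    # Congestion losses typically coincide with inflated RTTs (full queues),
    # so treat the cluster with the larger mean RTT as the congestion cluster.
    congestion_cluster = int(np.argmax(km.cluster_centers_[:, 0]))
    return np.where(km.labels_ == congestion_cluster, "congestion", "random")


if __name__ == "__main__":
    # Synthetic example: random (wireless) losses occur at low, stable RTTs;
    # congestion losses occur at high, variable RTTs.
    rng = np.random.default_rng(0)
    random_losses = np.column_stack(
        [rng.normal(40, 3, 50), rng.normal(5, 1, 50)]
    )
    congestion_losses = np.column_stack(
        [rng.normal(120, 15, 50), rng.normal(40, 8, 50)]
    )
    features = np.vstack([random_losses, congestion_losses])
    labels = label_losses(features)
    print(labels[:5], labels[-5:])
```

Because the clustering needs only a small buffer of recent loss-time RTT samples and a two-centroid fit, a scheme along these lines can stay lightweight in memory and computation, consistent with the resource constraints the abstract emphasizes.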